Terrascope Could Shake Up Future Earthquake Science


Don Anderson
Oct 2, 1988

On October 1, 1987, the Los Angeles region experienced a strong and damaging earthquake of magnitude 6, followed four days later by an aftershock of magnitude 5.5 that caused further damage. The usual fears and uncertainties about earthquakes were heightened by a disturbing lack of sound, scientifically based information about the event in the minutes, hours, and days following the main shock.

This lack of information was especially disturbing to seismologists, who realize that the technology now exists to provide real-time assessments of earthquake ground motions during the critical moments after a major shock. That technology—a combination of sensors, satellites, and computers—has only to be put in place.

The California Institute of Technology has, therefore, proposed the construction of an advanced geophysical observatory—or terrestrial telescope—called the “Terrascope,” which will be able to supply seismic data of unprecedented quality during major earthquakes in California and around the world.

The Terrascope will ultimately consist of an array of at least 10 broad-band, high-dynamic-range digital seismometers placed around Southern California, interlinked by satellite telemetry and served by high-speed computers. Each station will also be equipped with a receiver for the Global Positioning System (GPS) satellite network, which will allow precise determination of the station’s exact location, so that slower ground movements (of a few centimeters per year) between earthquakes can be measured.
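Purely as an illustration of the architecture just described, here is a minimal Python sketch of what a record for one station in such a network might contain; the class, field names, and placeholder values are hypothetical and are not drawn from the Terrascope design.

```python
from dataclasses import dataclass

@dataclass
class Station:
    """Hypothetical record of one node in a Terrascope-like network."""
    name: str                        # placeholder identifier, e.g. "SITE-01"
    latitude_deg: float              # placeholder coordinates
    longitude_deg: float
    seismometer: str = "broad-band, high-dynamic-range digital"
    has_gps_receiver: bool = True    # centimeter-scale positioning between quakes
    telemetry: str = "satellite"     # link back to central high-speed computers

# A ten-station array, as described above, with made-up identifiers.
network = [Station(f"SITE-{i:02d}", 0.0, 0.0) for i in range(1, 11)]
print(len(network), "stations")
```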

When completed, the Terrascope will be the largest continually operating scientific instrument in the world—covering an area bounded by San Luis Obispo on the north, the Mexican border on the south, the Channel Islands on the west, and the California-Nevada border on the east.

The Terrascope is basically designed as a research tool for understanding earthquakes and for studying the physics of the Earth’s interior. But because it will be located in a populous, earthquake-prone area, it will provide the general public with useful and timely data about earthquake and tectonic motions. It is a project in which scientists’ interests and those of the public converge.

The Terrascope will complement the existing 250-station, high-frequency Southern California seismographic network, operated jointly by Caltech and the U.S. Geological Survey. This network of analog and digital sensors helped locate the source of the October 1987 seismic activity, but, because of the size and proximity of the earthquake, the instruments of the entire network were driven off scale, making it impossible to report a magnitude or a time history of the faulting. The police and fire departments, quite understandably, besieged Caltech’s Seismological Laboratory with telephone inquiries immediately after the shock, but for 20 minutes we could tell them virtually nothing, and even after that we could identify only the location of the shock’s epicenter—nothing about its magnitude or damage potential.

It subsequently turned out, as was also the case for the 1971 San Fernando earthquake, that the main destructive force of the earthquake was not at its epicenter, owing to the nonvertical orientation of the fault plane upon which the rupture occurred. Had the fault orientation and rupture direction been known shortly following the event, emergency forces could have been directed more intelligently. Also, if we had had a computer bank of records of earlier earthquakes of the same type or in the same area, valuable comparative information could have been immediately recovered and disseminated.

Recent advances in seismology, together with new computer and communications technology, have created exciting possibilities in seismic detection, making this an opportune moment for the Terrascope.

1. Several methods have been developed to determine the geometry of the causative fault from the detailed study of seismograms. The geometry of the fault can often be used to associate the earthquake with a particular known fault. Knowing which fault caused the earthquake is often critically important for any post-earthquake measures. The array of sensors provided by the Terrascope will make it possible to obtain high-resolution “images” of the rupture process and fault geometry.

2. Recent studies have established that the frequency spectrum of the source (strength of the source at different wavelengths) is very different for different earthquakes. The source spectrum represents the strength of the fault. If a stronger fault fails, the resulting earthquake is generally more damaging. Knowing the mechanical nature of the fault is important in assessing the damage potential of earthquakes originating from the same fault zone immediately after the main shock, as well as in the future. Here the Terrascope can help, too.

3. We know that an earthquake initiates from a point and the rupture spreads out. Determining the rupture direction is important because, in some earthquake sequences, the initial earthquake has triggered secondary earthquakes (sometimes larger than the first) in the direction of the rupture. Detection of this phenomenon requires an array of sensors, such as those in the Terrascope design.

4. The spectrum of strong ground motions varies dramatically depending on the propagation path from the source to the site and on ground conditions at the site. In order to assess the effect of ground motions on structures at a specific site, it is extremely important to determine the response of the site to ground motions.

With the Terrascope’s broad-band, high-dynamic-range system, we can use records from small to moderate earthquakes to determine the site response that might characterize a still larger event. This is particularly important for the Los Angeles basin, where thick alluvial deposits are common. Although real-time capability is not essential for these types of studies, real-time data retrieval would make station maintenance and data archival much simpler.
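As a concrete illustration of the spectral reasoning above, the following is a minimal sketch of a standard spectral-ratio estimate of site response, assuming a small earthquake recorded both at a basin site and at a nearby rock reference site. The function names and the synthetic traces are hypothetical and stand in for real Terrascope records; this is not the observatory’s actual processing chain.

```python
import numpy as np

def amplitude_spectrum(trace, dt):
    """One-sided amplitude spectrum of a seismogram sampled every dt seconds."""
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    amps = np.abs(np.fft.rfft(trace))
    return freqs, amps

def site_response(basin_trace, rock_trace, dt, eps=1e-12):
    """Spectral ratio of a sediment-site record to a nearby rock-site record.

    Under a linear-response assumption, the ratio from a small earthquake
    approximates the amplification the basin site would impose on larger shaking.
    """
    f, basin = amplitude_spectrum(basin_trace, dt)
    _, rock = amplitude_spectrum(rock_trace, dt)
    return f, basin / (rock + eps)

# Synthetic demonstration: random noise stands in for real records, and the
# "basin" trace is simply the rock trace amplified threefold.
dt = 0.01  # 100 samples per second
rng = np.random.default_rng(0)
rock = rng.standard_normal(4096)
basin = 3.0 * rock
f, ratio = site_response(basin, rock, dt)
print(ratio[1:5])  # roughly 3.0 at every frequency
```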

Moreover, the GPS system will monitor the buildup of strain on individual faults, as well as the transfer of strain among faults resulting from catastrophic failure on a given fault. Relative positions to precisions of 10^-7 (1 centimeter over distances of 100 kilometers) are becoming routine, with precisions of 10^-8 achievable. The GPS system will provide such high-precision surveys continuously, eliminating the problems associated with long time intervals between measurements and covering regions of the strain spectrum never before investigated.
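The quoted precisions amount to simple ratios; the short sketch below, using illustrative numbers only, works out the 10^-7 figure and the strain rate implied by a few centimeters per year of relative motion across a 100-kilometer baseline.

```python
# 1 cm of position uncertainty over a 100 km baseline:
precision = 0.01 / 100e3          # = 1e-7, the relative precision quoted above

def strain_rate(baseline_m, relative_motion_m_per_yr):
    """Engineering strain accumulated per year across a baseline."""
    return relative_motion_m_per_yr / baseline_m

# Illustrative value: 3 cm/yr of relative motion across a 100 km baseline.
print(f"relative precision: {precision:.0e}")
print(f"strain rate: {strain_rate(100e3, 0.03):.1e} per year")
```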

The first element of the Terrascope has been installed at the original site of the Caltech Seismological Laboratory in Pasadena’s San Rafael Hills. It is the prototype instrument for the other nine sensors in the 10-sensor network and was developed and installed in cooperation with Harvard University, the University of Southern California, and the U.S. Geological Survey. In addition to the 10 stations, a number of portable instruments—perhaps 10 more stations and six GPS receivers—are desired, so that some studies requiring closer spacing of the instruments can be undertaken as needed. The portable instruments could be “plugged into” the permanent network, using the same type of telemetry.

The real-time observation made possible by the Terrascope array, coupled with the ability to compare contemporary earthquakes with old ones, should make for an instrument without parallel.

I have left unaddressed the question of earthquake prediction—a subject in which seismologists hold widely differing opinions. Some optimists believe that short-term earthquake prediction (in a form that is helpful to individual citizens) is possible in the foreseeable future. Those involved in the Terrascope project feel that there is no scientific basis at present to suggest that such prediction will be generally possible in the near future. Even if some sort of earthquake prediction were eventually developed, we feel it would be subject to large uncertainties.

Given these uncertainties, the most effective way to reduce seismic hazard at present is to interpret seismic waves as quantitatively as possible, so that we can make reasonably accurate estimates of ground motions at various sites from earthquakes of different sizes at different locations. The broad-band, high-dynamic-range network of the Terrascope will be critically important in obtaining such estimates.

Don L. Anderson is professor of geophysics and director of Caltech’s Seismological Laboratory in Pasadena, Calif.