January/February 2000


Seismic networks are complicated collections of hardware, software, people, and personalities. The Alaska Seismic Network (ASN) is actually a combination of resources from the state of Alaska, the USGS, and the NOAA Tsunami Warning Center. We collectively operate and maintain nearly 350 stations for monitoring earthquakes and volcanoes in Alaska. Both the Alaska Earthquake Information Center and the Alaska Volcano Observatory are cooperative agencies involving the state of Alaska, the University of Alaska at Fairbanks, and the USGS through two MOUs. We each have our own responsibilities, but we all agree on the concept that our seismic stations should be considered multiple-use observatories. We have long understood, and implemented, cooperative real-time data sharing between agencies. In this opinion piece, I will use a description of ASN developments over the past few years to reach the conclusion that seismic networks are NOT all the same, that it is good to share both data and technology, and that a "one size fits all" philosophy requiring common software at all regional networks is inappropriate.

When I accepted the position of State Seismologist for Alaska five years ago, I was presented with the interesting and challenging task of modernizing this regional seismic network for the detection and location of tectonic earthquakes. This modernization was to have two parts: (1) upgrade the network to a hybrid system consisting of a mix of the existing analog, short-period, single-component stations and high-dynamic-range, digital, 3-component broadband stations; and (2) introduce real-time processing with a relational database system into the seismology laboratory at the Geophysical Institute, University of Alaska, Fairbanks. This was no easy task. However, it was and still is both interesting and challenging. The approach we adopted, which is still the current concept, is to keep abreast of what kinds of systems are available and to pick and choose from the available suites of software those pieces that best solve the specific problems at hand. Further, because no complete package for our situation existed, we also needed to develop a kind of software glue to put the pieces together and to augment them with our own software modules where deficiencies existed. (A more complete description of the ASN processing system will be submitted soon to BSSA or SRL by K. Lindquist and R. Hansen.)

Complications to our efforts for real-time processing and sensor upgrades come from our unique environment as compared to the rest of the U.S. Alaska's lower population density means that there is much less infrastructure for travel and data communications. Roads are few, and a large number of Alaskan communities are unreachable by road. Phone communications are limited, and the footprints of satellites such as those used by the USNSN typically do not cover much of Alaska or the Aleutian Islands. Moreover, the arctic environment compounds the lack of infrastructure with harsh weather conditions, a lack of sunlight during the winter months, and rugged terrain with large animals.

The low population density doesn't mean, however, that it is unimportant to provide seismic network coverage in Alaska. During the instrumentally recorded history of the world's seismicity, Alaska has produced several of the largest subduction zone earthquakes on record (1964, 1957, 1965, ...). It is an astonishing fact that of the two dozen or so magnitude 8 and larger earthquakes that occurred in Alaska during the last century, virtually none has been recorded on-scale by instruments near the rupture zone. Therefore, data revealing the physics of the seismic rupture process of Alaska-style giant subduction earthquakes are nonexistent. Detailed rupture dynamics, stress, strain, slip distributions, slip velocities, and even simply on-scale strong-motion records will not be available until a rupture of this type is monitored from nearby.

From a hazards point of view, Anchorage is a major metropolitan city with a population greater than 300,000. The Trans-Alaskan Pipeline, crossing much of the seismically active areas of the state, supplies a large proportion of U.S. oil and is at risk from a variety of potential earthquake sources, including the Denali Fault (the largest intraplate fault in the country). The largest fishing industry in the country is located in the Aleutian Islands and is susceptible not only to large earthquake hazards but also to volcanic eruptions and to very destructive tsunamis from both local and distant earthquake sources. It can also be noted that Alaska is the major exporter of tsunamis that impact other areas of the U.S. and Pacific rim countries.

Field System

Our requirements for a fieldworthy seismic system are extremely varied. Seismic station installations range from 425-foot-deep boreholes and dedicated buildings with temperature-isolated basement piers (as at the IRIS site COLA in Fairbanks), to metal culverts placed in the jagged rock outcrops of Waxell Ridge in the middle of the Bering Glacier, to L4C seismometers buried in the cones of active volcanoes in the Aleutian Islands. Likewise, a range of techniques exists for powering the various sites and telemetering data from them. Because we are so far north, solar panels become a problem in winter. We must use helicopters to go where no roads exist and install very low-power stations that run on solar power when it is available and on various batteries (particularly air cells) during the few months when solar gain is minimal.

We have now begun a concerted effort to place a backbone network of broadband stations throughout the state of Alaska. We are accomplishing this through a number of programs. First, through the Princeton Earth Physics Project (PEPP), ten Guralp PEPP instruments have been distributed to high schools throughout Alaska. One of the requirements for a school to receive a station was a continuous Internet connection. So far we have set up four systems that telemeter 20 sps broadband vertical data continuously into our network over the Internet. The data are incorporated and processed as part of the regional network and will contribute to a "virtual network" to be used by the schools themselves. In this case, we have both power and free telemetry, but we compromise with noisy installations.

Second, we are participating in the National Tsunami Hazard Mitigation Program, which involves the five western states, NOAA, the USGS, and FEMA. As part of the warning aspect of this program, we are augmenting the regional seismic networks within the five western states with 6-component digital seismic stations (a 3-component broadband seismometer and a 3-component strong-motion sensor). In Alaska we are trying to distribute the stations fairly evenly throughout the state, including the Aleutian Islands. This should result in about a half dozen stations in the Aleutians that can stay on scale during a megathrust event and another two and a half dozen stations spread throughout mainland Alaska. (We recognize this is still an extremely sparse distribution of stations for a state the size of Alaska, but at least it is a start.) All data associated with this program are to be shared with the tsunami warning centers and with any other participants with an interest in the data.

Finally, with modest state and USGS support we have managed to install a few other broadband sites within the network. These are located near Mt. McKinley, in the Anchorage area, and on St. Augustine Volcano. These sites use a combination of spread-spectrum radio modems, commercial phone systems, intranet communications, solar panels, batteries, and wind generators. It should be noted that the various broadband stations we can access in real time use a variety of different digitizers, formats, and protocols. This has driven our need to develop format-specific software for our data-acquisition modules.
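The variety of digitizers, formats, and protocols just mentioned is what drives format-specific acquisition software. As a minimal sketch of the pattern (in Python, with invented packet layouts and function names, not the actual UAF modules), each reader parses its native format and normalizes it into one common record that downstream processing can consume:

```python
# Hypothetical packet format "A": "STA/CHA:1,2,3"
def read_fmt_a(packet):
    head, data = packet.split(":")
    sta, cha = head.split("/")
    return {"sta": sta, "cha": cha,
            "samples": [int(x) for x in data.split(",")]}

# Hypothetical packet format "B": whitespace-separated "STA CHA 1 2 3"
def read_fmt_b(packet):
    parts = packet.split()
    return {"sta": parts[0], "cha": parts[1],
            "samples": [int(x) for x in parts[2:]]}

# One dispatch point per digitizer format; everything downstream of
# acquire() sees a single record type regardless of the source.
READERS = {"A": read_fmt_a, "B": read_fmt_b}

def acquire(fmt, packet):
    return READERS[fmt](packet)

rec = acquire("A", "COLA/BHZ:10,11,12")
print(rec["sta"], rec["samples"])  # COLA [10, 11, 12]
```

Adding a new digitizer then means writing one reader and registering it, leaving the rest of the processing chain untouched.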

Processing System

We began our work on the processing system about five years ago. At that time, we joined two software projects to create a near-real-time earthquake monitoring system that writes directly to a relational database. The Earthworm system acquires the seismic data from an analog network, and detects and locates seismic events. The Datascope software package developed by the IRIS JSPC combines a relational database system for seismic data with a host of processing and analysis tools. The combination resulted in a powerful tool for quickly analyzing and responding to seismic events. Events can be interactively watched, mapped, and reanalyzed immediately as they are posted by the Earthworm event associator. Our relational database of picks, hypocenters, and waveforms allows straightforward additions of other postprocessing tools, such as alternate location algorithms, spectral estimates and spectrograms, alternative associators, and a variety of magnitude-estimation routines, as well as examination and transport of the results across the Internet.
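To illustrate the value of the relational approach described above (Datascope itself uses flat-file CSS3.0-style tables; the schema and values below are simplified and hypothetical), here is a sketch in which an associator's solution and its picks are written to origin and arrival tables and immediately queried by a postprocessing tool:

```python
import sqlite3

# In-memory stand-in for the seismic database (hypothetical schema
# loosely modeled on CSS3.0-style origin/arrival tables).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE origin (
    orid INTEGER PRIMARY KEY,       -- origin (hypocenter) id
    lat REAL, lon REAL, depth REAL, -- location, depth in km
    time REAL,                      -- origin time, epoch seconds
    ml REAL)                        -- local magnitude
""")
conn.execute("""CREATE TABLE arrival (
    arid INTEGER PRIMARY KEY,       -- arrival (pick) id
    orid INTEGER REFERENCES origin(orid),
    sta TEXT, phase TEXT, time REAL)
""")

# As the event associator posts a solution, it is written to the tables...
conn.execute("INSERT INTO origin VALUES (1, 61.5, -149.9, 35.0, 951264000.0, 4.2)")
conn.executemany("INSERT INTO arrival VALUES (?, ?, ?, ?, ?)",
                 [(1, 1, "COLA", "P", 951264010.5),
                  (2, 1, "SKN",  "P", 951264012.1)])

# ...so any postprocessing tool can immediately join picks to hypocenters.
rows = conn.execute("""SELECT o.orid, o.ml, COUNT(a.arid)
                       FROM origin o JOIN arrival a ON a.orid = o.orid
                       GROUP BY o.orid""").fetchall()
print(rows)  # [(1, 4.2, 2)]
```

The point is not the particular engine but that every downstream tool (relocators, magnitude routines, spectrogram viewers) works from the same queryable tables rather than private file formats.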

We have written alarm, display, and response scripts that take advantage of these RDBMS capabilities. In addition to paging capabilities, a program displays a map of the monitored region. Within minutes of the occurrence of earthquakes, symbols appear on the map to denote their locations, depths, and magnitudes. Clicking on a symbol brings up a response menu that allows several actions: creation of a subset database for the event; graphical display of the corresponding waveforms and editing of the automatic picks; entry into an analyst-review station for relocation of the event; and issuing information releases via e-mail, fax, phone, and the Web.
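As an illustration of the kind of alarm rule such scripts encode (the thresholds, region bounds, and function names here are invented for the sketch, not the actual ASN criteria), a minimal version filters newly posted hypocenters and formats an information release:

```python
# Hypothetical alarm criterion: magnitude threshold plus a lat/lon box
# roughly covering Alaska (bounds are illustrative only).
def should_alarm(event, min_magnitude=3.5, region=(54.0, 72.0, -170.0, -130.0)):
    lat_lo, lat_hi, lon_lo, lon_hi = region
    return (event["ml"] >= min_magnitude
            and lat_lo <= event["lat"] <= lat_hi
            and lon_lo <= event["lon"] <= lon_hi)

# Format a brief preliminary information release for e-mail/fax/Web.
def information_release(event):
    return (f"Preliminary location: M{event['ml']:.1f} "
            f"at {event['lat']:.2f}N {-event['lon']:.2f}W, "
            f"depth {event['depth']:.0f} km")

event = {"ml": 4.2, "lat": 61.5, "lon": -149.9, "depth": 35.0}
if should_alarm(event):
    print(information_release(event))
```

In the real system the same test that triggers the pager also drives the map display and the response menu, since all three read from the same database.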

While I can paint this nice picture of a system that has been termed Iceworm (as compared to Earthworm, or the University of Washington's version known as Sunworm), it was not without its limitations. The main limitations early on had to do with the ability to expand the system to include data from other sources. Such sources included the IRIS stations within Alaska that we wished to process as part of our network, as well as new broadband digital stations built on a mix of digitizers with different formats. We also needed to collect data from sources in geographically remote areas of the state and to transmit the collected data efficiently to our central processing site. This gave rise to another effort at system expansion. New formats were defined, telecommunications subsystems were developed, and, most importantly, another major software component was introduced. The Antelope Environmental Monitoring System from Boulder Real Time Technologies provides a robust new platform for near-real-time processing of data, data archiving, analyst review (again using the Datascope system), and data exchange and long-haul telemetry within the University and between UAF and outside institutions via Internet protocols. The Antelope system brought a waveform data exchange capability where none existed within Earthworm, and it surpassed the similar subsystems developed at UAF in ease of use, flexibility, and robustness.

We have now completely integrated, again with a software glue, a system that can utilize modules from several suites of software systems. I agree with Steve Malone's opinion (SRL, July/August 1999) that the two efforts are actually not mutually exclusive, and in fact Alaska has taken advantage of the "complementarity" Steve describes. However, I would never have considered the development of Antelope to have brought on an "Earthworm/Antelope fiasco." Instead, I now see a much larger toolbox from which to put together a system specific to the needs of a particular region, as I have been describing above. Furthermore, opinions about this type of software development are much less important than the underlying technical issues, which are often glossed over. As Steve suggests, "neither system is the Messiah nor the devil to seismic processing" may well be true, but the phrase "the devil is in the details" is more to the point. We have found, through testing and use of the various systems and from a software engineering point of view, that the technical issues are often what dictate which particular solution is appropriate. At this time, we continue to use a mixture of several systems from several places (e.g., BRTT, Kinemetrics, Guralp, REFTEK, UAF, USGS, AFTAC, NORSAR). (Although I will say that the more complete Antelope system has emerged as the "process driver" with its strong and robust Real_Time_Executive [rtexec program] capability for managing the operation of all the modules.) So rather than being in an Earthworm "camp" or an Antelope "camp," let's be thankful that there are several bright people developing useful software products. It is most important that science can flourish through a healthy exchange of ideas and data.
We will not accomplish our network goals through limiting ourselves to a particular "camp" but rather by continuing to develop our regional centers with the idea of common protocols and formats without necessarily running the same subroutines. Such capability for innovation and variation is indeed the basis for progress.

Virtual Seismic Network

My last comment has to do with exchanging seismic data between regional networks, and with going one step farther by making data available to anyone via near-real-time delivery over the Internet. Let me expand on my example above that data from the stations being installed as part of our tsunami initiative will be made available to all participants in that initiative. This kind of network-to-network exchange makes sense and should be adopted universally for all reasonable data requests, whether or not the requesters are participating in a particular network-expansion effort. We participate in such exchanges with the Tsunami Warning Centers, USGS/NEIC, USGS/AVO, and UCSD, using a variety of different software solutions. IRIS has given us a couple of models for sharing seismic data within a framework of proprietary concepts and fairness. PASSCAL experiments are necessarily difficult to field and emerge from a carefully proposed and reviewed process, so allowing the data to be proprietary for a period of time seems most fair. However, other programs within IRIS, notably the JSP and the Broadband Array, have required that data (and particularly near-real-time data) be made available immediately. This has also always seemed fair to me, based upon the circumstances under which the experiments were planned and funded. In a similar manner, it seems that data from seismic networks funded by public agencies for regional and national monitoring should be made available to responsible scientists in a timely fashion. Given the software "toolboxes" available to us now, it is clear that a researcher at a university with no seismic network could design an experiment and begin to acquire seismic data pertaining to his or her research goals from a variety of regional or global network sources, essentially constructing a virtual seismic network (VSN).
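The VSN idea can be sketched in a few lines: a researcher selects, from several sources' station lists, just the stations that meet the criteria of a planned experiment. All source names and station metadata below are invented for illustration:

```python
# Hypothetical station inventories published by two data sources.
SOURCES = {
    "regional_net": [{"sta": "COLA", "lat": 64.9, "band": "broadband"},
                     {"sta": "SKN",  "lat": 61.8, "band": "short-period"}],
    "global_net":   [{"sta": "ADK",  "lat": 51.9, "band": "broadband"}],
}

def build_vsn(sources, band="broadband", min_lat=50.0):
    """Assemble a virtual network from every station, at any source,
    that satisfies the experiment's criteria."""
    vsn = []
    for src, stations in sources.items():
        for s in stations:
            if s["band"] == band and s["lat"] >= min_lat:
                vsn.append((src, s["sta"]))
    return sorted(vsn)

print(build_vsn(SOURCES))  # [('global_net', 'ADK'), ('regional_net', 'COLA')]
```

The selection step is trivial; the real work, as the text notes, lies in agreeing on formats and protocols so that data from each source can actually be requested and merged.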

We have experimented with this concept since 1995, beginning with data-exchange experiments with Hokkaido University. Later, a group of us presented a poster at the 1998 AGU meeting on such a "virtual" network for global monitoring. I mentioned earlier the educational possibilities of a virtual PEPP network, and SeisNet, a VSN system from the University of Bergen, was recently described in SRL. The only necessary caveat is that dataflow from a central collection center not be harmed by the process of supplying data (e.g., through undue loads on computer facilities, people, or Internet bandwidth).

Finally, I reiterate that concepts of cooperation, collaboration, and data sharing need to be fostered throughout our scientific community, with a focus on acceptable and appropriate formats and protocols. I am enthusiastic about the proposed National Seismic System structured around regional centers that have the authority to address particular problems pertaining to their areas of interest in appropriate and learned ways. This allows for focused systems development rather than an ad hoc engineering-by-committee approach. Remember, not all regional networks are the same, and a "one size fits all" network-processing strategy is not necessarily an approach that allows for innovation and progress.

Roger Hansen
Alaska State Seismologist
Geophysical Institute
University of Alaska at Fairbanks

To send a letter to the editor regarding this opinion or to write your own opinion, contact Editor John Ebel by email or telephone him at (617) 552-8300.

Posted: 24 February 2000