OPINION

November/December 2003

The Nature of Earthquake Prediction

Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least two decades that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. The analogy of catching a rabbit in an overgrown, confined field may be appropriate. You do not just start out looking for the rabbit; instead you build a fence dividing the field in two and then decide which half the rabbit is in, thereby gaining one bit of information. You iterate this process until you have located the rabbit "close enough for practical purposes." This is approximately how earthquake prediction proceeds in the real world, with time and position along a fault comprising the two dimensions of the search. (I assume here that we are considering for the moment only large earthquakes, that is, those capable of inflicting serious damage on a regional scale; in California this means events of about M 6.7 and larger.) This more realistic perspective on the problem lays to rest the "red herring" that "earthquake predictions might do more harm than earthquakes." These imaginary concerns are predicated on the fantasy of a prediction that precisely specifies time, place, and magnitude; in the real world a progression of probabilities that narrows the space-time window in small steps clearly carries no such threat.
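
To make the rabbit analogy concrete, here is a toy sketch (in Python, purely illustrative; the function and numbers are mine, not part of any operational procedure) of how successive halving narrows a search window, gaining one bit of information per step:

```python
# Toy illustration of the "rabbit in the field" analogy: each split halves the
# remaining window and yields one bit of information. The numbers below are
# illustrative only; no real forecast narrows this cleanly or this regularly.
import math

def halvings_needed(window_size, target_size):
    """Number of binary splits needed to narrow window_size down to target_size."""
    return math.ceil(math.log2(window_size / target_size))

# Example: narrow a 30-year time window to roughly a one-week alert window.
weeks_in_window = 30 * 52          # about 1,560 weeks
bits = halvings_needed(weeks_in_window, 1)
print(f"About {bits} successive halvings (bits) separate a 30-year forecast "
      f"window from a one-week alert window.")
```

Of course each real "halving" comes from hard-won data and understanding, not from a formula; the point is only that the window shrinks in increments, never all at once.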

In the last few decades, long-term forecasts have been produced for a few well-characterized fault systems in California, Japan, Turkey, and a few other locations. Tightly focused monitoring experiments aimed at short-term prediction are underway at Parkfield, California, and in the Tokai region of Japan. Since the Kobe earthquake, the Japanese have closely monitored their extensive real-time GPS and seismic networks, and have in effect expanded their prediction efforts to include the entire country. The recent installation of down-hole seismic and strain arrays along the Hayward Fault, combined with a growing number of continuous GPS installations, has created a de facto prediction experiment east of San Francisco Bay as well. Only time will tell how much progress toward short-term prediction such dense arrays and intensive monitoring efforts make possible. In addition to their inherently statistical nature, short-term earthquake predictions differ from traditional geophysical problems in that they must be issued under extreme time pressure, with inadequate information, and with human lives at stake; in these respects they resemble the decisions made by doctors or generals more than most scientific judgments. This is not to say that the highest standards of scientific judgment do not apply, but rather that operational earthquake prediction brings additional demands.

In medicine and military science, although there is an emphasis on theory and academic training, there is a clear understanding that there is no substitute for actual practice. No one, when seriously ill, will go to a doctor who has only theoretical knowledge of medicine, and military history is replete with stories of the disasters that befell textbook generals. Medicine and war are the provinces of practitioners, those who combine theoretical knowledge with a lifetime of experience and make real decisions with inadequate data, with human lives at stake. These are realms in which it is not honorable to respond in a critical situation with "I don't know." When you take a seriously ill child to a doctor, you don't care whether she knows precisely what to do or not. With a child's life at stake you expect her to do her best--that's what she was trained for, that's what you pay her for, that is her responsibility. She is the expert whose entire training and experience have prepared her to make a tough call, with a human life at stake, with incomplete information. I believe earthquake prediction is in a similar category and that seismologists bear a similar responsibility.

At Parkfield over the last two decades, a great deal of experience has been gained with dense instrumentation networks, high-quality data sets, and round-the-clock monitoring. The "official earthquake prediction" issued for Parkfield in 1985 made it possible to work with State of California and local officials to pass legislation defining spheres of responsibility and limits on liability when earthquake predictions, warnings, notifications, etc. are issued. A chain of command was established with the California Office of Emergency Services, and working relationships were established between the scientists responsible for the experiment and the officials responsible for initiating responses at the state and local level. Even the "false alarms" that have resulted in public warnings at Parkfield have provided the public and emergency-response personnel with opportunities to practice their response procedures and plans. Interestingly, there have, to my knowledge, been no complaints from the public about false alarms, and some people have stated clearly that they understand that the scientists are just "feeling their way," and that they expect them to continue to issue warnings when they feel they have significant information that an earthquake might be imminent.

In addition, high-quality data from the dense arrays of instruments at Parkfield, combined with a focused research program, have resulted in significant progress on a variety of fundamental scientific problems, including Tom McEvilly and his student's work on repeating "nano-earthquakes" within a detailed 3D velocity model, a variety of "stress transfer" studies, and the recording of creep events on Middle Mountain on a variety of instruments. The Parkfield "instrument cluster" has provided a prototype for the soon-to-be-installed Plate Boundary Observatory (PBO) clusters, and an essential shake-down cruise for dense instrument clusters elsewhere. The detailed understanding of the tectonic environment at Parkfield that has grown out of this work provides the framework for the deep-drilling effort that will continue there next spring. The San Andreas Fault Observatory at Depth (SAFOD) project involves drilling a hole through the fault at 3-4 km depth, with the hope of installing instruments directly within the actively deforming fault zone.

Elsewhere in California, limited steps in the direction of operational earthquake prediction have also occurred, building on the progress and procedures developed for Parkfield. In the San Francisco Bay area (SFBA), the relative success of the long-term forecast of the Loma Prieta earthquake (M 6.9, 18 October 1989), coupled with the short-term public warnings in June 1988 and August 1989 that followed the two M 5 Lake Elsman earthquakes, contributed to a perception that scientists were beginning to have some long- to medium-term prediction capability. Had Tony Fraser-Smith's ultralow-frequency electromagnetic anomaly been available in real time, it is entirely possible that a short-term prediction would have been issued in the first weeks of October 1989. This would have been widely viewed as a successful earthquake prediction, whatever concerns we may have harbored about our lack of experience with, or basic understanding of, such signals. Again, to my knowledge there were no complaints about false alarms; rather, the Emergency Services Manager for Oakland has been quoted as saying that, following the warning in August 1989, two months before the main event, "all ... departments ran drills to prepare for an imminent earthquake. This made a tremendous difference in the city's response when the [Loma Prieta] earthquake struck."

The public is also coming to understand more fully the very serious threat posed by the major plate-boundary faults in California, thanks in large part to the series of long-term forecasts that have been issued since the early 1980s. Many people are aware that most scientists agree there is a 60%-80% chance of an earthquake of about M 7 or larger in the SFBA in the next 30 years, with comparable, or slightly lower, probabilities in southern California. Many have come to understand that it is their responsibility to take action to mitigate the risk to their families, their homes, their workplaces, and their communities. Many members of the press and public also realize that when felt earthquakes occur, or when news of changes in the pattern of data on geophysical instruments leaks out, these might portend an imminent earthquake. Quite reasonably, at such times the press and the public want to know how concerned scientists are about this possibility, and since they pay the scientists' salaries to work on this question, they reasonably expect an unhedged answer. Like it or not, when seismologists answer those questions, they are practicing earthquake prediction.
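
As a rough illustration of what such numbers imply (assuming, purely for the sake of arithmetic, a time-independent Poisson model, which is not the model the official forecasts actually use), a 30-year probability can be converted to an approximate annual probability:

```python
# Back-of-the-envelope conversion of a 30-year earthquake probability to an
# approximate annual probability, assuming a time-independent (Poisson) rate.
# The official SFBA forecasts use more elaborate, time-dependent models; this
# sketch is only meant to give a feel for the numbers quoted above.
import math

def annual_probability(p_window, window_years):
    """Annual probability implied by p_window over window_years (Poisson assumption)."""
    rate = -math.log(1.0 - p_window) / window_years   # equivalent events per year
    return 1.0 - math.exp(-rate)

for p30 in (0.60, 0.80):
    print(f"A 30-year probability of {p30:.0%} corresponds to roughly "
          f"{annual_probability(p30, 30):.1%} per year.")
```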

The real question is whether the earthquake research community will go ahead with installing the instrumentation and monitoring systems and make the commitment to a focused research program that increases the likelihood of providing some form of useful warning before the next catastrophic earthquake in California, irrespective of whether or not they "know how to predict earthquakes." The recently funded PBO will provide a series of Parkfield-like instrument clusters along the major plate-boundary faults in California; in conjunction with the ongoing improvements in GPS and seismic networks, the entire state will in effect become an earthquake prediction experiment in the coming decade. With an increasingly concerned and sophisticated press and public to work with, I believe scientists could begin to seriously explore the realities of operational earthquake prediction throughout the areas of the state most at risk. In view of the manifold uncertainties, however, what good is likely to be accomplished?

It is my opinion that the direct and indirect benefits of a clear public commitment to an all-out effort to predict, that is, to further narrow the space-time window within which the next large earthquake in California is expected to occur, would far outweigh the risks.

  1. Such a commitment would provide a highly visible symbol of society's determination to face up to the earthquake threat. It would provide an incomparable educational vehicle for further informing the public about the earthquake problem, and a natural means of reminding them occasionally of the need to sustain their remediation efforts. The reality is that press stories concerning efforts to predict earthquakes are front-page news and reach many people; very few other earthquake stories receive comparable coverage until after an earthquake has occurred.
  2. Further refinement of the long- to intermediate-term probabilities would help engineers and politicians set priorities for building and infrastructure upgrades. This ongoing dialog would encourage scientists, engineers, the press, public officials, and the public to work together on earthquake preparedness and hazard mitigation.
  3. It would provide a much-needed stimulus to earthquake science in university and government laboratories.
  4. Scientists might just get lucky and save some lives. We came close in 1989; with the much better data and somewhat greater understanding available today, it is unduly pessimistic to assume nothing in this direction is possible. It's not a question of knowing what will come of such an effort, but rather being willing to try in the face of the uncertainties.

As instrumentation improves along the faults adjacent to the major cities in California, there will be a rapidly increasing amount of high-quality data potentially relevant to the question of earthquake prediction. When there is an anomaly on several instruments, or an M 5 on a significant fault raises the specter of a foreshock, information about these concerns will reach the press and the public; the reality of being in the earthquake prediction business, like it or not, will be at hand. The question is whether the scientists on the front line will be ahead of the curve and go to the press with clear, unhedged evaluations of what is going on, or whether they will be hiding, hoping that nothing happens, reacting only when they are forced to by press inquiries.

Currently it is extremely unfashionable to be seen as working on earthquake prediction, to the point that researchers joke about not including any such phrases in a proposal if they wish to have any hope of its being funded. It is admittedly difficult to convince scientists to make a serious research commitment to such a difficult problem, fraught with hazards, with absolutely no assurance of success. But no matter how difficult the problem may be, there is little doubt that the people who fund earthquake science--that is, the taxpayers of this country whose lives, homes, families, and jobs are at risk, and their elected representatives who appropriate the money--believe they are paying for a serious effort to provide better information about when and where the next large earthquake will strike. The public may have a clearer view of the subject in this case than many scientists; the public seems to understand that the "effort to predict" is the essential core of any healthy scientific discipline--without it, a discipline may be little more than collecting flowers.

Allan G. Lindh
U.S. Geological Survey
Menlo Park, CA 94025
aglindh@cruzio.com
4 September 2003


