OPINION

May/June 2010

Extending the Usefulness of Seismic Hazard Studies

doi:10.1785/gssrl.81.3.423

On 12 January 2010, an M ≈ 7.0 earthquake hit Port-au-Prince, the capital of Haiti, killing more than 200,000 people in one of the deadliest earthquakes in history. Since Haiti is among the poorest countries in the world, this catastrophe was readily attributed to inadequate building quality and to the absence of modern geophysical institutions capable of assisting an equally modern civil protection organization. This view appears naïve and, at least in part, wrong.

On 6 April 2009, an M ≈ 6 earthquake brought heavy destruction to the city of l’Aquila, central Italy, and its surroundings, with a death toll of 308. Quite fortunately, the disaster was in this case much smaller, but not because Italy is among the elite G8 group of nations, nor because it has a long tradition in seismology, including a number of renowned universities and the richly funded Istituto Nazionale di Geofisica e Vulcanologia, which runs a technologically advanced nationwide seismic network. Nor was it because Italy has a uniquely efficient Dipartimento di Protezione Civile, capable of flawlessly accommodating the invasion (peaceful but monstrous in size) of 4 million people in Rome at the funeral of Pope John Paul II in 2005. No, the reason that the l’Aquila earthquake resulted in a lesser catastrophe than Port-au-Prince’s is essentially that a smaller earthquake hit an area with fewer inhabitants.

In fact, as is true of any other destructive earthquake, both events shook the scientific community no less than the ground, dramatically reviving the words of Ari Ben-Menahem (1995): “1992 14 April. Unpredicted earthquake of magnitude 6 in the heart of Europe, amidst hundreds of seismographs, computers, and professors of seismology. Just another reminder that even with all the accumulated seismological lore since the Lisbon earthquake (1755), we are still as surprised by earthquakes now as we were then.”

Predicting earthquakes is an old dream of mankind. Yet the more science learns about earthquakes, the more we realize that the physical processes governing them are so nonlinear, and so sensitive to virtually immeasurable boundary and initial conditions, that attempting to predict the next destructive event is much like attempting to predict the outcome of a spin of the roulette wheel.

This does not mean that earthquakes are totally unpredictable, but that, like all strongly nonlinear phenomena, they can only be predicted in probabilistic hazard terms, which are at the root of the basic defensive tool against earthquake damage: seismic codes.

Seismic codes exist in all countries and are cast in a variety of forms. Since damage is suffered mostly by the oldest and poorest constructions, the differences among the codes are usually less important than the fraction of the building stock they govern. In many countries, like Italy, progressively stricter seismic codes have been introduced in recent years, but they still apply only to new or refurbished construction, while old buildings (which in the center of most cities date back several centuries) are excluded; historic and artistic monuments represent an especially critical case.

In general, hazard estimates guide earthquake preparedness, which in turn has many facets: it starts with building codes and general civil protection plans and ideally extends to immediate actions like evacuation. Both the Port-au-Prince and l’Aquila events emphasized that our hyper-technological era is still largely impotent against destructive earthquakes. The l’Aquila earthquake has also exposed with peculiar clarity the fact that no miracles can be expected from seismological knowledge, and in particular that seismic hazard studies are useless in guiding short-term actions. Here is why.

A seismic hazard study was performed and published in the Bulletin of the Seismological Society of America by Boschi, Gasperini, and Mulargia in 1995. The technical bases of that work were clear and simple. Seismicity was modeled according to both asymptotic physical behaviors: 1) full regularity, a Gaussian process; and 2) total randomness, a Poisson process. Applied to the 54 independently defined Italian seismotectonic regions, this approach led to the result that two regions had an estimated probability close to 1.0 for the occurrence of an M ≥ 5.9 event in the following five years. These were the l’Aquila region and southeastern Sicily.
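
To make the Poisson branch of such an estimate concrete, the sketch below computes the probability of at least one M ≥ 5.9 event in a five-year window from an assumed regional rate; the rate value is purely illustrative and is not the figure used by Boschi et al. (1995).

```python
# Minimal sketch of the Poisson ("total randomness") branch of a
# hazard estimate. The rate below is an illustrative assumption,
# NOT the value estimated by Boschi et al. (1995).
import math

rate_per_year = 0.9   # assumed mean rate of M >= 5.9 events per year
window_years = 5.0    # forecast window

# For a Poisson process with rate lambda, P(N >= 1 in T) = 1 - exp(-lambda * T).
p_at_least_one = 1.0 - math.exp(-rate_per_year * window_years)
print(f"P(>= 1 event in {window_years:.0f} yr) = {p_at_least_one:.3f}")  # ~0.989
```

A rate of this order is what pushes a five-year occurrence probability close to 1.0 under the total-randomness assumption.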

However, hazard studies suffer from a number of problems (see Mulargia and Geller 2003). First, they lack a solid physical basis and are semi-empirical in nature: most formulations use statistical aphorisms like “return times” in Poissonian processes, rely on chimeras like “time-dependent” strain recharge mechanisms, invoke identical recurrences of “characteristic” earthquakes, or, in some recent formulations, describe a clustering that is defined on merely phenomenological grounds. Second, these semi-empirical approaches are necessarily focused on large events (the only ones of practical interest), which have very low occurrence rates and are therefore exposed to inefficient statistical inference. Third, the time-dependent hazard approaches have to cope with the Gutenberg-Richter law, which turns even apparently strong clustering of small events into inevitably low probability levels for the “large” events, on the order of at most 10⁻⁴ per day (see Kagan and Jackson 1994).
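
The third point can be made concrete with a back-of-the-envelope Gutenberg-Richter scaling; all numbers below are illustrative assumptions, not values from the l’Aquila swarm.

```python
# Minimal sketch of why a swarm of small events implies only a low
# daily probability of a large one under the Gutenberg-Richter law
# (cf. Kagan and Jackson 1994). All numbers are illustrative assumptions.
b = 1.0           # typical Gutenberg-Richter b-value
m0 = 2.0          # assumed completeness magnitude of the swarm catalog
M = 6.0           # "large" event of practical interest
swarm_rate = 1.0  # assumed rate of M >= m0 swarm events per day

# G-R: the fraction of events reaching magnitude M scales as 10^(-b * (M - m0)).
rate_large = swarm_rate * 10.0 ** (-b * (M - m0))
print(f"Implied rate of M >= {M}: {rate_large:.0e} per day")  # 1e-04
# For rates this small, the daily probability is essentially the rate itself.
```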

The first result of this state of affairs is the limited reliability of the different hazard estimates, which is acknowledged by giving them comparable consideration and attempting an application of democracy to science through “logic trees.” In quantitative terms, this means effective probabilities much lower than their face value: in the l’Aquila case, since the estimated probability was already ~1.0 fifteen years before the actual occurrence, the effective probability can a posteriori be set at 2 × 10⁻⁴ per day, or about 1 × 10⁻⁵ per hour, comparable to the values commonly inferred from clustering. Such values are too low to support any action disrupting societal life.
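
The arithmetic behind these effective figures is simply the near-unity estimate spread uniformly over the fifteen years that actually elapsed (a rough a posteriori conversion, not part of the original study):

\[
p_{\text{day}} \approx \frac{1.0}{15 \times 365} \approx 1.8 \times 10^{-4},
\qquad
p_{\text{hour}} \approx \frac{p_{\text{day}}}{24} \approx 8 \times 10^{-6} \approx 1 \times 10^{-5}.
\]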

The l’Aquila earthquake uniquely illustrates this issue: the first author of that hazard study has, since the time of its publication, been president of the Istituto Nazionale di Geofisica e Vulcanologia, the national organization that controls most of Italian geophysics and is responsible for the seismic surveillance of Italy. Yet in spite of this leadership position and the persistence of an earthquake swarm that had been affecting the l’Aquila region since January, no practical action was taken: the damage that an evacuation would inevitably have caused was judged too high compared with the low effective probability of event occurrence.

Hence, the usefulness of hazard studies at the time scale of a building’s lifetime, 10² years, is evidently countered by their lack of usefulness at the time scale of society, 10⁻²–10⁻¹ years (i.e., days to months). On the positive side, the l’Aquila earthquake suggests that we may seek hazard study usefulness at the intermediate time scale of 10⁰–10¹ years (i.e., 1–10 years). In fact, both the Haiti and l’Aquila events dramatically reminded us that seismic destruction derives from the combination of two basic ingredients: seismic hazard and vulnerability. In turn, vulnerability is the result of local amplification and poor construction. The l’Aquila earthquake firmly confirmed the predominance of the former, with amplification that is mostly one-dimensional in character: villages near the epicenter, a mere kilometer apart, suffered either total destruction or minor damage of identical buildings sited, respectively, on sediment or on fresh rock. Another striking example of site effects is the destruction of the historic city of l’Aquila, largely caused by a patchy subsoil structure with strong resonances at either 3 Hz or 0.6 Hz, the latter produced by a rigid surface layer lying over softer layers (De Luca et al. 2005).
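
To first order, a site’s fundamental resonance frequency in the simple 1-D case of a soft layer over rigid bedrock follows the quarter-wavelength relation f₀ = Vs/(4H). The sketch below uses purely illustrative parameter values; note that the 0.6 Hz resonance at l’Aquila arises from the more complex inverted, rigid-over-soft configuration described above.

```python
# Minimal sketch of the 1-D quarter-wavelength relation f0 = Vs / (4 * H)
# for a soft layer over rigid bedrock. Parameter values are illustrative
# assumptions, not measurements from l'Aquila.
def resonance_frequency(vs: float, thickness: float) -> float:
    """Fundamental resonance frequency (Hz) of a layer with shear-wave
    velocity vs (m/s) and thickness (m)."""
    return vs / (4.0 * thickness)

# The same material at different thicknesses resonates very differently:
print(resonance_frequency(250.0, 20.0))   # ~3.1 Hz
print(resonance_frequency(250.0, 100.0))  # ~0.63 Hz
```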

Retrofitting old and historic structures to withstand earthquake stresses is essentially a problem of cost. While reinforcing all structures at risk is generally unaffordable, the “comparatively-high-but-practically-too-low” probability levels of hazard studies can be used to target seismic vulnerability studies at the level of single buildings, experimentally studying the dynamic soil-structure behavior, which appears to be a most critical factor in determining structural damage (e.g., Erlingsson 1999).

In fact, the modal analysis of both soils and buildings is now made quick and inexpensive by passive (or natural-input) techniques, which use seismic microtremor as a random broadband excitation to measure the dynamic response (e.g., Lachet and Bard 1994; Huang and Lin 2001). Such an analysis can effectively identify the most vulnerable structures and guide reinforcement where it is most urgent, substantially extending the long-term time-independent hazard estimates and at the same time giving a raison d’être to time-dependent hazard estimates.
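
As an illustration, a Nakamura-type horizontal-to-vertical spectral ratio from three-component microtremor records (the class of passive technique examined by Lachet and Bard 1994) can be sketched in a few lines; the sampling rate, segment length, and band limits below are illustrative assumptions.

```python
# Minimal sketch of a microtremor H/V spectral-ratio estimate
# (Nakamura-type technique; cf. Lachet and Bard 1994). Sampling rate,
# segment length, and band limits are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def hv_ratio(ns, ew, ud, fs=100.0, nperseg=4096):
    """Horizontal-to-vertical spectral ratio from three-component
    ambient-noise records (1-D numpy arrays sampled at fs Hz)."""
    f, p_ns = welch(ns, fs=fs, nperseg=nperseg)
    _, p_ew = welch(ew, fs=fs, nperseg=nperseg)
    _, p_ud = welch(ud, fs=fs, nperseg=nperseg)
    # Average the two horizontal power spectra, then take the
    # amplitude (square-root) ratio against the vertical.
    hv = np.sqrt((p_ns + p_ew) / (2.0 * p_ud))
    return f, hv

# Usage: the frequency of the H/V peak is read as the site's
# fundamental resonance frequency.
# f, hv = hv_ratio(ns_trace, ew_trace, ud_trace)
# band = (f > 0.2) & (f < 20.0)          # restrict to a sensible band
# f0 = f[band][np.argmax(hv[band])]
```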

Francesco Mulargia
Dipartimento di Fisica, Settore di Geofisica
Università di Bologna
Viale Berti Pichat 8
40127 Bologna, Italy
f [dot] mulargia [at] gmail [dot] com

REFERENCES

Ben-Menahem, A. (1995). A concise history of mainstream seismology: Origins, legacy, and perspectives. Bulletin of the Seismological Society of America 85, 1,202–1,225.

Boschi, E., P. Gasperini, and F. Mulargia (1995). Forecasting where larger crustal earthquakes are likely to occur in Italy in the near future. Bulletin of the Seismological Society of America 85, 1,475–1,482.

De Luca, G., S. Marcucci, G. Milana, and T. Sanò (2005). Evidence of low-frequency amplification in the city of L’Aquila, central Italy, through a multidisciplinary approach including strong- and weak-motion data, ambient noise, and numerical modeling. Bulletin of the Seismological Society of America 95, 1,469–1,481.

Erlingsson, S. (1999). Three-dimensional dynamic soil analysis of a live load in Ullevi stadium. Soil Dynamics and Earthquake Engineering 18, 373–386.

Huang, C. S., and H. L. Lin (2001). Modal identification of structures from ambient vibration, free vibration, and seismic response data via a subspace approach. Earthquake Engineering and Structural Dynamics 30, 1,857–1,878.

Kagan, Y. Y., and D. D. Jackson (1994). Long-term probabilistic forecasting of earthquakes. Journal of Geophysical Research 99, 13,685–13,700.

Lachet, C., and P.-Y. Bard (1994). Numerical and theoretical investigations on the possibilities and limitations of Nakamura’s technique. Journal of Physics of the Earth 42, 377–397.

Mulargia, F., and R. J. Geller (2003). Earthquake Science and Seismic Risk Reduction. Dordrecht and Boston: Kluwer Academic Publishers, 338 pp.

 




