OPINION

March/April 2011

The Magnitude of the Problem

doi:10.1785/gssrl.82.2.167

“On Monday morning last, about a quarter past two, St. Louis and the surrounding country, was visited by one of the most violent shocks of earthquake that has been recorded since the discovery of our country.” (Louisiana Gazette, 21 December 1811)

This is where the story begins, in the wee hours of the morning on 16 December 1811. Many early Americans, in particular in the sparsely populated midcontinent, were awakened that night by seismic waves from the powerful mainshock and kept awake by a robust aftershock sequence. Of those who experienced the shaking, only a small percentage documented their experiences in written accounts that have been handed down to us. The collection of known archival accounts numbers only in the hundreds. Some of the accounts appeared to defy credulity: waterfalls appearing on the Mississippi River, the course of the river itself temporarily reversed, riverbanks and even entire islands collapsing. Other accounts reveal remarkable insight, even prescience; for example, medical doctor Daniel Drake, who observed that shaking was stronger in the Ohio River Valley than in the adjacent uplands and went on to ascribe the difference to the fact that strata in the river valley are “loose.” Samuel Mitchill, a representative in the U.S. Congress with training in geology, set out to collect accounts in the hope that they would lead him to “something like a tolerable theory of earthquakes.” Along the way he came to appreciate the challenge of the task he set himself. “The phenomena,” he wrote in 1815, “were described in the most fearful and alarming strains by several writers. Much exaggeration was interwoven with some of the narratives. Some, indeed, were tinctured with fable and burlesque.” (Many of the accounts that likely impressed Mitchill as fable and burlesque have in fact found support in modern science: for example the waterfalls, which are consistent with the rupture scenario pieced together from a remarkable marriage of modern science and detailed archival accounts.) To arrive at what he considered reliable accounts Mitchill focused on independent eyewitness accounts that told a consistent story.


In the end Mitchill admitted, with palpable disappointment, that he had failed in his goal of developing a general theory of earthquakes. But he added, “although materials may yet be wanting for a perfect theory, it is a matter of some consolation to have assembled into one body, the phenomena of the most memorable earthquakes that ever agitated these parts of North America, and to have made a record of them for my sagacious and fortunate successors.”

Sagacious and fortunate successors, we. The modern seismological community has inherited the fruits of Mitchill’s and others’ labors along with a framework of understanding that Mitchill himself could not have imagined. We can dismiss early notions of earthquakes being associated with subterranean fires; we know that the 1811–1812 New Madrid earthquakes were caused by motion on a system of faults that released stored stress. We know the sequence was preceded by at least two similar sequences, circa A.D. 1450 and 900. We know that the Reelfoot fault in particular has hosted multiple large earthquakes during the Holocene, and that many faults appear to have highly episodic activity over millions of years. We also have evidence for Quaternary activity in the vicinity of the central New Madrid seismic zone. We have the tolerable (general) theory of earthquakes that Mitchill was seeking, but still we trip over a question that Mitchill didn’t recognize as a problem: Why has seismic activity been concentrated in the New Madrid seismic zone over the past few thousand years? With the geodetically constrained strain accrual rate appearing to be quite low, how can the zone produce M 7.5+ earthquakes every 400–500 years? (Published magnitude estimates for the largest mainshocks have ranged from M 7 to as high as 8¾. In the 2008 version of the National Seismic Hazard maps, M 7.7 is given the highest weight in a logic-tree approach, with lower weight given to values as low as M 7.3 and as high as M 8.0.)
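
To get a feel for what is at stake in that range of published estimates, it helps to recall that magnitude is a logarithmic measure of seismic moment. The short sketch below is my own illustration rather than part of the original argument; it simply applies the standard moment magnitude relation (M0 ≈ 10^(1.5 Mw + 9.1) N·m) to the values quoted above.

```python
# Illustrative only: compare the seismic moments implied by the magnitude
# estimates discussed above, using the standard moment magnitude relation
# M0 = 10**(1.5*Mw + 9.1) in newton-meters.

def seismic_moment(mw):
    """Seismic moment (N·m) for moment magnitude mw."""
    return 10 ** (1.5 * mw + 9.1)

reference = 7.0  # low-end estimate discussed in the text
for mw in (7.0, 7.3, 7.7, 8.0):  # spanning the hazard-map logic-tree branches
    ratio = seismic_moment(mw) / seismic_moment(reference)
    print(f"M {mw:.1f}: M0 = {seismic_moment(mw):.2e} N·m "
          f"(~{ratio:.0f}x an M {reference:.1f} event)")
```

An M 7.7 event thus releases roughly an order of magnitude more moment than an M 7.0 event, which is why the choice of preferred magnitude matters so much for the strain-budget question raised above.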

Localized sources of stress, ancient stored stress … a number of clever ideas have been proposed in recent years to reconcile the seemingly irreconcilable. Some have suggested that significant strain could still be hiding within the relatively sparse GPS constraints in the midcontinent. One answer to the question is that the 1811–1812 New Madrid earthquakes were, in fact, considerably smaller than M 7.5, closer to or even slightly below M 7. Values this low are not inconsistent with the available constraints, including the macroseismic intensities, the extent of liquefaction, and inferred mainshock rupture parameters.
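
A minimal back-of-the-envelope sketch of that arithmetic is given below. The fault geometry, rigidity, and 500-year recurrence interval are illustrative assumptions of mine, not values taken from this article; the point is only that the average slip rate implied by repeating mainshocks drops by roughly an order of magnitude if the events were closer to M 7 than to M 7.5, which is far easier to square with a low geodetic strain accrual rate.

```python
# Back-of-the-envelope sketch (assumed, illustrative parameters only):
# what slip rate would repeating mainshocks of a given magnitude imply?

MU = 3.0e10         # shear modulus (Pa), a standard crustal value
LENGTH = 140e3      # assumed fault length (m)
WIDTH = 15e3        # assumed seismogenic width (m)
RECURRENCE = 500.0  # assumed recurrence interval (yr)

def seismic_moment(mw):
    """Seismic moment (N·m) from moment magnitude mw."""
    return 10 ** (1.5 * mw + 9.1)

for mw in (7.5, 6.9):
    slip = seismic_moment(mw) / (MU * LENGTH * WIDTH)  # average slip per event (m)
    rate = slip / RECURRENCE * 1000.0                  # implied slip rate (mm/yr)
    print(f"M {mw}: ~{slip:.1f} m of slip per event, ~{rate:.1f} mm/yr "
          f"if repeated every {RECURRENCE:.0f} years")
```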

On the other hand, much higher values can be (and have been) inferred assuming high-stress-drop sources and rupture parameters extended in every direction. Values approaching M 8 have found support from analyses of intensities. The truth is, while macroseismic intensities provide the most direct constraint on magnitudes, they are a poor constraint, in particular because we can only compare them to intensities from significantly smaller instrumentally recorded calibration events. This is the fundamental problem that has plagued every investigation of intensities dating back to the seminal work by Otto Nuttli in 1973. The principal New Madrid earthquakes were bigger than any central and eastern U.S. earthquakes that have occurred over the last 100 years. But how much bigger? This is not a problem that mathematical cleverness alone is likely to ever solve.

Perhaps it is time to follow Samuel Mitchill’s lead: to focus on what we can say with confidence. Perhaps it is time to admit that size does not, in fact, matter. Were the principal New Madrid earthquakes closer to M 8 than M 7? My own recent work suggests the latter. With Occam’s Razor as a guiding principle it is an appealing solution because it reconciles the apparently irreconcilable with a minimum of appeal to special circumstance, i.e., the moment release rate can be reconciled with observational constraints and theoretical predictions about the strain accrual rate. Of course this does not mean it’s necessarily the right solution. But what matters for hazard is not just magnitude, or attenuation, or site response; what matters for hazard are the resulting ground motions. We might be left with significant uncertainties regarding all three key ingredients, and thus significant uncertainty about hazard. But it is important to remember that where ground motions are concerned, thanks to the efforts by Mitchill and others, we have some actual ground truths.

The 1811–1812 New Madrid earthquakes did not shake any modern engineered structures, and one key unknown that remains is the severity of shaking in the near field. The extent and severity of liquefaction point to strong shaking, but how strong? That this remains an open question is not hugely shocking: we have little to no information about near-field shaking for a number of very important, very recent earthquakes, including the 2001 Bhuj, India, and 2010 Haiti events.

For the New Madrid earthquakes we have some fairly robust constraints on the shaking levels throughout the midcontinent and along the Atlantic seaboard. These constraints reveal variability in ground motions that is in many cases consistent with general expectations for ground motions in an intraplate (high-Q) setting, with considerable site amplification along coastlines and major river valleys:

  • Shaking was weakly felt to distances of at least 1,700 km.
  • The energetic aftershock sequence included a number of identifiable events with estimated magnitudes of M 6–6.3.
  • Light damage occurred to distances of 900–1,000 km.
  • Shaking along the Mississippi River Valley caused widespread disruption and failure of riverbanks.
  • Shaking along the Ohio River Valley was damaging (MMI VI) to distances of about 800 km. In Louisville, Kentucky, for example, shaking damaged gable ends and parapets and knocked down many chimneys. Away from the immediate river valleys shaking was 1–2 MMI units lower.
  • In St. Louis, about 300 km from the New Madrid seismic zone with a (county) population of about 5,700, the eyewitness account of the first mainshock concludes, “No lives have been lost, nor has the houses sustained much injury, a few chimneys have been thrown down, and a few stone houses split.”
  • In Ste. Genevieve, a settlement 75 km south of St. Louis that had been moved away from the Mississippi embayment (onto presumed hard rock) because of a flood in the late 18th century, the earthquakes were felt but reportedly caused no damage.

When researchers seek to predict the effects of a large earthquake in a low-strain-rate region, they generally grapple with some substantial unknowns, starting with the question, “What is the appropriate Mmax to assume?” For New Madrid we are ahead of the game because it is generally assumed that we have witnessed the maximum magnitude (Mmax) event, and we know something about the shaking effects it caused.


Of course size does matter. Macroseismic observations from the early 19th century don’t provide much illumination of long-period shaking, which could impact today’s big buildings and structures (bridges, for example). In any prediction of ground motions from scenario events, however, it is critical to close the loop: to compare the effects of the predicted next New Madrid Big One with the ground truths from the last New Madrid Big One(s). To a reasonable approximation, the loop needs to close. The predicted low- and moderate-intensity field for a scenario event should not, for example, be grossly different from what happened in 1811–1812.

A further question has been raised regarding New Madrid hazard, namely whether, given the low observed strain rate, we need to worry about the next New Madrid Big One at all. There is, I think, little disagreement that the Holocene rate of New Madrid seismic zone activity has not persisted through earlier geological time and is unlikely to continue indefinitely. Post-glacial rebound (PGR) is an attractive explanation for clustered Holocene activity because 1) it explains why the New Madrid seismic zone has turned on in the last few thousand years, and 2) work by Stéphane Mazzotti and colleagues (Mazzotti et al. 2005) shows that PGR can account for activity along the St. Lawrence valley. If post-glacial rebound is driving seismic activity now, clearly at some point it will stop. But models of post-glacial rebound predict that the strain rate will decay quite slowly, remaining at present levels for at least another 10 ka or so. Further, while it is a problem to find enough strain to account for M 7.5+ earthquakes every 500 years, it is not a problem, at least within the current constraints, to account for an M 6.8–7 earthquake every 500 years. Presumably after the tragic experiences in Haiti of 12 January 2010, nobody needs to be convinced of what a “measly” M 7.0 earthquake can do in a region that is ill-prepared for earthquakes.

REFERENCES

Mazzotti, S., T. S. James, J. Henton, and J. Adams (2005). GPS crustal strain, postglacial rebound, and seismic hazard in eastern North America: The Saint Lawrence valley example. Journal of Geophysical Research 110, doi:10.1029/2004JB003590.

Susan E. Hough
hough [at] usgs [dot] gov

 

