SSA 2011 Annual Meeting
The following sessions have been organized for the 2011 Annual Meeting program.
Click session titles to view the abstracts and schedule for that session.
This session is organized jointly by SSA and the European Seismological Commission (ESC).
Evidence of large earthquakes before the advent of modern seismometers is derived from historical, geological, and archaeological records. In this session, we explore the field of archaeoseismology—the study of the evidence of ancient earthquakes at archaeological sites. Archaeological records have the potential to yield valuable data on the timing of ancient earthquakes as well as on the intensity of ground shaking and the parameters of fault rupture. Many archaeological sites contain abundant material useful for the precise dating and regional correlation of earthquake effects that are necessary for estimating the timing, locations, magnitudes, and recurrence of large earthquakes. Because many of them have been occupied over long periods of time, archaeological sites may have experienced multiple cycles of seismic activity and contain records of the long-term behavior of faults. In addition, earthquake-related ground failure may disturb the archaeological record. The challenge for archaeoseismic research is to decipher the evidence of ancient earthquakes within this archaeological-geological context. This session explores the records of earthquakes at archaeological sites and methods for quantifying seismic hazard parameters from archaeoseismic data. Papers from around the globe, including Asia, Europe, the Middle East and Mediterranean, as well as the Americas, will be presented. This session is sponsored by International Geoscience Programme IGCP 567 "Earthquake Archaeology."
Tina M. Niemi
<niemit [at] umkc [dot] edu>
Klaus-G. Hinzen
<hinzen [at] uni-koeln [dot] de>
Martitia P. Tuttle
<mptuttle [at] earthlink [dot] net>
The use of paleoseismic studies for seismic hazard assessment in regions of low-to-moderate seismicity has become increasingly formalized. The focus of this session will be on new developments and applications of paleoseismic studies for assessing seismic hazard. Particular emphasis is placed on interdisciplinary collaboration among geologists, geotechnical engineers, and seismologists on paleoliquefaction studies.
Russell A. Green
<rugreen [at] vt [dot] edu>
Scott M. Olson
<olsons [at] uiuc [dot] edu>
In recent years, significant progress has been achieved in both observation and theoretical modeling of earthquake strong-ground motions. Unfortunately, the database of recorded near-fault ground motions from large to great earthquakes is still very limited, especially in regions where these types of earthquakes are most likely to occur. Recent advances in physics-based deterministic and stochastic ground motion modeling approaches have made it possible to simulate realistic broadband time series, which can be used to augment the sparse recorded data, as well as for engineering design applications. The aim of this session is to examine the status of different techniques used to generate broadband synthetic seismograms, and to document their limitations and usefulness.
This session will cover many aspects of broadband strong ground motion simulation, including kinematic and dynamic earthquake rupture characterization, deterministic ground motion simulation, and high frequency synthesis techniques. Papers will also be presented on criteria to quantify the goodness-of-fit between observed and synthetic seismic data, examples of applications to earthquake engineering problems and computational platforms that address end-to-end time series generation.
<lramirezguzman [at] usgs [dot] gov>
<mmoschetti [at] usgs [dot] gov>
<zeng [at] usgs [dot] gov>
<rwgraves [at] usgs [dot] gov>
Combining geodetic and seismic measurements provides the potential to expand the spectrum of signals that can be observed at a site, enhance our understanding of the earthquake process, and develop in-situ calibration methods for borehole instruments. The combination of tilt and seismic measurements in Japan, strainmeter recordings of Episodic Tremor and Slip strain pulses in Cascadia, and the measurement of slow earthquakes in central California are examples of the types of signals borehole instruments have captured well. Combining borehole geodetic, seismic, and GPS measurements would enhance our understanding of how strain transients develop. With higher sample rates, improved sensitivities, and the increasing number of integrated geodetic and seismic networks around the world, the task of combining the different data sets to obtain new insight into the temporal and spatial evolution of high- to ultra-low-frequency signals becomes a pressing challenge. In this session, we will present papers on combining GPS, seismic, and borehole geodetic measurements for the purposes of data processing, downhole instrument calibration, and modeling of geophysical signals. The session also covers several aspects of dealing with integrated networks, including installation, operation, and tasks associated with handling the large volumes of data generated by these networks.
<mencin [at] unavco [dot] org>
<hodgkinson [at] unavco [dot] org>
Charles A. Langston
<clangstn [at] memphis [dot] edu>
This session is comprised of a series of talks and posters related to seismic instrumentation, large scale seismic networks, methods for calibrating seismic arrays, and geodetic methods for analyzing seismic wavefields.
Earth structure observational studies, theoretical elastic and anelastic effects on wave propagation, and ground motion studies related to site classification.
Seismic hazard assessment is an inherently difficult problem because not all physical, geological, and statistical aspects of earthquake generation, nucleation, and occurrence are well understood. Hazard assessment employs many concepts that, even after decades of research, remain contentious. Invited speakers with strong positions will present their case on these issues:
- Should earthquake early warning be implemented and the public be educated for its use? Is early warning ready for prime time, or is its premature use creating new unsolved problems and even increasing the danger?
- Is testing and evaluating earthquake forecast models a worthwhile effort? Will the information gain of such efforts be large enough in the next years to decades that new constraints could be put on seismic hazard assessment? Is forecast testing a good way to evaluate hypotheses of earthquake behavior, or does it force earthquake information into unnatural formats?
<ds [at] usc [dot] edu>
David D. Jackson
<david.d.jackson [at] ucla [dot] edu>
<warner [dot] marzocchi [at] ingv [dot] it>
Matthew C. Gerstenberger
<m [dot] gerstenberger [at] gns [dot] cri [dot] nz>
Observational studies are presented that give clues to how and when earthquakes are triggered by natural and man-made stress changes, whether due to volcanic eruptions, fluid injection, or the propagation of large-amplitude seismic waves.
Episodic and Complex Behavior of Faulting and Seismicity in Continental Intraplate Regions - Implications for Seismic Hazard Maps
Recent geological and numerical modeling studies suggest that faulting and seismicity in continental intraplate regions may follow complex episodic patterns, with spatial migration along large geological structures and sporadic periods of activity followed by long periods of quiescence. The implications of such potential behavior on seismic hazard assessment are poorly understood, but could result in significant modifications to current hazard models. This session covers various aspects of the complexity of continental intraplate faulting through geological, seismic, geodetic, crustal rheology, and numerical modeling studies. Some presentations will compare and combine multi-disciplinary data while others explore the impact of complex and episodic seismicity on seismic hazard assessment.
<Mark [dot] Leonard [at] ga [dot] gov [dot] au>
<smazzotti [at] nrcan [dot] gc [dot] ca>
Geometry Effects in Ground Motion: Focusing, Scattering and Waveguides of Seismic Rays in the Near-surface
Increasingly, seismic hazard assessment and microzonation studies rely on wave motion simulations to estimate the distribution of expected ground motions. Observations from large earthquakes have shown that -- among other factors -- the presence of irregular geomorphic and geologic structures can significantly aggravate the catastrophic consequences of seismic motion as a result of preferential focusing, scattering, and/or trapping of seismic energy. Examples of such 'geometry'-driven effects include significant spatial variability of seismic intensity over small distances, increased duration of expected shaking, and altered frequency content, including amplification. In practice, however, typical seismic code provisions and microzonation studies do not account for these effects despite the documented evidence of their role in elevating seismic risk. The purpose of this session is to facilitate the dissemination of recent advances in the understanding, monitoring, modeling, and simulation of geometry-related effects in ground motion, and to focus attention on the implications of these effects in seismological research and engineering design. Specific topics include, but are not limited to, numerical-modeling methods that account for topographic effects, basin wedge effects, and near-surface waveguides or reflectors; observations of ray focusing and defocusing and their correlation with structural damage intensity and spatial distribution; and procedures to account for geometry effects in seismic hazard assessment and mitigation.
<dominic [at] gatech [dot] edu>
<yong [at] usgs [dot] gov>
<adrianrm [at] vt [dot] edu>
Papers in this session focus on observations, simulations, and back-analyses of geoengineering case studies from recent significant earthquakes, including the 12 January 2010 Mw 7.0 Haiti earthquake, the 27 February 2010 Mw 8.8 Chile earthquake, the 4 April 2010 Mw 7.2 Sierra El Mayor earthquake in Baja California, and the 3 September 2010 Mw 7.0 South Island of New Zealand earthquake. Multiple geotechnical engineering phenomena were documented during the post-event reconnaissance of these events, including liquefaction and lateral spreading, site effects, topographic effects, landslides, and failures of foundations and retaining structures. The focus of this session is the dissemination of case studies involving, for example, analyses of macroseismic observations, geotechnical investigations, mainshock and aftershock recordings, and remote sensing imagery, as well as numerical simulations and simplified analytical models, that can be used by the engineering and seismological communities to evaluate the effectiveness of established predictive models and design procedures and, accordingly, improve existing hazard prediction and mitigation strategies for future events.
<dominic [at] gatech [dot] edu>
<hough [at] gps [dot] Caltech [dot] edu>
Ground Motion Attenuation Modeling: Functional Form, Input Parameters, Standard Error, and Testing Criteria
Various ground motion attenuation models, often called ground motion prediction equations (GMPEs), have been developed in recent years for different tectonic environments. Because of their importance for engineering applications, dozens of papers in this area have recently been published in engineering and seismology journals and conference proceedings. Approaches to ground motion attenuation modeling vary significantly among developers, including differences in the approximation of the attenuation function (functional form), the selection of geophysical input parameters (fault parameters, travel path, Vs, basin depth, etc.), and the modeling of the standard error for prediction.
Presentations from a wide group of earthquake engineering professionals (model developers and users) will discuss the bases of approaches to ground motion attenuation model developments including:
- Approximation functions (functional forms),
- Set of model input parameters characterizing fault, path and site (we encourage participants to make a list or table of input parameters used in the model),
- Modeling standard error, including regions where observational data is limited,
- Testing and verification criteria.
This discussion is timely because of a number of completed and ongoing studies, such as the next generation attenuation projects NGA-West, NGA-West-2, and NGA-East, involving researchers and data from around the world.
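To make the modeling choices listed above concrete, the sketch below evaluates a generic, hypothetical GMPE functional form with the ingredients the session description names: magnitude scaling, magnitude-dependent near-source saturation, geometric spreading along the path, and a Vs30 site term. The coefficients and the specific functional form are illustrative assumptions, not taken from any published model.

```python
import math

# A minimal, hypothetical GMPE functional form. All coefficients in `c` are
# made up for illustration; a real model's coefficients come from regression
# against recorded ground motions.
def ln_pga(M, R_km, vs30, c=(-1.0, 1.2, -0.1, -1.3, 0.3, 0.5, -0.4, 760.0)):
    c0, c1, c2, c3, c4, c5, c6, vref = c
    near_sat = c4 * math.exp(c5 * (M - 6.0))       # near-source saturation term
    return (c0 + c1 * M + c2 * (M - 6.0) ** 2      # magnitude scaling
            + c3 * math.log(R_km + near_sat)       # geometric spreading / path
            + c6 * math.log(vs30 / vref))          # site term relative to a Vs30 reference
```

Differences among published models amount largely to different choices for these terms, the set of input parameters, and how the coefficients and standard error are fit to data, which is exactly the scope of the session topics above.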
<Vladimir [dot] Graizer [at] nrc [dot] gov>
<ccramer [at] memphis [dot] edu>
This session will cover issues related to ground motion scaling and selection. A design acceleration response spectrum is generally used by engineers for linear structural analysis of typical regular structures. The design spectrum for a given site is typically obtained from a uniform hazard spectrum (UHS). Whilst the starting point for seismic design may be a UHS produced by probabilistic seismic hazard analysis (PSHA), many researchers and practitioners now argue that the UHS itself is not an appropriate basis for design. It is preferable to disaggregate the hazard and then construct a scenario response spectrum, or even a conditional mean spectrum.
Time history analysis may be required when the nonlinear performance of a structure needs to be addressed. Instances that require time history analysis include very tall or long structures, complex buildings with extreme mass and/or geometric irregularities, structures with base isolation or supplementary damping devices, structures designed for high ductility demand, and particularly critical structures for which any damage has potentially far-reaching consequences in terms of safety. One way to obtain the needed time histories is to generate artificial ground motions that match the target design spectrum. An alternative approach is to select a suite of real ground motions from analogous past seismic events recorded in similar site conditions. However, there is little chance of finding real ground motions whose response spectra coincide with a desired response spectrum. This session will discuss recently developed procedures for ground motion scaling and selection.
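As a minimal sketch of the scaling step described above, the snippet below computes a single amplitude scale factor that best matches a recorded response spectrum to a target spectrum in a least-squares sense over the log spectral ordinates. The spectral values are made up for illustration.

```python
import math

# One common scaling scheme: a single multiplier on the record, chosen to
# minimize the mean squared error between the log spectral ordinates of the
# scaled record and the target spectrum.
def scale_factor(sa_record, sa_target):
    """sa_record, sa_target: spectral accelerations at the same set of periods."""
    log_ratios = [math.log(t / r) for r, t in zip(sa_record, sa_target)]
    return math.exp(sum(log_ratios) / len(log_ratios))

# Hypothetical spectral ordinates (in g) at four matching periods:
rec = [0.40, 0.35, 0.20, 0.10]
tgt = [0.50, 0.42, 0.25, 0.12]
s = scale_factor(rec, tgt)          # single amplitude multiplier for the record
scaled = [s * a for a in rec]       # scaled spectrum, closer to the target
```

More elaborate procedures weight periods near the structure's fundamental mode or condition on a scenario spectrum, but the basic amplitude-scaling idea is the same.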
<spezeshk [at] memphis [dot] edu>
The purpose of this session is to lay the foundation for a written guide to sustained seismic network operations. Sustaining the operations of a seismic network is a complex and challenging task. This challenge can be cast into four major elements: (1) establishing a base (or bases) of support; (2) network installation, operations, and maintenance; (3) routine data processing; and (4) information and data dissemination. The last element should feed back into the first, so that the supporting groups see the value in the network and its products. This session explores the practical issues in building a sustainable seismic network and presents case studies of sustainable network operations and experiences in various countries and situations. The true test of any network often comes during an earthquake or volcanic emergency. On the compressed time scale of a volcanic crisis or earthquake disaster recovery, it can be difficult to find a reliable and yet disinterested source of information. If a network fails to meet expectations in a crisis, long-term support could be endangered. The challenges of network operations in various emergency and post-emergency situations will be emphasized. The session also presents examples of advanced technologies and procedures that may be applied in future network operations.
The oral presentations will end with a discussion period on the contents of a guide to sustainable networks during which the views, opinions, and experiences of the speakers and audience will be most welcome.
Stephen D. Malone
<steve [at] ess [dot] washington [dot] edu>
John R. Filson
<jfilson [at] usgs [dot] gov>
An earthquake loss estimation model, developed with the vulnerability of the affected building stock at its center, depends on good, reliable data for its success. Since the advent of HAZUS, there have been considerable efforts within the U.S. and abroad to improve ways of capturing, classifying, and analyzing building stock inventories and their vulnerabilities. In addition, researchers around the world have explored social vulnerabilities, reflected in part in the consequences of building collapses for their occupants. Past disasters have clearly highlighted the need to improve data quality for effective catastrophe risk management. The aim of this session is to collate and draw on notable recent efforts in this research area, in particular from FEMA in its updates to HAZUS, the USGS PAGER initiative, EERI's World Housing Encyclopedia framework, and the Global Earthquake Model (GEM), among others. This session will highlight new approaches to improving inventory and vulnerability data, both physical and social, for use in earthquake loss modeling.
<kjaiswal [at] usgs [dot] gov>
<eso [at] usgs [dot] gov>
<douglas [dot] bausch [at] dhs [dot] gov>
<helen [dot] crowley [at] globalquakemodel [dot] org>
Integrating Geodynamic, Structure and Deformation Studies of the Seismogenic and Transition Zones in Subduction Zones and Other Margins
Subduction zones are a primary feature of the dynamic Earth that produce the largest earthquakes in the historic record as well as earthquakes ranging from simple to complex, extending to over 600 km depth. Despite the hazard implications, the processes that control seismogenesis have remained enigmatic. The growth of observational networks and computational capabilities over the past decade provide new opportunities to investigate subduction zone earthquakes and transitions in lithospheric and seismogenic behavior in greater detail. This session includes presentations that integrate observations of slip processes with geodynamic modeling and spatial variations in earth structure of subduction zones and other margins. A range of source process studies on recent great earthquakes, intermediate and deep earthquakes, and the new family of slow earthquakes and tremor will provide valuable contributions.
Michael R. Brudzinski
<brudzimr [at] muohio [dot] edu>
<hdeshon [at] memphis [dot] edu>
The occurrence of large earthquakes in continental plate interiors is poorly understood and not accounted for in current plate tectonic theory. Understanding the long-term behavior of faults is key to constraining models of seismogenesis and assessing earthquake hazards in intraplate regions. This session focuses on recent advances in deciphering and modeling the geological and geophysical records of fault behavior in intraplate settings and their implications for seismogenesis and earthquake hazard assessment. Papers will be presented in the fields of geomorphology, neotectonics, paleoseismology, geodesy, and high resolution imaging for regions including Australia, Europe, China, India, Mongolia, Canada, and the United States.
M. Beatrice Magnani
<mmagnani [at] memphis [dot] edu>
<mptuttle [at] usgs [dot] gov>
The Great Tohoku-Oki Earthquake of March 11, 2011 (Mw 9.0) was the largest earthquake ever recorded in Japan and one of the largest earthquakes to occur in the last century. The potential implications of this event for our understanding of great earthquakes, tsunamis, ground shaking and the many different responses to the ensuing destruction are unprecedented, partly due to the abundance of high quality data available. The Christchurch (Lyttelton) Earthquake of February 21, 2011 (Mw 6.3) also provided a wealth of information, especially regarding the impact of high intensity ground motion in an urban environment. We invite contributions that address scientific aspects of either of these two earthquakes. Potential topics include but are not limited to earthquake source modeling, tsunami observations, geodetic constraints, strong motion, tectonic setting, building response, implications for seismic hazards, and global earthquake triggering.
<tlay [at] ucsc [dot] edu>
<vtsai [at] post [dot] harvard [dot] edu>
<ghayes [at] usgs [dot] gov>
<jrubinstein [at] usgs [dot] gov>
Microzonation and urban seismic hazard mapping efforts have been at the forefront of interfacing geoscience and engineering with public and private policy and decision making. In the U.S., urban hazard mapping projects have been ongoing for a decade in Memphis, TN; Seattle, WA; Oakland, CA; St. Louis, MO-IL; Evansville, IN; and, most recently, the Salt Lake City, UT region and the Reno, NV metropolitan area. In Canada, Europe, and around the world, as well as in the U.S., microzonation projects have been conducted for decades. Many approaches and varying levels of detail have been employed in microzonation and urban hazard mapping. Efforts worldwide can benefit from reviewing what has been successful in terms of scientific approach, response by the local community to the hazard assessments, and post-assessment community land use changes, and from discussing future needs and directions in this field. Criteria for prioritizing future areas for detailed earthquake shaking and liquefaction hazard studies are also needed to help funding agencies make cost-effective decisions. This session will cover many aspects of microzonation and urban hazard mapping, with an emphasis on current efforts and future directions.
<ccramer [at] memphis [dot] edu>
<rawilliams [at] usgs [dot] gov>
A recent trend in Earth structure modeling at various scales is the multivariate inversion of traditionally distinct data sets for improved seismic structure modeling. Combinations of data sets used in these multivariate inversions have included resistivity and magnetotelluric data; receiver functions and surface wave dispersion observations; teleseismic or local travel times and gravity data; and surface wave velocity and gravity observations, among others. Although multiple geophysical observations have been successfully inverted jointly, many questions about means and methods remain unanswered. This session includes presentations on simultaneous and sequential multivariate inversion methods for improved seismic structure modeling. Some will highlight novel combinations of data sets, relationships between the independent observations, and the relative weighting of disparate data sets for successful inversion. Also included are results from reservoir to global scale and new means to address the computational efficiency and robustness of the inversion.
<mmaceira [at] lanl [dot] gov>
<hjzhang [at] mit [dot] edu>
<char [at] lanl [dot] gov>
The New Madrid Seismic Zone: Our Understanding on the 200th Anniversary of the New Madrid Earthquake Sequence
December 16, 2011, will mark the 200th anniversary of the beginning of the New Madrid earthquake sequence, a stunning cluster of three or four M7-8 earthquakes that occurred over a 54-day time span in the sparsely populated Louisiana Territory, about 240 km south of St. Louis. Ground shaking from these mainshocks was felt over the entire eastern United States and awakened people in the middle of the night up to 1400 km from the epicenters. The aftershock sequence was also impressive, with estimates of several M5-M6 earthquakes and hundreds of M3-M4 events lasting for months after the mainshocks. Research over the past three decades has shown that this sequence is not an isolated event and that the New Madrid region has produced sequences of major earthquakes in 1450 A.D. and 900 A.D. These prehistoric earthquakes caused severe and widespread ground failures much like those caused by the 1811-1812 earthquake sequence. Studies have also shown that large Holocene earthquakes have occurred within the Mississippi embayment but outside the current main microseismicity trends. To many researchers in the region, the New Madrid seismic zone poses a tremendous seismic hazard to the entire central U.S.; a repeat of this sequence would expose some 1 million people to Mercalli Intensity VII effects. Some researchers, however, have argued for lower seismic hazard and a lower likelihood of repeats of an 1811-1812-type sequence.
In a series of invited summary talks, this session showcases our state of knowledge of the New Madrid seismic zone, with a focus on studies of the fault zone including the extensive sand-blow area, crustal structure, personal accounts of the ground shaking, the enigmatic causes of earthquakes in this intraplate setting, and evidence for other faults surrounding the main seismicity trends that appear to have been active during the Quaternary. Contributed papers describe recent research bearing on assessments of the hazard and the risk posed by the fault zone.
<schweig [at] usgs [dot] gov>
<rawilliams [at] usgs [dot] gov>
The work of building integrative (in data and physical processes), reproducible, and internally consistent seismic hazard assessment methods seems daunting. This session gathers people who have already developed parts of the puzzle.
Many probabilistic approaches are covered. Bayesian methods, in particular, seem well suited to the task because data can be explained using generative models (physics-based or not) and, given proper priors, the plausibility of models can be compared, leading to model selection or combination. Expert judgment can therefore move "upstream" from the model combination step to, for example, the choice of priors, leading to a much-needed reproducibility of the resulting hazard products.
This session will deal with the probabilistic treatment of the ingredients of hazard computations, e.g., GPS deformation models, past fault slip rates, fault geometry, seismotectonic zoning, rupture models of past earthquakes, fault and crust rheological models, earthquake simulators, scaling relationships (slip vs. length, etc.), frequency-magnitude scaling, the choice of renewal models using real or synthetic data, and the computation and choice of ground motion prediction equations, and so on. Contributions on 1) why it is hard to go probabilistic for certain types of work (and why it would be valuable to do so nonetheless), and 2) how to test the derived predictive models, are also included, as they will help identify areas that need more work.
<D [dot] Rhoades [at] gns [dot] cri [dot] nz>
<delphine [at] uevora [dot] pt>
This session asks how seismologists should communicate with the public, in addition to reviewing interdisciplinary efforts in forecasting earthquakes and the adoption of building codes. Two presentations are given on accessing seismological data products for engineering and scientific use.
Recent Advances in Understanding Scaling Characteristics: How Similar Are Small and Large Earthquakes?
Over the past two decades, the scaling of the earthquake rupture process, and in particular whether radiated energy behaves self-similarly among small and large earthquakes, has been a matter of at times vigorous debate; these characteristics are fundamental for enhancing our understanding of rupture physics. Since the ground-breaking work of Keiiti Aki in the late 1960s, the static scaling relation between seismic moment and some length scale characterizing the earthquake source (for which corner frequency is commonly used as a proxy) has been widely accepted, at least for earthquakes with magnitudes larger than about 3. On the other hand, the dynamic scaling of apparent stress with moment is still highly controversial throughout the entire magnitude range, from very small (M<0) to very large events (M 8 and above), and potential sources of bias, such as bandwidth limitations or attenuation corrections, have been invoked that significantly complicate the situation. Moreover, comparison between different studies is hampered by the fact that they often use different methods of analysis and/or wavefield components (i.e., direct or coda waves) and that many datasets employed so far have covered only relatively narrow magnitude ranges. At the same time, the very existence of both slow and super-shear earthquakes, for which compelling evidence has been provided, shows that rupture dynamics, at least for large events, must span a considerable range. The aim of this session is to provide an up-to-date overview of recent advances on the topic and to identify the challenges and routes to follow so that the seismological community may come closer to resolving this issue in the years to come.
This session covers earthquake scaling characteristics on all scales, from small mining-induced events up to giant subduction zone earthquakes, and in particular the relationship between static and dynamic scaling laws and their implications for the earthquake rupture process. We furthermore encourage contributions dealing with the effects of potential sources of bias; comparative studies applying different methods to estimate the relevant earthquake source parameters and their impact on the scaling relationships; and studies dealing with the energy budget of earthquakes or with events that may represent end-member cases of rupture dynamics, such as slow or super-shear earthquakes.
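The static scaling relation discussed above is commonly made concrete through the Brune point-source model, which links an observed corner frequency to a source radius and hence to a static stress drop. The sketch below chains the standard relations; the shear-wave speed and corner-frequency values are illustrative.

```python
import math

# Standard static source-parameter chain (Brune point-source model).
def moment_from_mw(mw):
    # Hanks & Kanamori moment magnitude relation, M0 in N*m
    return 10 ** (1.5 * mw + 9.1)

def brune_radius(fc_hz, beta_ms=3500.0):
    # Brune source radius from corner frequency fc and shear-wave speed beta
    return 2.34 * beta_ms / (2.0 * math.pi * fc_hz)

def stress_drop(m0_nm, radius_m):
    # Static stress drop for a circular crack, in Pa
    return (7.0 / 16.0) * m0_nm / radius_m ** 3

m0 = moment_from_mw(4.0)            # ~1.3e15 N*m for an Mw 4 event
r = brune_radius(2.0)               # ~650 m source radius for a 2 Hz corner
dsigma = stress_drop(m0, r)         # ~2 MPa, typical of tectonic earthquakes
```

Self-similarity in this framework means stress drop is independent of size, so moment scales as the inverse cube of corner frequency; testing whether that holds from M<0 to M 8 is precisely the debate the session addresses.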
<adrien [dot] oth [at] ecgs [dot] lu>
<kmayeda [at] yahoo [dot] com>
Seismic hazard evaluations are important to local, regional, and national policy and decision-making, regulation, and permitting for facility design and construction. Recent renewed interest in nuclear power generation has spurred updates to regional hazard models, particularly in eastern North America (ENA) with the EPRI/NRC/DOE seismic source characterization and NGA-East projects. The USGS national seismic hazard mapping program is planning to update the national model in 2012-13. This special session will focus on papers on recent and in-progress updates to regional hazard models worldwide, as well as policy implications and the communication of seismic hazard to the general public. The session will present aspects of recent updates in seismic source characterization and improvements in methodology. Regional seismic hazard evaluation projects can benefit from sharing experiences from around the world.
<ccramer [at] memphis [dot] edu>
<rclee [at] lanl [dot] gov>
<mpetersen [at] usgs [dot] gov>
Seismic imaging is a powerful tool for geophysicists to probe the Earth's interior. The demand for higher resolution and a broader range of applications is rapidly increasing. This session includes presentations on seismic imaging at various scales and in various application arenas, with special emphasis on recent advances and future directions. Examples include innovations and advances in 3D traveltime tomography, waveform tomography, receiver function mapping, surface wave inversion, and joint inversion of multiple geophysical observations. There will be case study papers using seismic imaging to solve real problems. Discussions of the pitfalls, limitations, and artifacts of common seismic imaging methods and potential remedies are also part of this session.
<youshun [at] mit [dot] edu>
<mbegnaud [at] lanl [dot] gov>
<pochengeophysics [at] gmail [dot] com>
Due to the increasing global demand for energy, more countries are planning to expand their existing nuclear power plant fleet or include nuclear power as an optional energy source. Seismic siting for nuclear power plants is of great interest to the seismic community because of perceptions related to the potential radionuclide release resulting from strong earthquakes and associated hazards. Adequately characterizing seismic hazard for nuclear power plants is extremely important to ensuring public safety. This session focuses on seismic siting studies for nuclear power plants, including seismic source characterization, seismic wave propagation, and site amplification, and provides an interactive forum for geoscientists from various disciplines to discuss seismic siting issues for nuclear power plants. Topics that will be covered include:
(1) Estimation of ground motion and its aleatory variability in stable continental regions.
(2) Characterization of seismic sources: estimation of maximum magnitude and recurrence and the treatment of uncertainty.
(3) Calculation of site response and its uncertainty.
(4) Engineering characterization of ground motion parameters, such as Cumulative Absolute Velocity.
(5) Seismic siting studies related to existing nuclear power plants and other critical facilities.
<yong [dot] li [at] nrc [dot] gov>
<laurel [dot] bauer [at] nrc [dot] gov>
A general session of oral and poster presentations pertaining to earthquake source inversion models, earthquake and exotic-source seismicity, magnitudes, and source mechanisms, with theoretical highlights.
Thursday - 14 April
Acoustic sensors are being added to all the Transportable Array seismic sites in USArray. This new dataset presents a game-changing opportunity to study the seismo-acoustic wavefield in unprecedented detail. Alongside this development, new local- and regional-scale seismo-acoustic networks have recently been deployed to study the seismo-acoustic wavefield from sources including volcanoes, explosions, and earthquakes. Building on these recent developments, this special session will focus on all aspects of seismo-acoustics, including source physics, propagation studies, and unique datasets and analysis techniques.
<sarrowsmith [at] gmail [dot] com>
<rww [at] lanl [dot] gov>
<bstump [at] smu [dot] edu>
In the wake of the 2010 Haiti and Chile earthquakes, significant strides have been made toward understanding the tectonics and associated hazards along strike-slip and convergent margins. Yet fundamental questions remain. The goal of this session is to bring together geological and geophysical results from a diverse set of studies, with the hope of developing a more comprehensive understanding of the geology as well as the processes that drive deformation and geohazards across these regions. Researchers from a broad set of sub-disciplines have contributed to this session, including those working in these regions on neotectonics, geodesy, heat flow, geomorphology, marine- and land-based geology, seismology, paleoseismology, and geohazards.
<matth [at] ig [dot] utexas [dot] edu>
<sean [at] ig [dot] utexas [dot] edu>
Large, potentially devastating earthquakes occur within the interiors of tectonic plates where geodetically measured surface strain rates are very low (e.g., the 1811-1812 M7-8 New Madrid and 2008 M7.9 Sichuan, China earthquakes). In some cases, the paleoseismic history may suggest a relatively high rate of strain accumulation which, if at odds with geodetic studies, can complicate estimates of seismic hazard. For example, geodetic measurements in the central U.S. may indicate very little overall surface deformation, an observation that may be incompatible with the relatively high frequency of large earthquakes in the New Madrid seismic zone. In this session, a series of presentations will explore surface deformation in analogous intraplate regions, whether observations in the New Madrid seismic zone are compatible with them, how variable strain rates can be over the earthquake cycle, and whether a steady buildup of strain responsible for repeated large earthquakes can be localized and partially masked from surface observations.
<olboyd [at] usgs [dot] gov>
<ecalais [at] purdue [dot] edu>
<langbein [at] usgs [dot] gov>
The U.S. Geological Survey (USGS) seeks input from colleagues in academia and government at the start of the planning process for the next decade of investments in its natural hazards mission area. The USGS recently announced that it is realigning its senior leadership structure with the themes in its science strategy, one of which is hazards. As part of the realignment, teams of scientists are developing implementation plans for each strategic theme, and this listening session will inform the hazards team.
<jones [at] usgs [dot] gov>
<bholmes [at] usgs [dot] gov>
Cutting-edge techniques in event location and source discrimination related to nuclear verification.