2016 Annual Meeting
The following 36 sessions are currently planned for the SSA 2016 Annual Meeting in Reno. Click any session name to view the schedule and abstracts for that session (to search for particular abstracts use the Abstracts Search page).
Faults generating very large earthquakes may appear as surface ruptures or may remain blind within the crust. The latter case was seen most recently in the 2015 Kathmandu earthquake in Nepal. It is important to identify seismogenic faults and to understand their nature and kinematics. Many faults have been associated with historical large earthquakes, while several others may be identified as having the potential to be reactivated within a recurrence interval of, say, 500-1000 years. Data generated through local and regional seismicity monitoring have helped identify many seismogenic faults. Paleoseismic studies can be used to constrain the amount of displacement on active faults and, in some cases, the recurrence interval between large events. Active faults may have varying geomorphic expressions in different tectonic environments, and mapping them is an important aspect of this work. Identification of active faults and understanding their kinematic behavior vis-a-vis seismicity has direct societal relevance in earthquake hazard reduction. The session invites all recent findings in active tectonic regions that focus on the study of active faults through seismicity, GPS, remote sensing, GIS, paleoseismology and tectonic geomorphology. Recently, a series of unusual earthquake phenomena, such as slow earthquakes, has come under investigation; these are earthquake-like events that release energy over a period of hours to months, rather than the seconds to minutes characteristic of a typical earthquake. These phenomena include deep episodic tremor, low-frequency earthquakes, very-low-frequency earthquakes and slow slip events. Each of these has been demonstrated to arise from shear slip, just as regular earthquakes do, but with longer characteristic durations and much less radiated seismic energy.
Slow seismic events may be useful for better understanding the subduction process and the generation of large earthquakes, especially in regions where stress is accumulating but no major earthquake has occurred for a long time and which may therefore be considered 'seismic gaps'. Many seismologists suspect the potential for large earthquakes in such regions. In this session, we welcome papers on slow earthquakes. For more typical rupture events, a wealth of information on the kinematics and dynamics of earthquakes is now available. For many recent large earthquakes, such information includes slip evolution, frictional properties during faulting, fault system geometry and the structure of fault zones. This provides the fundamental knowledge base with which to build models of the earthquake mechanism capable of simulating the earthquake generation process and cycles of earthquake activity. In this session, we welcome papers on earthquake simulation models as well as those aiming to provide insight into the kinematics and dynamics of earthquake generation and cycles. Seismic waves generated by earthquakes provide information about the earth's structure across a variety of scales, and seismic tomography is a promising tool for understanding heterogeneous structures. In this session, we also welcome papers on the earth's evolving crust and upper mantle structure.
Sushil Kumar <email@example.com>
Sudhir Rajaure <firstname.lastname@example.org>
Rama Sushil <email@example.com>
To perform the best science, we need high-quality and complete datasets. While there are standards for network operation and station installation, improvements, often at the local network scale, are happening all the time. This session will focus on sharing these local improvements with a wider audience. We encourage submissions featuring: 1) removing instrument response prior to data analysis; 2) running (and analyzing) regular calibrations network-wide; 3) vault/installation designs aimed at reducing long-period horizontal (tilt) noise; 4) improvements or enhancements to station installations that improve data recovery; and 5) lessons learned in network design and station installation, including case histories from the last few years.
Kristine Pankow <firstname.lastname@example.org>
David Wilson <email@example.com>
Recent advances in digital photogrammetry have pushed the bounds of the quality and quantity of surficial data that researchers can collect on a small budget. Using inexpensive or open source software, researchers can now quickly create high-resolution point clouds, surface models, and orthophotos for a given area of interest. This session addresses advances in earthquake science as a result of digital photogrammetric methods. We encourage abstracts focusing on innovative applications of structure-from-motion, pixel tracking, and other image-based techniques across all scales and tectonic settings. We also welcome studies addressing the accuracy and validity of these methods.
Nadine Reitman <firstname.lastname@example.org>
Kendra Johnson <email@example.com>
Lia Lajoie <firstname.lastname@example.org>
Site effects associated with near-surface geological conditions constitute an important part of any seismic hazard assessment. In situ measurements of shear-wave velocity (VS) are commonly used to evaluate the seismic response of sites under investigation. Traditional invasive approaches "directly" measure VS; however, these methods are often costly and (or) environmentally prohibitive. In comparison, non-invasive methods (using body and surface waves) are generally more cost-effective and, by nature, considered environmentally friendly. Thus, the appeal and use of non-invasive methods have significantly increased in recent years. Non-invasive methods, however, do not directly measure VS and have inherent uncertainties. The purpose of this session is to facilitate discussion of the state of practice in the application of non-invasive approaches for characterizing seismic site conditions and to disseminate recent advances. Topics of interest include, but are not limited to: evaluation of forward algorithms, in particular for cases of velocity inversions; the effects of lateral velocity variation; non-uniform impedance contrasts (e.g., dipping or undulating bedrock subsurface); uncertainty analyses, including the effects of data quality and the quantification and (or) propagation of data uncertainty; model parameterization; effective- and multi-mode analyses; non-uniqueness of inversions; and advanced inversion methods (joint, 2- and 3-D, nonlinear, Monte Carlo, etc.). Presentations on applying passive seismic interferometry, on comparing different techniques on data from the same site, and on integrating a variety of datasets are encouraged. The session conveners are members of a facilitation committee supported by the Consortium of Organizations for Strong Motion Observation Systems (COSMOS) to develop international guidelines for applying non-invasive geophysical techniques to characterize seismic site conditions.
This Seismological Society of America session was developed by the COSMOS committee in coordination with the conveners of the Site Effects special session of the 35th General Assembly of the European Seismological Commission, to be held 5–9 September 2016 in Trieste, Italy. COSMOS Facilitation Committee for Development of the International Guidelines for the Application of Non-invasive Geophysical Techniques to Characterize Seismic Site Conditions: Yong, Alan (Chair); Molnar, Sheri; Askan, Aysegul; Xia, Jianghai; Cassidy, John; Lawrence, Martin; Parolai, Stefano; Bindi, Dino; Wotherspoon, Liam; Mucciarelli, Marco; Albarello, Dario; Steidl, Jamison; Nigbor, Robert; Stephenson, William; Foti, Sebastiano; Socco, Laura; Cornou, Cécile; Bard, Pierre-Yves; Hollender, Fabrice; Asten, Michael; Yilmaz, Öz; Crow, Heather; Matsushima, Shinichi; Yamanaka, Hiroaki.
Alan Yong <email@example.com>
Sheri Molnar <firstname.lastname@example.org>
Aysegul Askan <email@example.com>
Over the past few years, several large-magnitude earthquakes struck Indonesia, Chile, and Japan as a result of major subduction-zone activity, resulting in long-duration excitations. Some stations in the 2011 Tohoku earthquake, for instance, recorded ground motions that lasted almost four minutes (up to ~250 seconds). Severe structural damage to the built environment and infrastructure was observed due to such devastating events. The objective of this session is to present the latest research on the structural damage and causes of failure associated with long-duration earthquakes in different sectors of the built environment, including buildings, bridges, pipelines, and power transmission and distribution lines. Topics of particular interest are: case studies and failure analysis of concrete, steel, and masonry structures damaged during the recent Chile and Japan earthquakes; analytical and computational studies that utilized the published long-duration records from recent subduction earthquakes; and comparative performance of structures under short- and long-duration earthquakes. Relevant work considering the application of recorded or synthetic long-duration ground motions to structural performance and seismic behavior is also encouraged.
Mohamed Moustafa <firstname.lastname@example.org>
Accurate estimation of the state of stress and pore fluid pressure within a region is increasingly important due to their role in earthquake nucleation and propagation. The main purpose of this session is to bring together researchers focused on the characterization of the stress field and its relevance to earthquake source physics, seismic hazard and/or reservoir geomechanics. Of particular interest is linking seismological estimates of the stress field and pore pressure with state-of-the-art modelling techniques. Studies of both natural and anthropogenic seismicity are encouraged. Contributions may address (but are not limited to): 1) new methodologies to improve characterization of stress field orientation, stress magnitudes and pore pressure; 2) analysis of the main factors perturbing the stress field (e.g. large earthquakes, anthropogenic activities, temperature changes), the implications of these perturbations and their spatio-temporal extent; 3) how the orientation of the main regional faults with respect to the local/regional stress field affects seismic hazard; 4) stress heterogeneity at different scales and its potential relation to earthquake source physics (e.g. stress drop, b-value); 5) characterization of the stress field at global, regional (e.g. near large faults or subduction zones) and local scales (e.g. in relation to man-made perturbations); and 6) the role of the background stress field and pore pressure in earthquake triggering via stress redistribution.
Patricia Martínez-Garzón <email@example.com>
Jeanne Hardebeck <firstname.lastname@example.org>
Marco Bohnhoff <email@example.com>
Karen Luttrell <firstname.lastname@example.org>
When earthquakes strike, they can instantly focus millions of people's attention on a single, common event. In times like these, people are compelled to share their experiences and help others. Almost since the inception of seismology, scientists have used these shared observations to advance earthquake science. The widespread use of the Internet, social media, and mobile-phone-based applications has increased the access, speed, and volume of this information, opening up new data sources both for rapid assessment of an earthquake's effects and for further in-depth study. Citizens provide information to seismologists directly by hosting seismometers and by filling in online forms about firsthand experiences, both of which are used to create derived information products. In addition, devices such as inexpensive MEMS sensors, attached to desktop computers or embedded in mobile devices, hold the possibility of greatly increasing monitoring coverage and public participation. Citizens also help seismologists indirectly: their use of social media and the heavy traffic they create at earthquake websites are monitored by seismologists to produce rapid earthquake detections and to gather early reports of damage and impact. We invite contributors to share how they are tapping into citizen-sourced data streams and integrating this information into their science, both for real-time response and for historical studies. We also encourage contributions on ways to better engage with citizens to improve our science and enable better use of earthquake information products.
Michelle Guy <email@example.com>
Remy Bossu <firstname.lastname@example.org>
Assessing and mapping seismic shaking hazards requires estimating site response, perhaps including potential soil-structure interaction. Whether there is an abundance or a scarcity of geostatistical and seismological analyses, the effects of deep and shallow structures add complexity to the task. This session pulls together a variety of innovative data sets and approaches that gain some control over the complexity.
John Louie <email@example.com>
About 25% of global seismicity occurs below 70 km depth, mostly within subducted slabs. The rupture mechanisms of these earthquakes under high confining pressures and temperatures are still enigmatic. To make significant progress, we need to understand both the source properties and the hosting slab environment (e.g., pressure, temperature, mineral phase). For example, what are the variations and controlling factors of deep rupture properties (e.g., rupture speed, stress drop)? Where do deep earthquakes nucleate and rupture with respect to the internal structure of the subducted slab? Does the subducted slab provide the ingredients required by the proposed rupture mechanisms (e.g., dehydration embrittlement, transformational faulting, and thermal shear instability)? Seismic images of deep earthquakes and slab structures are continuously improving with new data and methods. In this session, we would like to bring these two aspects together to shed more light on deep earthquake physics. We welcome contributions on seismic observations, laboratory experiments, and numerical simulations related to deep earthquakes.
Zhongwen Zhan <firstname.lastname@example.org>
Meghan S. Miller <email@example.com>
Germán Prieto <firstname.lastname@example.org>
This session focuses on recent and future efforts to design and implement earthquake early warning systems worldwide. Submissions addressing the variety of earthquake detection algorithms, network design for robust data delivery, and outreach efforts are encouraged. Submissions on multi-hazard networks, such as those addressing early ignition detection for wildfire or monitoring of extreme weather, would also benefit this session.
Graham Kent <email@example.com>
Ken Smith <firstname.lastname@example.org>
Earthquake Source Parameters and Slip from Seismic, Geodetic and Laboratory Data: Theory, Observations and Interpretations
Understanding the origin and spatio-temporal evolution of seismicity and deformation requires careful quantitative analysis of earthquake source parameters for large sets of earthquakes in the seismic sequences under study. Determining focal mechanisms, seismic moment tensors, slip distributions, radiated energy, aseismic deformation and other earthquake source properties, and establishing their mutual relations, can give insight into tectonic stress and crustal strength in the area under study, material properties and the prevailing fracturing mode (shear/tensile) in the focal zone, and other details of the earthquake source process. Studying the relationship of static and dynamic parameters to earthquake size is essential to understanding the self-similarity of earthquakes and their scaling laws, and also helps improve ground motion prediction equations. The session focuses on methodological and observational aspects of estimating source parameters of natural or induced earthquakes on various scales, from large to small earthquakes, including laboratory experiments, which provide an opportunity to analyze seismic sources under partially controlled conditions. Presentations of new approaches to determining focal mechanisms, seismic moment tensors and other source parameters, as well as interpretations of source parameters for sets of earthquakes in case studies, are welcome. Equally, contributions about the self-similarity of earthquakes down to very small ruptures and about scaling relations for static and dynamic parameters of earthquakes are invited.
Vaclav Vavrycuk <email@example.com>
Grzegorz Kwiatek <firstname.lastname@example.org>
German Prieto <email@example.com>
The spatial correlation structure of earthquake ground motions is widely recognized as a manifestation of the complexity of the wavefield and as exerting substantial influence on losses to distributed inventories (portfolios) of structures and infrastructure. Different strategies for representing and understanding spatial variability are employed, depending on needs and circumstances, ranging from purely empirical parametric analyses of intensity measures (IMs) to full-waveform numerical simulations. To date, forward calculations of the site-to-site IM correlations needed for loss estimation and risk analyses are often (and presumably successfully) represented with empirical spatial correlation functions. As the sophistication of ground motion modeling increases, variability previously considered random becomes attributable to knowable source, path and site effects. In principle, then, the spatial correlation of the stochastic component of the wavefield must be adjusted to be commensurate with the sophistication of the ground motion model, which is critically important for seismic hazard analysis. We encourage submissions focusing on empirical and numerical calibration of spatial variability, analyses of the physical factors that control variability, and strategies for including the effects of variability in loss calculations.
David Wald <firstname.lastname@example.org>
Kim Olsen <email@example.com>
Jack Baker <firstname.lastname@example.org>
Paolo Bazzurro <email@example.com>
This session seeks to explore how seismology can contribute to the question, "How close are we to an eruption?" As geophysical networks on volcanoes expand throughout the world, we are becoming better and better at detecting unrest. While such detection is useful in determining that the volcanic hazard has increased, the question of eruption timing remains, and it is this question that matters most to impacted populations and government officials. We invite submissions that help shed light on new or repurposed techniques that may lead to a better sense of the "critical" state of a volcanic system and when it might erupt or escalate in activity.
Weston Thelen <firstname.lastname@example.org>
Matthew Haney <email@example.com>
Over the past decades, induced and triggered seismicity associated with anthropogenic activity has attracted considerable interest among citizens and scientists in Europe and North America. Understanding this seismicity is a key challenge for many applications. There is evidence that hydraulic fracturing (HF) is a dominant driver of fluid-induced seismicity in some areas. Although the general physical mechanisms of injection-induced seismicity are known, the specific conditions that lead to earthquake triggering up to Mw 5.6 in the case of wastewater injection and ML 4.6 for HF are not as well understood. Seismicity is also observed in connection with geothermal energy production and carbon sequestration. Key questions include: What controls the dynamics of induced earthquakes, the maximum magnitude and the total amount of seismic energy released? How do induced earthquakes start, and do they differ from tectonic slip processes? What methods can be used to distinguish natural from induced earthquakes? The answers to these questions have implications for seismic hazard assessment and may help guide mitigation approaches. We invite numerical, observational and laboratory contributions from a variety of research areas such as geomechanical modeling, ground motion observations, source characterization studies, and seismic hazard assessment. Papers focusing on industry practices and public outreach are welcome, to better inform the research community about recent developments.
Thomas Braun <firstname.lastname@example.org>
Ivan G. Wong <Ivan.Wong@AECOM.com>
Justin Rubinstein <email@example.com>
Thomas Goebel <firstname.lastname@example.org>
David Eaton <email@example.com>
Gail Atkinson <firstname.lastname@example.org>
Honn Kao <email@example.com>
There is now heightened concern among regulators and the general public over the real or perceived impact of unconventional oil and gas production on local seismicity. In Alberta and British Columbia, where hydraulic fracturing operations are believed to have caused a significant increase in the number of induced earthquakes, regulations have been introduced that require operators to monitor all seismic activity, to report any event of local magnitude greater than 2, and to stop operations for events greater than magnitude 4. Many other jurisdictions, including Ohio, Australia and the Netherlands, have either banned fracking or introduced new regulations, often carrying steep fines should oil and gas producers cause damage or fail to report potentially damaging seismic activity. It is important to understand the state of induced-seismicity monitoring technology vis-à-vis the regulatory requirements to monitor and report. Topics of interest include: (1) the various technologies to monitor induced seismicity; (2) methods to distinguish natural from induced earthquakes; (3) the current regulations on monitoring and reporting seismic activity; (4) the current traffic-light reporting systems used in many jurisdictions; and (5) how to estimate magnitude, distance and resolution. We invite papers on all forms of induced seismicity monitoring, the advancing technologies and the underlying regulatory requirements. Papers from industry and governmental agencies are particularly welcome, to help develop a better framework for induced-seismicity monitoring in the oil and gas industry.
Iain Weir-Jones <firstname.lastname@example.org>
Steven Taylor <email@example.com>
The most significant earthquake in half a century in the Cook Inlet region of Alaska occurred on January 24, 2016. The Mw 7.1 earthquake occurred in the subducting Pacific Plate at a depth of 123 km beneath the Iniskin Peninsula. The earthquake was felt throughout mainland Alaska and caused isolated damage in towns 100 km away on the Kenai Peninsula. Most damage was caused by slumping soils and a ruptured natural gas line, with minor damage attributed directly to the shaking. Moment tensor solutions indicate a minimum stress direction roughly parallel to slab dip. Aftershocks map out a near-vertical fault plane striking northeast. These orientations are consistent with normal faulting in the Pacific Plate, reflecting relative tension in the direction of slab dip. The earthquake occurred in a region of prodigious intermediate-depth seismicity where the slab bends to accommodate flat-slab subduction immediately to the northeast. The aftershock region abuts that of a highly similar M 6.4 earthquake that occurred six months prior, strongly suggesting a relationship. Finite fault modeling suggests slip of 2-3 m on a patch 20-30 km in diameter, consistent with the aftershock zone. Because of the depth and orientation of the rupture, static displacements are largest 100 km east on the Kenai Peninsula. Instrumental and anecdotal reports of ground motion were highly variable and consistent with the directionality of the source. Felt reports are limited in the immediate vicinity of the epicenter, but seem to confirm the instrumental record of relatively modest ground motion directly above the source. The overlying Cook Inlet sedimentary basin profoundly influenced ground motions to the east and north of the epicenter. This is the strongest earthquake recorded to date by the cooperative free-field and structural strong-motion stations in Anchorage, 250 km northeast of the epicenter. Recorded peak ground accelerations in Anchorage varied from 3-15% g across distances of just 1-2 km.
In-structure accelerations exceeded 18% g. Several of the seismic stations closest to the source had been installed just a few months prior as part of the EarthScope USArray project. The earthquake provided a sensitivity test for new posthole instrumentation and installation methods.
Mike West <firstname.lastname@example.org>
This session focuses on the application of machine learning techniques to the analysis of seismic, infrasound, hydro-acoustic, remote sensing, electromagnetic, and other signals. The goals of machine learning applied to these signals include improved signal and event detection, phase identification, event discrimination, signal association, explosive yield estimation, and general signal characterization.
Timothy Draelos <email@example.com>
New geophysical approaches have provided unprecedented images of subseafloor fault systems. These data, together with targeted sampling strategies employing AUVs and ROVs as well as more traditional shipboard coring efforts, place important constraints on fault history. We encourage submissions that present new geological and geophysical data on marine fault systems, specifically new constraints on slip rates, most recent event (MRE), and recurrence interval. Despite recent efforts offshore, marine seismological hazards remain poorly constrained, and future efforts need to marshal our resources to better understand these marine fault systems and the attendant hazard.
Neal Driscoll <email@example.com>
To accurately predict the far-field observations used in explosion monitoring, it is necessary first to understand signal generation by explosion sources and then to effectively relate those generated signals to remote, far-field observations. Incorporating near-field observations, comprehensive material properties, and subsurface imaging into first-principles calculations is critical to that first step. At the same time, testing of models by comparison with both near- and far-field predictions is necessary for refining the models and validating hypotheses concerning explosion signal generation. Quantification of uncertainties due to propagation is essential to understanding the constraints that far-field observations can place on generation, and quantification of uncertainties due to generation, propagation, and measurement is essential to the development of practical monitoring tools. We invite contributions from all research focus areas and technologies that meld the efforts described above.
Catherine Snelson <firstname.lastname@example.org>
Christopher Bradley <email@example.com>
G. Eli Baker <firstname.lastname@example.org>
Multidisciplinary Studies of Earthquakes - Slow, Fast, and In Between: A Broad Range of Fault Behavior in Space and Time
Faults show a variety of slip behaviors over a range of spatiotemporal scales and frictional regimes. These include large damaging earthquakes in the seismogenic zone, slow earthquakes at the edges of the seismogenic zone, tsunamigenic earthquakes near the subduction trench, and so on. How they influence and interact with each other, however, remains enigmatic. There are indications that slow slip precedes large damaging megathrust earthquakes in some cases, that regular fast earthquakes result in changes in the behavior of slow earthquakes, and that repeated slow earthquakes load the updip seismogenic zone that nucleates large destructive earthquakes. While these findings are compelling, the underlying physics is poorly understood. The controls of fault rheology on different modes of fault slip over a range of pressures and temperatures, and their effects on seismic cycles, are largely unknown. I invite studies aiming to understand the broad nature of fault slip and its implications for seismic hazard based on theory, observation, modeling, field work and/or laboratory experiments. In-depth studies focusing on the mechanisms of diverse modes of fault slip, including but not limited to slow slip, tremor, swarms, and repeating earthquakes in all tectonic settings, are welcome. I encourage holistic studies involving a wide spectrum of fault slip behavior.
Abhijit Ghosh <email@example.com>
Near Field and Directivity Considerations in Developing Fault Normal and Fault Parallel Spectra and Selecting and Scaling Time Histories for Nonlinear Analysis
Building codes for new and existing buildings require that, for sites within 5 km of a fault (near field), both fault-normal (FN) and fault-parallel (FP) spectra be developed and that time histories be rotated to FN and FP components. However, beyond the permissibility in ASCE 41-13 of taking the maximum-direction spectrum as the FN component and the geomean spectrum as the FP spectrum, there are no particular guidelines on how to develop the FN and FP spectra. In addition, there are three significant issues related to ground motions in near-field situations: 1) forward directivity; 2) ground motion orientation (FN and FP); and 3) velocity pulse effects. Forward directivity effects have been observed at distances up to 25 to 30 km, and both velocity pulse effects and differences between FN and FP motions have been observed at distances greater than 5 km. It has also been observed that FN motions may not represent the larger component, as previously thought. To obtain true FN and FP motions, time histories should be rotated to the major and minor principal axes. This session will entertain presentations and posters describing proposed procedures for developing FN and FP spectra, and recent developments in incorporating near-field and directivity effects into the selection and subsequent scaling/spectral matching of time histories for nonlinear analysis.
Zia Zafir <firstname.lastname@example.org>
NGA-East: Research Results and Ground-Motion Characterization Products for Central and Eastern North America
The Next Generation Attenuation project for Central and Eastern North America (NGA-East) is a major multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER). NGA-East was tasked with developing a new ground-motion characterization (GMC) model for the Central and Eastern North America (CENA) region. The GMC model consists of a set of ground motion models (GMMs, aka GMPEs) for the median and standard deviation of ground motions and their associated weights, combined into logic trees for use in probabilistic seismic hazard analyses (PSHA). NGA-East tackled the complex task of characterizing ground motions with a limited recording dataset. One key strategy was to rely on ground-motion simulations to supplement the available data. Important scientific issues were addressed through targeted research projects on topics such as the regionalization of seismic source, path and attenuation of motions, the treatment of variability and uncertainties, and the evaluation of site effects. Seven working groups were formed to cover the complexity and breadth of topics in the NGA-East project, each focused on a specific technical area. Over its multi-year span, NGA-East developed a comprehensive database of ground-motion records, numerous interim products for the quantification of ground motions in CENA, and new GMMs. This session aims to cover the recent scientific advances and ground-motion products that emerged from NGA-East.
Christine Goulet <email@example.com>
Yousef Bozorgnia <firstname.lastname@example.org>
Continuous development of numerical modeling methodology in seismology is not only driven by emerging requirements in observational seismology (e.g., the advent of very dense seismic arrays; demand for near-real-time simulations; the multi-scale, multi-physics modeling of seismic phenomena; etc.), but also by developments in the mathematical sciences, and through the adaptation of methods originating in other scientific fields. Moreover, future methods for very large scale simulations will be increasingly influenced by (and may in turn influence) the evolution of computer architectures and programming models. This session is a forum for presenting advances in numerical methodology, whether the principal context is observational, mathematical/numerical, or computational. We invite contributions focused on development, verification and validation of numerical-modeling methods, and methodologically important applications, especially to earthquake ground motion and rupture dynamics (both single-event and event-sequence models). Contributions on the analysis of methods, fast algorithms, high-performance implementations, large-scale simulations, non-linear behavior, multi-scale problems, and confrontation of methods with data are especially encouraged.
Peter Moczo <email@example.com>
Steven Day <firstname.lastname@example.org>
Emmanuel Chaljub <Emmanuel.Chaljub@ujf-grenoble.fr>
Jozef Kristek <email@example.com>
Reliable estimates of past earthquake moment release and its projection into the future are important from a theoretical perspective as well as for immediate practical use in earthquake hazard assessments. A long-standing issue is the typically large discrepancy between geodetic moment rate estimates and observed rates of seismic moment release reported in various tectonic environments with high seismic activity. Another question that has received attention is whether the large moment release of the last decade is a statistically significant deviation from the long-term expectation. This session offers a forum for an interdisciplinary dialog among experts in geodesy, seismology, and statistics to revisit these issues and address the following questions: How reliable are geodetic moment rate estimates, given the unknowns in the strain-to-moment rate conversion? How can we construct robust projections of seismic moment release over different time spans (years to hundreds of years) and spatial resolutions? How do existing estimates of geodetic and seismic moment release relate in different regions? We also welcome theoretical and observational studies that address closely related problems: estimating the regional maximum magnitude, testing the hypothesis of clustering of the largest earthquakes, developing approaches to assess seismic coupling, and more.
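For context, the strain-to-moment rate conversion mentioned above is commonly done with a Kostrov-type relation, here in the Savage and Simpson (1997) form. The parameter values below are purely illustrative assumptions:

```python
import numpy as np

# Geodetic moment rate from principal strain rates (Savage & Simpson, 1997):
#   Mdot0 = 2 * mu * H * A * max(|e1|, |e2|, |e1 + e2|)
mu = 3.0e10             # shear modulus, Pa (assumed)
H = 15.0e3              # seismogenic thickness, m (assumed)
A = 1.0e10              # cell area, m^2 (100 km x 100 km)
e1, e2 = 50e-9, -10e-9  # principal strain rates, 1/yr (hypothetical)

mdot0 = 2.0 * mu * H * A * max(abs(e1), abs(e2), abs(e1 + e2))  # N*m per yr

# Equivalent moment magnitude if that budget were released in one event per year:
mw = (2.0 / 3.0) * (np.log10(mdot0) - 9.05)
```

The session question about reliability enters through every factor here: mu, H, and the choice of strain-rate invariant all carry substantial uncertainty that maps directly into the inferred moment rate.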
Corné Kreemer <firstname.lastname@example.org>
Ilya Zaliapin <email@example.com>
Physical and Statistical Properties of Earthquake Swarms and Clustered Seismicity: Constraining Driving Mechanisms
Earthquake swarms, seismicity sequences clustered in space and time that lack a clear mainshock, are poorly understood phenomena. Swarms are less well studied than mainshock-aftershock sequences, perhaps due to moderate (~M5) maximum magnitudes that cause relatively little damage. However, swarms near population centers generate public interest in complex earthquake processes and potentially pose a higher short-term hazard. The recent surge in induced seismicity has ignited research on moderate earthquake swarms, with particular focus on fluid diffusion and migration as the driving mechanism. We invite contributions that characterize physical and statistical properties of earthquake swarms and clustered seismicity and address the following questions: What mechanisms control the size and occurrence of moderate earthquake swarms? What criteria identify fluid diffusion, or distinguish it from other mechanisms? Do the inherent physical and statistical properties of a swarm, and the earthquakes therein, vary with driving mechanism? Submissions from observational seismology, clustering and earthquake statistics, earthquake source physics, and earthquake triggering and interaction studies are welcome.
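One common screen for fluid diffusion as a driving mechanism is the parabolic triggering front of Shapiro et al. (1997). A minimal sketch follows, assuming event times are measured from injection onset and distances from the injection point (the screening function is an illustrative construction, not a standard test):

```python
import numpy as np

def triggering_front(t_s, diffusivity):
    """Parabolic triggering-front envelope of Shapiro et al. (1997):
    r(t) = sqrt(4*pi*D*t), with D the hydraulic diffusivity (m^2/s)
    and t the time since injection onset (s)."""
    return np.sqrt(4.0 * np.pi * diffusivity * t_s)

def fits_diffusion_front(times_s, dists_m, diffusivity):
    """Fraction of events lying inside the front for a trial D -- a
    crude screen for fluid-diffusion-driven migration."""
    inside = dists_m <= triggering_front(times_s, diffusivity)
    return inside.mean()
```

For D = 1 m^2/s the front reaches roughly 1 km after one day; a swarm whose hypocenters stay inside such an envelope is consistent with (though not proof of) diffusive triggering.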
Christine Ruhl <firstname.lastname@example.org>
Ilya Zaliapin <email@example.com>
Rachel Abercrombie <firstname.lastname@example.org>
In response to the occurrence of relatively large (and felt) earthquakes that are potentially induced by man-made activities, there is an increasing trend for industry and government regulators to include a “traffic light” system in their decision-making process. Despite its significant implications for the cost of operations and the protection of public safety, the protocol that defines the scenarios for the different lights (“green”, “yellow”, or “red”) has not been thoroughly validated to truly reflect the associated seismic risk. Most government regulators adopt a traffic light protocol (TLP) that depends on both local community reports and the magnitude of the earthquake, despite significant scatter and uncertainty in local magnitude calculations. Contributions are invited on all topics related to improving regulatory performance for induced seismicity, including hazard analysis, innovations, and mitigation measures.
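For illustration only, a magnitude-and-felt-report TLP of the kind described might be coded as follows; the thresholds are hypothetical and are not those of any actual regulator:

```python
def traffic_light(local_magnitude, felt_reports):
    """Toy traffic-light protocol (TLP). Thresholds are purely
    illustrative placeholders, not regulatory values."""
    if local_magnitude >= 4.0:
        return "red"      # suspend operations
    if local_magnitude >= 2.0 or felt_reports > 0:
        return "yellow"   # modify operations, heighten monitoring
    return "green"        # proceed as permitted
```

The session's point is visible even in this toy: the decision hinges entirely on a local magnitude (with its large uncertainties) and on felt reports, with no direct link to a quantified seismic risk.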
David Eaton <email@example.com>
Honn Kao <firstname.lastname@example.org>
Gail Atkinson <Gmatkinson@aol.com>
In this session, we invite professionals working in or around risk management, or the science support teams thereof, to present which aspects of earthquake seismology they currently use, wish they could get access to, or wish they had truly quantitative estimates of. The academic community needs to learn from you when a qualitative picture is all you need, versus expected values at a point, probability density functions at a point, or joint probabilities along lifelines or across a region. The properties of interest might include, e.g., surface fault displacement during an event of some probability of occurrence, surface deformation due to the same event, ground motion, liquefaction, or hypocenter location. It would be helpful to set the context for the use of these properties, whether for decision-making procedures or to assess needs for risk transfer. We welcome all underwriters, actuaries (or their science support teams), investors (in catastrophe bonds), members of urban resiliency teams, emergency managers, engineers, etc. We encourage the presentation of actual processes or case studies where a currently unmet earthquake science need was identified, or success stories where the need was met and integrated into a strategy or process. What format was used to transfer science information? We also welcome members of the academic community who want to present new datasets, formerly not available, which they think could start playing a major role in any one of the many facets of risk management.
Delphine Fitzenz <email@example.com>
Nico Luco <firstname.lastname@example.org>
Resolving the spatial and temporal distribution of shallow fault slip is difficult because only the nearest-field data are sensitive to shallow slip. Complicating these analyses is the fact that near-field data are rare: seismic and GPS station spacing is often too sparse, and interferograms tend to decorrelate near the rupture zone. But new datasets (including LiDAR, UAVSAR, new SAR satellites with shorter repeat times, and various types of optical imagery) are providing unprecedented, spatially dense, near-field observations. These new observations are enabling a new generation of high-resolution shallow slip models which, in turn, are leading to new discoveries about the rheology and evolving deformation of fault zones. We invite presentations on innovative data sources and analyses (including methods for integrating near-field data that image the shallow parts of the fault with far-field data that are sensitive to deeper slip), advances in modeling methods, and case studies of shallow slip before, during, and after earthquakes.
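Many slip models of the kind described rest on regularized linear inversion. A minimal sketch, in which the Green's matrix G and the damping parameter are placeholder assumptions rather than any published model setup:

```python
import numpy as np

def invert_slip(G, d, alpha):
    """Damped least-squares slip inversion:
    minimize ||G m - d||^2 + alpha^2 ||m||^2,
    where G maps slip on fault patches to surface displacements
    (a toy Green's matrix) and d holds the observations."""
    n = G.shape[1]
    A = G.T @ G + alpha**2 * np.eye(n)
    return np.linalg.solve(A, G.T @ d)
```

With noise-free, consistent data and alpha = 0 the true slip is recovered exactly; increasing alpha trades data fit for a smaller (smoother, in spatially regularized variants) slip model, which is exactly the trade-off that sparse near-field coverage makes delicate for shallow slip.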
Sarah Minson <email@example.com>
Benjamin Brooks <firstname.lastname@example.org>
Jessica Murray <email@example.com>
Secondary hazards, such as landslides and liquefaction, can result in significant losses but are not often included in hazard and loss assessments. Developing reliable models requires consistent documentation of event inventories and new advances in modeling techniques. General approaches to hazard modeling include physical, statistical, and heuristic models. Losses are typically accounted for through the use of fragility curves. The choice of modeling strategy is often determined by the application. For example, rapid response products may require different strategies than long-term hazard assessments. The quality of inventories that are used for model development is highly variable, both in terms of the resolution and the attributes that are documented. This creates a challenge for developing and assessing models across many events. We encourage contributions on 1) hazard and loss modeling strategies, including probabilistic and event-specific analyses; 2) datasets, including both inventories and complementary data, such as estimates of seismic loading and geologic/climatic conditions; and 3) issues related to synthesizing inconsistent datasets.
Eric Thompson <firstname.lastname@example.org>
Kate Allstadt <email@example.com>
The region along and within the eastern side of the Sierra Nevada Mountains, from the Mojave Desert to southern Oregon, and extending a couple hundred kilometers into Nevada, is a prominent region of high hazard on the National Seismic Hazard Map. The region is characterized by distributed extensional and strike-slip faults. It experienced earthquakes with magnitudes ~7 in 1872, 1915, 1932, and 1954. Geodetic studies find that ~20-25% of the total deformation between the interiors of the North American and Pacific plates takes place in this region. Much has been learned about the seismic hazards of the region. However, the faults that accommodate the deformation may not all be identified, and slip rates and earthquake magnitudes on many known faults have large uncertainties. Some strain is partitioned through southern Nevada from the Walker Lane to the Wasatch fault system, but the mechanism and rate of this process, and its implications for hazards in Las Vegas, are uncertain. Heterogeneous path effects due to the Basin and Range geology, together with highly variable site conditions, complicate ground motion prediction. Normal faults dip beneath the urban areas, including the spectacular Genoa-Mt. Rose system that dips beneath Carson City and Reno and the Frenchman Mountain fault that dips beneath Las Vegas, but the fault and basin geometries, expected ground motions, and other urban earthquake-related hazards require more research. This session invites contributions related to all aspects of the seismic hazard of the region, including paleoseismology, seismicity, geodesy, and ground motions.
John Anderson <firstname.lastname@example.org>
John Louie <email@example.com>
Richard Koehler <firstname.lastname@example.org>
Corné Kreemer <email@example.com>
Wanda Taylor <firstname.lastname@example.org>
We invite any contribution that addresses the broad topic areas of seismo-acoustics or infrasound. This session intends to capture any research associated with atmospheric infrasound including sensors, signal analysis, propagation studies, and modeling in addition to any seismological research that isn’t capped at the free surface.
Stephen Arrowsmith <email@example.com>
Omar Marcillo <firstname.lastname@example.org>
In the past decade, the passage of the USArray across the United States and the drastic increase in induced seismicity in the mid-continent have provided unprecedented opportunities to understand the physical and seismotectonic state of the lithosphere. We hope to bring together researchers interested in the state of stress, crustal and upper mantle structure and the interaction between ancient features and modern strain, intraplate fault properties, and ultimately the geodynamic context of both natural and induced seismicity in continental interiors, both in North America and worldwide.
Will Levandowski <email@example.com>
Christine Powell <firstname.lastname@example.org>
Oliver Boyd <email@example.com>
Short- and Long-Term Deformation on Active Faults: Integrating Geodetic, Geologic and Seismic Constraints on Slip Rates and Off-fault Deformation in the Walker Lane and Beyond
Developing a complete and accurate understanding of the hazards posed by earthquakes increasingly relies on a framework that integrates data from geologic, seismic, and geodetic studies of faults. Each of these datasets has key strengths and weaknesses, which can be addressed through comparison with the others. A growing body of evidence based on such comparisons suggests that some fraction of the plate boundary zone deformation budget is aseismic or occurs in a manner not recorded in the geologic or seismic record of slip on the traces of major faults. Similarly, it is often unclear how geodetically measured surface strain relates to earthquake potential. Such disagreements thus represent opportunities to gain a better understanding of the processes and strain budgets of earthquakes, and of the modes of active crustal deformation. This session will focus on new constraints and analyses of the patterns, styles, and rates of deformation that form the basis for comparisons of fault behavior across the earthquake cycle, from strain accumulation to release. We welcome contributions on earthquake geology, paleoseismology, neotectonics, LiDAR, GPS, InSAR, or other measurements that improve understanding of the behavior of active faults, especially integrative studies that shed new light on complete deformation budgets.
William Hammond <firstname.lastname@example.org>
Rich Briggs <email@example.com>
Rich Koehler <firstname.lastname@example.org>
Corné Kreemer <email@example.com>
Theoretical and Methodological Innovations for 3D/4D Seismic Imaging of Near-Surface, Crustal, and Global Scales
This session will focus on recent theoretical and methodological developments in seismic imaging and monitoring (i.e., time-resolved imaging) techniques to better understand the Earth's structure on various scales. In recent years, imaging techniques have developed rapidly thanks to the advent of high-density networks, new modeling techniques, and unprecedented computational capacity. Examples include seismic noise surface wave tomography, precise location of the sources of microseismic events, and full-waveform inversion with active sources. However, significant problems, both well-known and lesser-known, remain. These include model resolution, uncertainty, repeatability, nonlinearity, and non-uniqueness. We invite novel approaches for solving common practical problems in 3D/4D imaging. In this regard, we welcome innovations and advances in physics-based imaging, 3D/4D tomography, waveform tomography, effects of structure outside the modeled region, uneven ray coverage, new migration techniques, advanced signal processing, multi-component seismic noise correlations, monitoring and locating of velocity changes, and joint inversion of multiple geophysical observations. Studies that compare real-Earth results obtained using different methods, and that assess repeatability, are particularly encouraged, as are methods of unique seismic data acquisition and instrumental studies.
Marco Pilz <firstname.lastname@example.org>
Nori Nakata <email@example.com>
Tsunami Resilience Strategies: Application of Tsunami Science and Mitigation Advancements to Protect Communities
New techniques and technologies are being developed to mitigate the impacts of tsunami hazards. Evaluations of recent tsunamis have led to an influx of both new scientific data and improved community preparedness and planning activities. The availability of detailed geological and geophysical data sets from past and recent events has improved tsunami source characterization, numerical analyses, and hazard mapping. Advanced tsunami engineering, vulnerability, and risk analysis products are being utilized by emergency managers, maritime communities, land-use planners, and design and building code producers. The use of real-time tsunami hazard analysis products and decision-making tools, alongside community preparedness, will improve life safety worldwide. This session will focus on new tsunami hazard assessment information, the application of techniques and technologies to mitigation efforts, and their effect on community resilience.
Rick Wilson <Rick.Wilson@conservation.ca.gov>
Kevin Miller <Kevin.Miller@caloes.ca.gov>
Lori Dengler <Lori.Dengler@humboldt.edu>
Recent years have seen a proliferation of passive seismic methods for assessing velocities at upper-crustal depths of 0.1 to 10 kilometers. These new methods are starting to find application to the imaging of upper-crustal geologic structures. Two of these applications of passive imaging are becoming popular: 1) obtaining the shear-velocity structure of urban basins for prediction of earthquake shaking; and 2) imaging stratigraphy and structure of sedimentary basins containing fluid energy reserves such as oil, gas, and geothermal brine. In the earthquake-hazard application, SPAC and Deep ReMi surveys are adding to well-established surface-wave dispersion analyses of empirical Green’s functions (EGFs) derived from passive data recorded on monitoring networks. In the energy-exploration application, EGFs derived from passive recordings are augmenting the well-established virtual-source techniques that use EGFs derived from active-source records. Some passive exploration efforts are able to derive P-wave reflections from the EGFs, in addition to the standard surface-wave group velocities. For this session, we welcome contributions on passive-imaging theory, recording and data-processing techniques, or imaging results, with a focus on upper-crustal depths of 0.1 to 10 kilometers. Case histories showing how passive imaging can improve hazard mapping, or how it compares to active-source imaging results, are particularly welcome.
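The core of the EGF approaches mentioned above is a station-pair cross-correlation of ambient noise. A minimal sketch, omitting the spectral whitening, one-bit normalization, and long-duration stacking used in practice (equal-length traces assumed; the function name is illustrative):

```python
import numpy as np

def egf_from_noise(trace_a, trace_b, nlags):
    """Estimate an empirical Green's function between two stations by
    cross-correlating their (equal-length) ambient-noise records.
    Returns lags from -nlags to +nlags samples; a wave traveling from
    station B to station A shows up at a positive lag."""
    n = len(trace_a)
    xc = np.correlate(trace_a - trace_a.mean(),
                      trace_b - trace_b.mean(), mode="full")
    mid = n - 1                                  # index of zero lag
    return xc[mid - nlags: mid + nlags + 1]      # acausal + causal lags
```

With a sufficiently long and diffuse noise field, the causal and acausal sides of this correlation converge toward the inter-station Green's function, whose surface-wave dispersion is what the SPAC, ReMi, and virtual-source methods above exploit.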
John Louie <firstname.lastname@example.org>
Ileana Tibuleac <email@example.com>