Simulating 800,000 years of California earthquake history to pinpoint risks

From the Texas Advanced Computing Center

Posted on January 25th, 2021 by Aaron Dubrow

A randomly selected 3,000-year segment of the physics-based simulated catalog of earthquakes in California, created on Frontera. [Credit: Kevin Milner, University of Southern California]

Fortunately, massive earthquakes are rare. But that very scarcity of data leaves us in some ways blind to their risks, especially when it comes to determining the hazard for a particular location or structure.

“We haven’t observed most of the possible events that could cause large damage,” said Kevin Milner, a computer scientist and seismology researcher at the University of Southern California’s Southern California Earthquake Center (SCEC).

“Using southern California as an example, we haven’t had a really big earthquake since 1857 – that was the last time the southern San Andreas ruptured in a massive magnitude 7.9 earthquake. A San Andreas earthquake could affect a much larger area than the 1994 Northridge earthquake, and other large earthquakes can occur as well. We worry about that.”

The traditional ways to get around this lack of data are to dig trenches to learn about past ruptures, to piece together information from many earthquakes around the world into a statistical hazard model, or to use supercomputers to simulate a specific earthquake in a specific location with a high degree of fidelity.

A 3D view of a particularly complex multi-fault rupture from the synthetic earthquake catalog. [Credit: Kevin Milner, University of Southern California]

However, a new framework for predicting the likelihood and impact of earthquakes across an entire region, developed over the past decade by a team of researchers affiliated with SCEC, strikes a middle ground and may offer a better way to determine risk.

A new study by Milner and Bruce Shaw of Columbia University, published in the Bulletin of the Seismological Society of America in January 2021, presents results from a prototype rate-state earthquake simulator, RSQSim, that simulates hundreds of thousands of years of earthquake history in California. Coupled with another code, CyberShake, the framework can calculate the amount of shaking that would occur in each quake. The results compare well with historical earthquakes and with the output of other methods, and show a realistic distribution of earthquake probabilities.

According to the developers, the new approach improves the ability to determine exactly how big an earthquake could be in a given location, and enables building code developers, architects, and civil engineers to design more resilient buildings that can survive earthquakes in a given location.

“For the first time we have an entire pipeline from start to finish where earthquakes and ground motion simulations are based on physics,” said Milner. “It can simulate up to 100,000 years on a really complicated fault system.”

APPLYING MASSIVE COMPUTER POWER TO BIG PROBLEMS

RSQSim converts mathematical representations of the geophysical forces at work in earthquakes – the standard model of how ruptures nucleate and propagate – into algorithms, and then solves them on some of the most powerful supercomputers in the world. The computationally intensive research was made possible over several years by government-sponsored supercomputers, including Frontera at the Texas Advanced Computing Center – the most powerful system at any university in the world – Blue Waters at the National Center for Supercomputing Applications, and Summit at the Oak Ridge Leadership Computing Facility.
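The “rate-state” in RSQSim’s name refers to rate-and-state friction, the laboratory-derived law describing how a fault’s frictional strength depends on its slip rate and its sliding history. As general background – shown here in a standard Dieterich–Ruina form with the so-called aging law, not necessarily the exact parameterization implemented in RSQSim – the friction coefficient and the fault’s state variable evolve as

\[
\mu(V,\theta) = \mu_0 + a \ln\frac{V}{V_0} + b \ln\frac{V_0\,\theta}{D_c},
\qquad
\frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c},
\qquad
\tau = \mu(V,\theta)\,\sigma_n,
\]

where V is the slip rate, θ is a state variable tracking the fault’s contact history, μ_0 and V_0 are reference values, D_c is a characteristic slip distance, a and b are empirical constants, and σ_n is the normal stress on the fault. Simulators of this class use efficient approximations of equations like these to step a whole fault system forward through hundreds of thousands of years.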

“One way to better predict risk is through physics-based modeling, using the power of systems like Frontera to run simulations,” said Milner. “Instead of an empirical statistical distribution, we simulate the occurrence of earthquakes and the propagation of their waves.”

“We have made a lot of progress on Frontera in determining what kind of earthquakes we can expect, on which fault, and how often,” said Christine Goulet, executive director for applied science at SCEC, who was also involved in the work. “We don’t prescribe or tell the code when the earthquakes are going to happen. We launch a simulation of hundreds of thousands of years and just let the code transfer the stress from one fault to another.”

The simulations began with the geological topography of California and simulated, over 800,000 virtual years, how stresses build up and dissipate as tectonic forces act on the Earth. From these simulations, the framework generated a catalog – a record that an earthquake occurred at a specific place with a specific magnitude and attributes at a specific time. The catalog that the SCEC team produced on Frontera and Blue Waters is among the largest ever made, Goulet said. The outputs from RSQSim were then fed into CyberShake, which in turn used computer models of geophysics to predict how much shaking (in terms of ground acceleration, velocity, and duration) each quake would cause.
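To make the data flow concrete, here is a minimal Python sketch of the catalog step described above. All names and fields are hypothetical – the article does not describe RSQSim’s actual output format – but the shape is the same: a long synthetic record of events (time, location, magnitude), from which even the rates of rare, large earthquakes can be estimated. The CyberShake step, which computes shaking for each event, is not shown.

from dataclasses import dataclass
from typing import List

@dataclass
class CatalogEvent:
    """One synthetic earthquake from a long simulated history (hypothetical fields)."""
    time_years: float    # occurrence time within the simulated history
    latitude: float
    longitude: float
    depth_km: float
    magnitude: float
    fault_section: str   # which modeled fault section ruptured

def mean_recurrence_interval(catalog: List[CatalogEvent],
                             min_magnitude: float,
                             catalog_length_years: float) -> float:
    """Average number of years between events at or above a magnitude threshold.

    This is the kind of simple statistic a long synthetic catalog enables:
    with hundreds of thousands of simulated years, even rare, large events
    occur often enough to estimate how frequently they recur.
    """
    count = sum(1 for event in catalog if event.magnitude >= min_magnitude)
    if count == 0:
        return float("inf")  # no qualifying events in the simulated span
    return catalog_length_years / count

# Toy example: a three-event catalog spanning 10,000 simulated years.
toy_catalog = [
    CatalogEvent(1200.0, 34.1, -117.5, 8.0, 7.8, "San Andreas (Mojave)"),
    CatalogEvent(4650.0, 34.2, -118.5, 12.0, 6.7, "Northridge-like thrust"),
    CatalogEvent(9100.0, 33.9, -116.2, 9.5, 7.2, "San Jacinto"),
]
print(mean_recurrence_interval(toy_catalog, min_magnitude=7.0, catalog_length_years=10_000.0))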

“The framework outputs a full slip-time history: where a rupture occurs and how it grew,” explained Milner. “We found that it produces realistic ground motions, which tells us that the physics implemented in the model is working as intended.” Further work is planned to validate the results, which will be critical before they are adopted for design applications.
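A “slip-time history” records, for every patch of the fault surface, when that patch began slipping and how its slip accumulated as the rupture spread. A minimal illustrative container in Python – field names hypothetical, and far simpler than any real simulator output – might look like this:

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class SlipTimeHistory:
    """Illustrative container for a single rupture's slip evolution."""
    # patch id -> (latitude, longitude, depth_km) of the fault patch
    patch_locations: Dict[int, Tuple[float, float, float]] = field(default_factory=dict)
    # patch id -> [(time_seconds, cumulative_slip_meters), ...] samples
    slip_samples: Dict[int, List[Tuple[float, float]]] = field(default_factory=dict)

    def final_slip(self, patch_id: int) -> float:
        """Total slip on a patch once the rupture has finished."""
        samples = self.slip_samples.get(patch_id, [])
        return samples[-1][1] if samples else 0.0

    def rupture_arrival_time(self, patch_id: int) -> float:
        """Time at which a patch first begins to slip (how the rupture grew)."""
        samples = self.slip_samples.get(patch_id, [])
        return samples[0][0] if samples else float("inf")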

Read the full article here.
