For California residents, earthquakes are part of everyday life. While the state experiences small quakes on a daily basis, the threat of a catastrophic quake is always present. The hardest part of the problem is that scientists can't predict when or where an earthquake will strike.
They do, however, have the ability to help us prepare for quakes by developing early-warning systems, conducting earthquake preparedness drills, engineering buildings for earthquake resilience, and conducting scientific simulations to shed light on what will happen when the earth starts shaking.
This is where the Southern California Earthquake Center (SCEC) enters the picture. The SCEC, based at the University of Southern California, coordinates fundamental research on earthquake processes using Southern California as its principal natural laboratory. It supports core research and education in seismology, tectonic geodesy, earthquake geology, and computational science.
“We study earthquakes — why they occur, how they occur and what kind of impact they have,” says Christine Goulet, Ph.D., Executive Director for Applied Science at SCEC. “We can’t predict earthquakes at this time, but we can prepare better if we know what to expect.”
For their scientific explorations, Dr. Goulet and her colleagues draw on the computational power of supercomputers to simulate what happens when geological faults rupture and the ground starts moving.
“We have a sense of how many large earthquakes will occur, and what their magnitudes will be, and so on, over the next several hundred years,” Dr. Goulet says. “But we don’t know exactly when they will occur and what they will look like and what they will do to our infrastructure. And that’s where we can conduct simulations to really help with this. We run simulations to better understand what happens with earthquakes over time and also in a specific location.”
The power of HPC
For their earthquake simulations, the researchers at the SCEC use some of our nation's most powerful supercomputers. These high-performance computing resources include multiple systems from Dell Technologies, including the Frontera supercomputer at the Texas Advanced Computing Center (TACC).
Fueled by a $60 million award from the National Science Foundation (NSF), Frontera debuted on the TOP500 list in June 2019 as the nation's fastest academic supercomputer, with a peak-performance rating of 38.7 petaFLOPS. Dell Technologies provided the primary computing system for Frontera, based on Dell EMC PowerEdge C6420 servers. In all, the system has more than 8,000 two-socket nodes, more than 16,000 Intel® Xeon® Scalable Processors and 448,448 cores.
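The node, processor, and core figures above are easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming exact counts of 8,008 two-socket nodes and 28-core processors (the article itself only states "more than 8,000" and "more than 16,000", so these specific values are an assumption):

```python
# Back-of-the-envelope check of Frontera's processor and core counts.
# Assumed figures (not stated exactly in the article): 8,008 two-socket
# compute nodes, each socket a 28-core Intel Xeon Scalable processor.
nodes = 8_008
sockets_per_node = 2
cores_per_socket = 28

sockets = nodes * sockets_per_node   # total processors across the system
cores = sockets * cores_per_socket   # total CPU cores across the system

print(f"{sockets:,} processors, {cores:,} cores")
# → 16,016 processors, 448,448 cores
```

Under those assumptions the totals line up with the article's "more than 16,000" processors and 448,448 cores.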
Dr. Goulet and her colleagues run two classes of simulations on Frontera — a big-picture long-range view and a near-term view. The big-picture view models earthquakes as they happen over hundreds of thousands of years, while the near-term view zeroes in on what happens in a single event.
“The only way we can achieve these types of simulations is by using high-performance computers,” Dr. Goulet says. “And a lot of our computational research at SCEC is driven by lots of researchers working together, and we need a fast turnaround to be able to discuss the results and improve them. So it’s important for us to have access to these high-performance computers, such as Frontera.”
To Learn More
For a closer look at SCEC, read the case study. To explore the capabilities of the Frontera supercomputer at the Texas Advanced Computing Center, see the Dell Technologies case study "A New 'Frontera'." And to learn about other critical workloads running on the HPC systems at TACC, see the blog "Urgent Computing at the Texas Advanced Computing Center from Dell Technologies and Intel."
Explore more HPC Solutions from Dell Technologies and Intel.
TOP500 List, June 2019.