UCSC seismologists are using deep learning to forecast earthquake aftershocks, with a model that scales to today's far larger seismic catalogs

In the above map from the Southern California Earthquake Data Center, some of the individual pixels represent thousands of earthquakes.

Earthquake aftershock forecasting models have remained largely unchanged for more than 30 years. These models work well with limited data but struggle with the vastly larger seismic datasets that are now available. To overcome this limitation, researchers from the University of California, Santa Cruz, and the Technical University of Munich have developed a new model called Recurrent Earthquake foreCAST (RECAST). This model uses deep learning and is more flexible and scalable than existing earthquake forecasting models.
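RECAST is a recurrent deep learning model, and models of this kind are typically built as neural temporal point processes: a recurrent network reads the history of past earthquakes and outputs a probability distribution over when the next event will occur. The sketch below is a minimal illustration of that general idea, not the authors' implementation; the architecture choices, such as the GRU encoder and the exponential waiting-time distribution, are assumptions made for the example.

```python
# Minimal sketch of a recurrent neural temporal point process, the general
# family of models that RECAST belongs to. The details below (hidden size,
# exponential inter-event distribution) are illustrative assumptions, not
# the authors' exact implementation.
import torch
import torch.nn as nn

class NeuralPointProcess(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        # Each event is summarized by its inter-event time and magnitude.
        self.rnn = nn.GRU(input_size=2, hidden_size=hidden_size, batch_first=True)
        # Map the hidden state to the rate of an exponential distribution
        # over the waiting time until the next event.
        self.rate_head = nn.Sequential(nn.Linear(hidden_size, 1), nn.Softplus())

    def forward(self, inter_times, magnitudes):
        # inter_times, magnitudes: tensors of shape (batch, sequence_length)
        features = torch.stack([inter_times, magnitudes], dim=-1)
        hidden, _ = self.rnn(features)               # (batch, seq, hidden)
        rates = self.rate_head(hidden).squeeze(-1)   # (batch, seq)
        return rates

    def neg_log_likelihood(self, inter_times, magnitudes):
        # Standard point-process likelihood: log-rate at each observed event
        # minus the integrated rate over the waiting intervals.
        rates = self.forward(inter_times, magnitudes)
        next_waits = inter_times.roll(-1, dims=1)[:, :-1]
        rates = rates[:, :-1]
        log_lik = torch.log(rates) - rates * next_waits
        return -log_lik.sum()
```

In a model like this, the learned hidden state plays the role that a handful of hand-fitted parameters play in the older approach, which is part of what makes it flexible enough to absorb very large catalogs.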

The scientists published a paper in Geophysical Research Letters showing that the new model outperforms the existing standard, known as the Epidemic Type Aftershock Sequence (ETAS) model, for earthquake catalogs of about 10,000 events or more.
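For context, the ETAS model describes the earthquake rate at any moment as a constant background rate plus contributions from every previous event, each decaying in time according to the Omori-Utsu law. The standard temporal form of the model (after Ogata, 1988) is reproduced below; it is textbook notation rather than an excerpt from the new paper.

```latex
% Standard temporal ETAS conditional intensity: background rate plus
% Omori-Utsu-decaying contributions from all past earthquakes.
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\;
\sum_{i \,:\, t_i < t} \frac{K\, e^{\alpha (m_i - M_0)}}{\left(t - t_i + c\right)^{p}}
```

Here μ is the background seismicity rate, t_i and m_i are the times and magnitudes of past earthquakes, M_0 is the catalog's magnitude cutoff, and K, α, c, and p are parameters fitted to the data; that small, fixed set of parameters is part of why the model is hard to stretch over million-event catalogs.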

“The ETAS model approach was designed for the observations that we had in the 80s and 90s when we were trying to build reliable forecasts based on very few observations,” said Kelian Dascher-Cousineau, the lead author of the paper who recently completed his Ph.D. at UC Santa Cruz. “It’s a very different landscape today.” Now, with more sensitive equipment and larger data storage capabilities, earthquake catalogs are much larger and more detailed.

“We’ve started to have million-earthquake catalogs, and the old model simply couldn’t handle that amount of data,” said Emily Brodsky, a professor of earth and planetary sciences at UC Santa Cruz and co-author on the paper. One of the main challenges of the study was not designing the new RECAST model itself but getting the older ETAS model to work on huge data sets to compare the two. 

“The ETAS model is kind of brittle, and it has a lot of very subtle and finicky ways in which it can fail,” said Dascher-Cousineau. “So, we spent a lot of time making sure we weren’t messing up our benchmark compared to actual model development.”

To continue applying deep learning models to aftershock forecasting, Dascher-Cousineau says the field needs a better system for benchmarking. To demonstrate the capabilities of the RECAST model, the group first used an ETAS model to simulate an earthquake catalog. After working with the synthetic data, the researchers tested the RECAST model using real data from the Southern California earthquake catalog.
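Simulating a synthetic ETAS catalog is typically done with a branching, or cascade, procedure: background events are drawn from a Poisson process, and each event then spawns its own aftershocks according to the productivity and Omori-Utsu laws. The sketch below illustrates that procedure with made-up parameter values; it is a generic ETAS simulator, not the code used in the study.

```python
# Hedged sketch of simulating a synthetic ETAS catalog by branching:
# background events are Poisson in time, and each event independently
# triggers aftershocks whose times follow the Omori-Utsu decay.
# All parameter values are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def simulate_etas(mu=0.2, K=0.1, alpha=1.0, c=0.01, p=1.2,
                  m0=3.0, b=1.0, t_end=1000.0):
    # Background events: homogeneous Poisson process with rate mu.
    n_bg = rng.poisson(mu * t_end)
    times = list(rng.uniform(0.0, t_end, n_bg))
    # Gutenberg-Richter magnitudes above the cutoff m0.
    mags = list(m0 + rng.exponential(1.0 / (b * np.log(10)), n_bg))

    queue = list(zip(times, mags))
    while queue:
        t_par, m_par = queue.pop()
        # K here is the expected number of direct aftershocks of a
        # magnitude-m0 parent (productivity law).
        n_child = rng.poisson(K * np.exp(alpha * (m_par - m0)))
        for _ in range(n_child):
            # Waiting time drawn from the Omori-Utsu kernel (requires p > 1).
            u = rng.uniform()
            dt = c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)
            t_child = t_par + dt
            if t_child < t_end:
                m_child = m0 + rng.exponential(1.0 / (b * np.log(10)))
                times.append(t_child)
                mags.append(m_child)
                queue.append((t_child, m_child))

    order = np.argsort(times)
    return np.asarray(times)[order], np.asarray(mags)[order]

times, mags = simulate_etas()
print(f"simulated catalog with {len(times)} events")
```

Competing forecasting models are then typically scored by how much probability they assign to the held-out events in such a catalog, which is the kind of benchmark the comparison above relies on.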

They found that the RECAST model — which can, essentially, learn how to learn — performed slightly better than the ETAS model at forecasting aftershocks, particularly as the amount of data increased. It also required significantly less computational effort and time for larger catalogs.

This is not the first time scientists have tried using machine learning to forecast earthquakes, but until recently, the technology was not quite ready, said Dascher-Cousineau. New advances in machine learning make the RECAST model more accurate and easily adaptable to different earthquake catalogs.

The model’s flexibility could open up new possibilities for earthquake forecasting. With the ability to adapt to large amounts of new data, models that use deep learning could potentially incorporate information from multiple regions at once to make better forecasts about poorly studied areas.

“We might be able to train on New Zealand, Japan, California and have a model that's quite good for forecasting somewhere where the data might not be as abundant,” said Dascher-Cousineau.
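One way to realize that idea would be to pool catalogs from several well-instrumented regions into a single training set and then evaluate the model on a region with a much sparser record. The workflow below is purely illustrative: the file names and catalog format are hypothetical, and it reuses the NeuralPointProcess class sketched earlier in this piece.

```python
# Hypothetical multi-region workflow: pool catalogs from well-instrumented
# regions to train one model, then apply it to a region with a sparser
# catalog. File names and the two-column catalog format are assumptions,
# and the NeuralPointProcess class from the earlier sketch is assumed
# to be in scope.
import numpy as np
import torch

def load_catalog(path):
    # Assumed two-column text format: event time (days), magnitude.
    data = np.loadtxt(path)
    times, mags = data[:, 0], data[:, 1]
    inter_times = np.diff(times, prepend=times[0])
    return (torch.tensor(inter_times, dtype=torch.float32).unsqueeze(0),
            torch.tensor(mags, dtype=torch.float32).unsqueeze(0))

training_regions = ["new_zealand.txt", "japan.txt", "california.txt"]  # hypothetical
model = NeuralPointProcess()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(100):
    for path in training_regions:
        inter_times, mags = load_catalog(path)
        loss = model.neg_log_likelihood(inter_times, mags)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Evaluate on a region with far fewer recorded events.
inter_times, mags = load_catalog("sparse_region.txt")  # hypothetical
print("held-out negative log-likelihood:",
      model.neg_log_likelihood(inter_times, mags).item())
```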

Using deep-learning models will also eventually allow researchers to expand the type of data they use to forecast seismicity.

“We’re recording ground motion all the time,” said Brodsky. “So the next level is to use all of that information, not worry about whether we’re calling it an earthquake or not an earthquake but to use everything.”

In the meantime, the researchers hope the model sparks discussions about the possibilities of the new technology.

“It has all of this potential associated with it,” said Dascher-Cousineau. “Because it is designed that way.”

The use of deep learning by UC Santa Cruz seismologists to forecast aftershocks marks a notable shift in a field whose models have changed little in decades. RECAST is more flexible and scalable than the long-standing ETAS approach, matches or exceeds its performance on large catalogs, and opens the door to training across regions and drawing on richer streams of seismic data. With further research and better benchmarking, deep learning could become a valuable tool for understanding and preparing for earthquake sequences.