New data will help predict shaking experienced in earthquakes

The findings of a new paper published this week will help predict the shaking Wellington, New Zealand, can expect to experience in earthquakes and shed light on why the city saw so much damage from the 2016 Kaikoura quake.

The paper, by Master of Science student Alistair Stronach and Professor Tim Stern from the School of Geography, Environment, and Earth Sciences at Te Herenga Waka—Victoria University of Wellington, New Zealand, shows the thickness of soft sediments beneath Wellington city is up to two times greater than previously thought.

“When earthquake waves pass through layers of sediment—as opposed to basement rock—they increase in intensity and lead to more shaking. This can have a devastating effect on cities, even when earthquakes are located several hundred kilometers away,” Professor Stern said.

The 2016 Kaikoura earthquake produced strong waves that became “trapped” in the sediment basin beneath Wellington, causing unexpected damage in the Pipitea and CentrePort area of the city, he said.

“Fortunately, no lives were lost but several high-rise buildings had to be demolished and the wharf at CentrePort was so badly damaged it was out of commission for months.”

The vulnerability of this area to seismic waves stems from both the depth of the sediment and the fact it is mostly reclaimed land.

Data from the research will be used in future supercomputer simulations to predict the shaking that may be expected in different areas of Wellington city.

“These simulations are vital in planning for building design and identifying parts of the city most vulnerable to intense shaking from both local and distant earthquakes,” Mr. Stronach said.

The research, funded by the Earthquake Commission and published in the New Zealand Journal of Geology and Geophysics, used high-precision measurements of the Earth’s gravity field to make a map of the sedimentary thickness beneath Wellington city.

Measurements were made with a state-of-the-art gravity meter, which can pinpoint gravity differences to one part in 100 million.

“We took measurements throughout Wellington’s central business district and along the outer hills of the city. We identified a maximum thickness of about 540m near the Wellington Regional Stadium, which is twice previous estimates,” Mr. Stronach said.
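As a rough illustration of the physics behind such a gravity survey, the thickness of a low-density sediment layer can be related to the gravity anomaly it produces through the simple infinite-slab (Bouguer) approximation. The sketch below is illustrative only; the density contrast and anomaly values are assumed, and the published study uses a full inversion of the gravity data rather than this one-line formula.

```python
# Illustrative sketch: estimating sediment thickness from a gravity anomaly
# with the infinite-slab (Bouguer) approximation. The density contrast and
# anomaly magnitude are assumed values, not figures from the study.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
MGAL_TO_MS2 = 1e-5   # 1 milligal = 1e-5 m/s^2

def slab_thickness(anomaly_mgal: float, density_contrast_kgm3: float) -> float:
    """Thickness (m) of a uniform slab that produces the given gravity anomaly."""
    delta_g = abs(anomaly_mgal) * MGAL_TO_MS2
    return delta_g / (2.0 * math.pi * G * density_contrast_kgm3)

# A hypothetical ~11 mGal anomaly over sediments 500 kg/m^3 lighter than the
# basement rock corresponds to a slab roughly 540 m thick.
print(f"{slab_thickness(11.3, 500.0):.0f} m")
```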

The research also mapped an extension of the recently discovered Aotea Fault as it passes from the harbor near Clyde Quay Wharf to under Waitangi Park, before heading south, roughly along the line of Kent Terrace.

“Based on our modeling, this fault has several splays—or limbs—across the lower slopes of Mt Victoria and shows up as a steep step in the basement rock beneath the Te Aro part of downtown Wellington,” Mr. Stronach said.

3D fault information improves alert accuracy for earthquake early warning

Three-dimensional fault models are generally more accurate than two-dimensional line models at sending ground shaking alerts to the correct areas as part of an earthquake early warning system, according to a new study.

The benefits of 3D fault models vary depending on the fault style (a strike-slip versus a reverse fault, for instance), whether the event is a subduction or a crustal earthquake, and the level of shaking that triggers the alert, according to Jessica Murray and colleagues at the U.S. Geological Survey.

They suggest 3D models would be an improvement over 2D models for an alert threshold of MMI 4.5, meaning that the alert would be triggered for shaking exceeding the “light” intensity category, where most people indoors would feel some shaking. In their study, 3D models also substantially improved alert accuracy for all subduction zone earthquakes at MMI 4.5 and MMI 2.5 (weak motion felt only by a few people) thresholds.

The study’s findings could be useful for earthquake early warning systems like the U.S. West Coast’s ShakeAlert, the researchers note in the Bulletin of the Seismological Society of America.

For now, ShakeAlert’s algorithms use seismic data to characterize an earthquake source as a point or line. But researchers are already looking at ways to incorporate 3D source information, gleaned from fault displacement data collected by Global Navigation Satellite Systems (GNSS), into ShakeAlert.

“The expectation has been that such information would improve alerting because it would offer a better characterization of large earthquake sources compared to a point source,” Murray explained. “This assumption has not been explored in terms of how a more realistic source characterization would translate to ground motion estimates, so that is one thing we set out to do.”

The researchers used synthetic data generated from hypothetical 3D and line sources in their study. While 3D source models were generally more accurate than line sources at alerting the correct regions, the improvement was most pronounced for subduction interface earthquakes. The researchers also uncovered some interesting outcomes within the more detailed findings. For instance, at the MMI 2.5 alert threshold, the outcomes for a strike-slip or reverse crustal earthquake are similar whether 3D or point-source representations are used, as long as the location, magnitude, and depth to the top of the seismic rupture are well known.

In this case, adding a 3D representation would not offer much of an advantage over a point source representation, the researchers say, although GNSS data could be useful in places with poor seismic station coverage or seismic data outages, as in the 2019 California Ridgecrest earthquake sequence.

Murray and colleagues also noted that if a line source representation is used, and the earthquake’s magnitude is calculated from the estimated length, an incorrect length can significantly diminish the alert region accuracy.

“Estimated ground motion depends both on earthquake magnitude and a user’s distance to the source, so it’s not too surprising that simultaneously changing both those parameters would have a strong influence on the alert outcomes,” said Murray.
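To illustrate the point (this is a toy intensity relation with placeholder coefficients, not ShakeAlert’s algorithm), consider a site that lies far from a large earthquake’s epicenter but close to the far end of its rupture. Representing the source as a point or as a finite fault changes the distance that enters the ground motion estimate, and with it whether the site crosses an alert threshold.

```python
# Toy illustration (not ShakeAlert's algorithm): the distance term in a simple
# intensity relation changes when a long rupture is represented as a point
# versus as a finite fault. Coefficients and geometry are hypothetical.
import math

def toy_mmi(magnitude: float, distance_km: float) -> float:
    """Placeholder relation: intensity rises with magnitude, falls with log distance."""
    return 2.0 + 1.3 * magnitude - 4.0 * math.log10(max(distance_km, 1.0))

magnitude = 7.5                 # hypothetical large crustal earthquake
dist_to_epicenter_km = 140.0    # site sits far from the epicenter...
dist_to_rupture_km = 10.0       # ...but close to the far end of the rupture

threshold = 4.5                 # MMI alert threshold discussed in the study
for label, dist in (("point source", dist_to_epicenter_km),
                    ("finite fault", dist_to_rupture_km)):
    mmi = toy_mmi(magnitude, dist)
    print(f"{label}: estimated MMI {mmi:.1f}, alert issued: {mmi >= threshold}")
```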

When magnitude estimates made from line source models in ShakeAlert don’t match the earthquake catalog magnitude, however, the mismatch might reflect “actual variations in stress drop, which would, in turn, affect shaking,” she added. “It is especially important to explore this topic further using data from real events, which is one focus of our ongoing work.”

German researchers study cosmic expansion using methods from many-body physics

It is almost always assumed in cosmological calculations that there is an even distribution of matter in the universe, because the calculations would be much too complicated if the position of every single star were to be included. In reality, the universe is not uniform: in some places there are stars and planets, in others there is just a void. Physicists Michael te Vrugt and Prof. Raphael Wittkowski from the Institute of Theoretical Physics and the Center for Soft Nanoscience (SoN) at the University of Münster in Germany have, together with physicist Dr. Sabine Hossenfelder from the Frankfurt Institute for Advanced Studies (FIAS), developed a new model for this problem. Their starting point was the Mori-Zwanzig formalism, a method for describing systems consisting of a large number of particles with a small number of measurands.

Image caption: A representation of the evolution of the universe over 13.77 billion years. The far left depicts the earliest moment we can now probe, when a period of "inflation" produced a burst of exponential growth in the universe. (Size is depicted by the vertical extent of the grid in this graphic.) For the next several billion years, the expansion of the universe gradually slowed down. More recently, the expansion has begun to speed up again. © NASA's Goddard Space Flight Center

The theory of general relativity developed by Albert Einstein is one of the most successful theories in modern physics. Two of the last five Nobel Prizes for Physics had associations with it: in 2017 for the measurement of gravitational waves, and in 2020 for the discovery of a black hole at the center of the Milky Way. One of the most important applications of the theory is in describing the cosmic expansion of the universe since the Big Bang. The speed of this expansion is determined by the amount of energy in the universe. In addition to the visible matter, it is above all the dark matter and dark energy which play a role here – at least, according to the Lambda-CDM model currently used in cosmology.

“Strictly speaking, it is mathematically wrong to include the mean value of the universe’s energy density in the equations of general relativity,” says Sabine Hossenfelder. The question now is how “bad” this mistake is. Some experts consider it irrelevant; others see in it the solution to the enigma of dark energy, whose physical nature is still unknown. An uneven distribution of mass in the universe may affect the speed of cosmic expansion.
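Schematically, the problem Hossenfelder describes can be stated with the standard (flat) Friedmann equation, which is usually evaluated with the spatially averaged matter density. The lines below are a textbook statement of this averaging issue, not equations taken from the new paper.

```latex
% Flat Friedmann equation with a cosmological constant, written with the
% spatially averaged matter density (the usual Lambda-CDM shortcut):
\[
  H^2 = \left(\frac{\dot a}{a}\right)^2
      = \frac{8\pi G}{3}\,\langle\rho\rangle + \frac{\Lambda}{3}.
\]
% Einstein's equations are nonlinear in the metric, so spatial averaging does
% not commute with their evolution:
\[
  \big\langle G_{\mu\nu}[g] \big\rangle \neq G_{\mu\nu}\big[\langle g \rangle\big],
\]
% which is why feeding the mean density into the equations is, strictly
% speaking, an approximation whose consequences the new model quantifies.
```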

“The Mori-Zwanzig formalism is already being successfully used in many fields of research, from biophysics to particle physics,” says Raphael Wittkowski, “so it also offered a promising approach to this astrophysical problem.” The team generalized this formalism so that it could be applied to general relativity and, in doing so, derived a model for cosmic expansion while taking into consideration the uneven distribution of matter in the universe.

The model makes a concrete prediction for the effect of these so-called inhomogeneities on the speed of the expansion of the universe. This prediction deviates slightly from that given by the Lambda-CDM model and thus provides an opportunity to test the new model experimentally. “At present, the astronomical data are not precise enough to measure this deviation,” says Michael te Vrugt, “but the great progress made – for example, in the measurement of gravitational waves – gives us reason to hope that this will change. Also, the new variant of the Mori-Zwanzig formalism can be applied to other astrophysical problems – so the work is relevant not only to cosmology.”

Harvard Med researchers model future SARS-CoV-2 mutations, forecast their ability to evade antibodies, vaccines

  • Since the study was completed, several of the predicted mutations appeared in omicron, the most recently identified SARS-CoV-2 variant, offering insight into how omicron might be able to escape immune defense generated by mRNA vaccines and monoclonal antibody treatments for COVID-19.
  • The researchers modeled their predictions of future mutations using a combination of variables, including rare mutations documented in immunocompromised patients, existing SARS-CoV-2 genotypes, and the virus’s current molecular structure and behavior.
  • Findings highlight the ability of SARS-CoV-2 to shape-shift, underscoring the likelihood of new variants that contain multiple high-risk mutations and are capable of evading antibody-based treatments and vaccines.
  • The study highlights the urgent need to help curb viral evolution and future mutations through mitigation measures and by ensuring global immunity through mass vaccination.

To predict the future evolutionary maneuvers of SARS-CoV-2, a research team led by investigators at Harvard Medical School has identified several likely mutations that would allow the virus to evade immune defenses, including immunity acquired through natural infection or developed from vaccination, as well as antibody-based treatments.

The study, published Dec. 2 in Science as an accelerated publication for immediate release, was designed to gauge how SARS-CoV-2 might evolve as it continues to adapt to its human hosts and in doing so to help public health officials and scientists prepare for future mutations.

Indeed, as the research was nearing publication, a new variant of concern, dubbed omicron, entered the scene and was subsequently found to contain several of the antibody-evading mutations the researchers predicted in the newly published paper. As of Dec. 1, omicron has been identified in 25 countries in Africa, Asia, Australia, Europe, and North and South America, a list that is growing daily.

The researchers caution that the study findings are not directly applicable to omicron because how this specific variant behaves will depend on the interplay among its own unique set of mutations (at least 30 in the viral spike protein) and on how it competes against other active strains circulating in populations around the world. Nonetheless, the researchers said, the study gives important clues about particular areas of concern with omicron and also serves as a primer on other mutations that might appear in future variants.

“Our findings suggest that great caution is advised with omicron because these mutations have proven quite capable of evading monoclonal antibodies used to treat newly infected patients and antibodies derived from mRNA vaccines,” said study senior author Jonathan Abraham, assistant professor of microbiology in the Blavatnik Institute at HMS and an infectious disease specialist at Brigham and Women’s Hospital. The researchers did not study response to antibodies developed from non-mRNA vaccines.

The longer the virus continues to replicate in humans, Abraham noted, the more likely it is to evolve novel mutations that give it new ways to spread in the face of existing natural immunity, vaccines, and treatments. That means that public health efforts to prevent the spread of the virus, including mass vaccinations worldwide as soon as possible, are crucial both to prevent illness and to reduce opportunities for the virus to evolve, Abraham said.

The findings also highlight the importance of ongoing anticipatory research into the potential future evolution of not only SARS-CoV-2 but other pathogens as well, the researchers said.

“To get out of this pandemic, we need to stay ahead of this virus, as opposed to playing catch up,” said study co-lead author Katherine Nabel, a fifth-year student in the Harvard/MIT MD-PhD Program. “Our approach is unique in that instead of studying individual antibody mutations in isolation, we studied them as part of composite variants that contain many simultaneous mutations at once—we thought this might be where the virus was headed. Unfortunately, this seems to be the case with omicron.”

Many studies have looked at the mechanisms that newly dominant SARS-CoV-2 strains have developed to resist the protective power of antibodies against infection and serious disease.

This past summer, instead of waiting to see what the next new variant might bring, Abraham set out to determine how possible future mutations might impact the virus’s ability to infect cells and to evade immune defenses—work that he did in collaboration with colleagues from HMS, Brigham and Women’s Hospital, Massachusetts General Hospital, Harvard Pilgrim Health Care Institute, Harvard T.H. Chan School of Public Health, Boston University School of Medicine and National Emerging Infectious Diseases Laboratories (NEIDL), and AbbVie Bioresearch Center.

To estimate how the virus might transform itself next, the researchers followed clues in the chemical and physical structure of the virus and looked for rare mutations found in immunocompromised individuals and a global database of virus sequences. In lab-based studies using non-infectious virus-like particles, the researchers found combinations of multiple, complex mutations that would allow the virus to infect human cells while reducing or neutralizing the protective power of antibodies.

The researchers focused on a part of the coronavirus’s spike protein called the receptor-binding domain, which the virus uses to latch on to human cells. The spike protein allows the virus to enter human cells, where it initiates self-replication and, eventually, leads to infection. Most antibodies function by locking on to the same locations on the virus’s spike protein receptor-binding domain to block it from entering cells and causing infection.

Mutation and evolution are a normal part of a virus’s natural history. Every time a new copy of a virus is made, there’s a chance that a copy error—a genetic typo—might be introduced. As a virus encounters selective pressure from the host’s immune system, copy errors that allow the virus to avoid being blocked by existing antibodies have a better chance of surviving and continuing to replicate. Mutations that allow a virus to evade antibodies in this way are known as escape mutations.
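A minimal toy model, not the study’s methodology, makes the selection logic concrete: if an escape variant is even modestly better at surviving antibody neutralization each replication cycle, its share of the viral population grows rapidly over successive generations. The survival probabilities below are hypothetical.

```python
# Toy model of antibody-driven selection (illustrative only). Two genotypes
# replicate; the escape variant survives neutralization more often, so its
# frequency in the population rises generation by generation.

def escape_frequency(generations: int, start_freq: float = 0.001,
                     wt_survival: float = 0.20, escape_survival: float = 0.35) -> float:
    """Frequency of the escape variant after a number of replication cycles."""
    freq = start_freq
    for _ in range(generations):
        wt_weight = (1.0 - freq) * wt_survival
        escape_weight = freq * escape_survival
        freq = escape_weight / (wt_weight + escape_weight)
    return freq

for g in (0, 10, 20, 30):
    print(f"generation {g:2d}: escape variant frequency {escape_frequency(g):.3f}")
```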

The researchers demonstrated that the virus could develop large numbers of simultaneous escape mutations while retaining the ability to connect to the receptors it needs to infect a human cell. The team worked with so-called pseudotype viruses, lab-made stand-ins for a virus constructed by combining harmless, noninfectious virus-like particles with pieces of the SARS-CoV-2 spike protein containing the suspected escape mutations. The experiments showed that pseudotype viruses containing up to seven of these escape mutations are more resistant to neutralization by therapeutic antibodies and serum from mRNA vaccine recipients.

This level of complex evolution had not been seen in widespread strains of the virus at the time the researchers began their experiments. But with the emergence of the omicron variant, this level of complex mutation in the receptor-binding domain is no longer hypothetical. The delta variant had only two mutations in its receptor-binding domain, and the pseudotypes Abraham’s team studied had up to seven; omicron appears to have fifteen, including several of the specific mutations that his team analyzed.

In a series of experiments, the researchers performed biochemical assays to see how antibodies would bind to spike proteins containing escape mutations. Several of the mutations, including some of those found in omicron, enabled the pseudotypes to completely evade therapeutic antibodies, including those found in monoclonal antibody cocktail therapies.

The researchers also found one antibody that was able to neutralize all of the tested variants effectively. However, they also noted that the virus would be able to evade that antibody if the spike protein developed a single mutation that adds a sugar molecule at the location where the antibody binds to the virus. That, in essence, would prevent the antibody from doing its job.

The researchers noted that in rare instances, circulating strains of SARS-CoV-2 have been found to gain this mutation. When this happens, it is likely the result of selective pressure from the immune system, the researchers said. Understanding the role of this rare mutation, they added, is critical to being better prepared before it emerges as part of dominant strains.

While the researchers did not directly study the pseudotype virus’s ability to escape immunity from natural infection, findings from the team’s previous work with variants carrying fewer mutations suggest that these newer, highly mutated variants would also adeptly evade antibodies acquired through natural infection.

In another experiment, the pseudotypes were exposed to blood serum from individuals who had received an mRNA vaccine. For some of the highly mutated variants, serum from single-dose vaccine recipients completely lost the ability to neutralize the virus. In samples taken from people who had received a second dose of vaccine, the vaccine retained at least some effectiveness against all variants, including some extensively mutated pseudotypes.

The researchers note that their analysis suggests that repeated immunization even with the original spike protein antigen may be critical to countering highly mutated SARS-CoV-2 spike protein variants.

“This virus is a shape-shifter,” Abraham said. “The great structural flexibility we saw in the SARS-CoV-2 spike protein suggests that omicron is not likely to be the end of the story for this virus.”

University of Toronto study shows a nearly $1 million productivity boost for some manufacturers' predictive analytics investments

The predictive analytics industry is slated to earn more than $273 billion in 2022. Yet, despite the hype over big data and the forecasting power of tools such as statistical modeling and machine learning, not all firms that sink money into them reap benefits, prompting a research team to probe what makes the difference.

They found that significant and complementary investments in IT capital, an educated workforce, and high-efficiency manufacturing processes were “indispensable” to getting the most out of predictive tools that help firms optimize their performance. Among the 30,000 manufacturers surveyed in the 2015 study, companies with predictive analytics averaged about a $500,000 to $1 million revenue increase. Firms that did not make at least one of these mutually reinforcing investments, however, saw little to no benefit.

“These complements provide the organizational infrastructure to collect, analyze, and respond to predictions based on objective data,” explains Kristina McElheran, an assistant professor of strategic management at the University of Toronto Scarborough and UofT’s Rotman School of Management.  

“IT capital captures investments in data collection and computer hardware that can transmit, store, and analyze data, for example. Educated workers are known to be an essential ingredient for that system. And certain production environments provide richer data due to the processes they use.” 
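The complementarity the researchers describe is the kind of relationship an interaction term captures: the payoff to predictive analytics is larger when IT capital and an educated workforce are present at the same time. The sketch below illustrates the idea on synthetic data; it is not the Census survey data or the paper’s econometric specification.

```python
# Illustrative sketch of complementarity as an interaction effect, estimated
# on synthetic data (not the study's data or specification).
import numpy as np

rng = np.random.default_rng(0)
n = 5000
analytics = rng.integers(0, 2, n)    # plant uses predictive analytics
it_capital = rng.integers(0, 2, n)   # substantial IT capital investment
educated = rng.integers(0, 2, n)     # above-median workforce education

# Assumed data-generating process: analytics pays off mainly when the
# complementary investments are also in place (the triple interaction).
log_productivity = (0.05 * analytics + 0.10 * it_capital + 0.08 * educated
                    + 0.25 * analytics * it_capital * educated
                    + rng.normal(0.0, 0.5, n))

X = np.column_stack([np.ones(n), analytics, it_capital, educated,
                     analytics * it_capital * educated])
coef, *_ = np.linalg.lstsq(X, log_productivity, rcond=None)

for name, c in zip(["intercept", "analytics", "IT capital", "educated workforce",
                    "all three together"], coef):
    print(f"{name:>20s}: {c:+.3f}")
```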

Prof. McElheran and her co-authors worked with the U.S. Census Bureau to create a survey that was returned by a highly representative sample of U.S. manufacturing plants for the two survey years, 2010 and 2015. The survey asked about manufacturers’ use of predictive analytics, management practices, availability and use of data in decision-making, and design of their production processes. Results were cross-linked with related data such as company production inputs and outputs. Manufacturers were targeted because they tend to be early innovation adopters.

More than three-quarters of responding plants had adopted some form of predictive analytics by 2010, researchers found, although most firms used the tools only annually or monthly. Higher intensity of use was associated with greater productivity gains.

Government requirements for collecting environmental and safety data also helped to “nudge” some firms into adopting predictive analytics by pushing them to implement necessary infrastructure and train workers to use it. Companies nudged in this way ultimately displayed stronger performance in the researchers’ findings. 

It’s no secret in the management world that IT investments realize better returns when supported by educated workers, and vice versa. What the research shows is that some firms have not yet made that connection in the context of predictive analytics, says Prof. McElheran. 

“We found it puzzling,” she says. “More research is needed to understand the organizational or market frictions that are causing this apparent misalignment, one that is proving to be quite costly in the firms we observe.” 

This is the first study to examine the impact of predictive technologies on productivity in a large sample. The paper was co-written with Erik Brynjolfsson of Stanford University and Wang Jin of the MIT Initiative on the Digital Economy.