Japanese-built AI can discover hidden physical laws in a variety of data

Researchers at Kobe University and Osaka University in Japan have successfully developed artificial intelligence technology that can extract hidden equations of motion from regular observational data and create models that are faithful to the laws of physics.

Figure: Diagram explaining the developed artificial intelligence technology.

This technology could enable us to discover the hidden equations of motion behind phenomena whose governing laws were previously considered beyond explanation. For example, it may become possible to use physics-based knowledge and simulations to examine ecosystem sustainability.

The research group consisted of Associate Professor YAGUCHI Takaharu and Ph.D. student CHEN Yuhan (Graduate School of System Informatics, Kobe University), and Associate Professor MATSUBARA Takashi (Graduate School of Engineering Science, Osaka University).

These research achievements were made public on December 6, 2021, and were presented at the Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS 2021), a prestigious meeting on artificial intelligence technologies. The paper was selected for the spotlight category, reserved for roughly the top 3% of submissions.

Main Points

  • Being able to model (formularize) physical phenomena using artificial intelligence could result in extremely precise, high-speed simulations.
  • In current AI-based methods, the data must first be transformed into a format that fits the assumed equation of motion. This makes it difficult to apply artificial intelligence to actual observational data for which the equations of motion are unknown.
  • The research group used geometry to develop artificial intelligence that can find the hidden equation of motion in the supplied observational data (regardless of its format) and model it accordingly.
  • In the future, it may be possible to discover the hidden physical laws behind phenomena that had previously been considered to be incompatible with Newton’s Laws, such as ecosystem changes.
  • This will enable us to carry out investigations and simulations related to these phenomena using the laws of physics, which could reveal previously unknown properties.

Research Background
Ordinarily, predictions of physical phenomena are carried out via simulations on supercomputers. These simulations use mathematical models based on the laws of physics; however, if the model is not highly reliable, then neither are the results. It is therefore essential to develop methods for producing highly reliable models from observational data of phenomena. Furthermore, in recent years the range of physics applications has expanded beyond its traditional domains, and it has been demonstrated that Newton's Laws can be applied elsewhere, for example as part of models of ecosystem change. In many such cases, however, a concrete equation of motion has not yet been identified.

Research Methodology
This research study developed a method for discovering novel equations of motion in observational data from phenomena to which Newton's Laws can be applied. Discovering equations of motion from data has been studied before; however, prior methods required the data to be in an appropriate format that fits an assumed special form of the equation of motion. In reality, it is often unclear what data format is best to use, making it difficult to apply such methods to real-world data.

In response to this, the researchers considered that the appropriate transformation of observational data is akin to coordinate transformation in geometry, thus resolving the issue by applying the geometric idea of coordinate transformation invariance found in physics. For this, it is necessary to illuminate the unknown geometric properties behind phenomena. The research team subsequently succeeded in developing AI that can find these geometric properties in data. If equations of motion can be extracted from data, then it will be possible to use these equations to create models and simulations that are faithful to physical laws.
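The payoff described above can be illustrated with a small sketch. This is not the team's code (their method uses neural networks and learned geometric structure); it only shows, for an assumed pendulum Hamiltonian, why a model that respects the Hamiltonian form yields simulations faithful to physical laws: a structure-preserving (symplectic) integrator keeps the energy nearly constant over long runs.

```python
import math

def leapfrog(q, p, grad_V, dt, steps):
    """Symplectic (leapfrog) integration of dq/dt = p, dp/dt = -dV/dq."""
    p -= 0.5 * dt * grad_V(q)          # initial half kick
    for _ in range(steps - 1):
        q += dt * p                    # drift
        p -= dt * grad_V(q)            # full kick
    q += dt * p                        # final drift
    p -= 0.5 * dt * grad_V(q)          # final half kick
    return q, p

# Pendulum (illustrative choice): H(q, p) = p**2 / 2 + (1 - cos q)
grad_V = lambda q: math.sin(q)
energy = lambda q, p: 0.5 * p * p + (1.0 - math.cos(q))

q0, p0 = 1.0, 0.0
q1, p1 = leapfrog(q0, p0, grad_V, dt=0.01, steps=10_000)
drift = abs(energy(q1, p1) - energy(q0, p0))  # stays tiny even after 10,000 steps
```

A generic integrator with the same step size would show steadily growing energy error; once an equation of motion is extracted in Hamiltonian form, this kind of fidelity comes for free.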

Physics simulations are carried out in a wide range of fields, including weather forecasting, drug discovery, building analyses, and car design, but they usually require extensive calculations. However, if AI can learn from the data of specific phenomena and construct small-scale models using the proposed method, then this will simplify and speed up calculations that are faithful to the laws of physics, contributing to the development of the aforementioned fields. Furthermore, the method can be applied to areas that at first glance appear unrelated to physics. If equations of motion can be extracted in such cases, it will become possible to carry out physics-based investigations and simulations even for phenomena that have been considered impossible to explain using physics. For example, it may be possible to find a hidden equation of motion in animal population data that shows the change in the number of individuals. This could be used to investigate ecosystem sustainability by applying the appropriate physical laws (e.g., the law of conservation of energy).

New supercomputer simulations identify widespread changes in climate variability under sustained anthropogenic forcing

There is growing public awareness that climate change will impact society not only through changes in mean temperatures and precipitation over the 21st century, but also in the occurrence of more pronounced extreme events, and more generally in natural variability in the Earth system.

Figure 1: Extreme precipitation days per decade due to greenhouse warming over the 21st century. The first step in deriving the pattern shown is identifying the once-in-ten-year events of maximum precipitation over 2000-2009 for the 100 simulations; this threshold is chosen as the lowest of the top 100 values of precipitation. In the second step, the number of days over 2090-2099 that exceed the threshold value is counted, so a value of 1 on the scale (units of days) means there is no change in the future, while a value of 6 indicates 5 additional days of extreme precipitation. Note that the color scale saturates at 12 days to emphasize the response over land, given the very large amplitude over the eastern equatorial Pacific.
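The two-step thresholding procedure described above is easy to sketch. The data below are synthetic random stand-ins for one grid cell of the 100-member ensemble, not actual CESM2 output; only the procedure itself follows the study's description:

```python
import random

random.seed(0)
N_RUNS, DAYS = 100, 3650  # 100 ensemble members, one decade of daily values

# Synthetic stand-in for model output: daily precipitation (mm/day) at one
# grid cell; the future decade is drawn from a slightly wetter distribution.
base   = [[random.expovariate(1 / 5.0) for _ in range(DAYS)] for _ in range(N_RUNS)]
future = [[random.expovariate(1 / 6.5) for _ in range(DAYS)] for _ in range(N_RUNS)]

# Step 1: the once-in-ten-year threshold is the lowest of the 100 largest
# values pooled across the 100 baseline (2000-2009) simulations.
pooled = sorted(v for run in base for v in run)
threshold = pooled[-100]

# Step 2: count days per run in the future decade (2090-2099) that exceed
# the threshold; the ensemble mean gives extreme days per future decade.
counts = [sum(v > threshold for v in run) for run in future]
mean_extreme_days = sum(counts) / N_RUNS
```

By construction the baseline decade yields about one extreme day per run per decade, so any ensemble mean well above 1 in the future decade signals more frequent extremes.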

Such changes could also have large impacts on vulnerable ecosystems in both terrestrial and marine habitats. A scientific exploration of projected future changes in climate and ecosystem variability is described in a new study published in the journal Earth System Dynamics, representing the result of a broad collaborative partnership between the IBS Center for Climate Physics (ICCP) at Pusan National University in South Korea and the Community Earth System Model (CESM) project at the National Center for Atmospheric Research (NCAR) in the US.

The team conducted a set of 100 global Earth system model simulations over 1850-2100, working with a “business-as-usual” scenario for relatively strong emissions of greenhouse gases over the 21st century. The runs were given different initial conditions, and by virtue of the butterfly effect, they were able to represent a broad envelope of possible climate states over 1850-2100, enabling sophisticated analyses of changes in the variability of the Earth system over time. The nominally one-degree (~100km) resolution of the model, in conjunction with the 100-member set of runs, represented an unprecedented set of technical challenges that needed to be met before advancing to the goal of assessing how climate variability is impacted by sustained anthropogenic changes to the climate system. “We met these challenges by using the IBS/ICCP supercomputer Aleph, one of Korea’s fastest supercomputers,” says Dr. Sun-Seon Lee from the ICCP, a co-author of the study who ran the simulations together with her NCAR colleague Dr. Nan Rosenbloom. For the project, approximately 80 million hours of supercomputer time were used, and approximately 5 Petabytes of disc space (approximately 5000 normal hard discs) were required for storage of the model output.

The main finding of the study is that the impact of climate change is apparent in nearly all aspects of climate variability, ranging from temperature and precipitation extremes over land to increased number of fires in California, to changes in bloom amplitude for phytoplankton in the North Atlantic Ocean. Each of these changes has an important impact on sustainable resource management. As an example, occurrences of extreme precipitation events over the 21st century (between 2000-2009 and 2090-2099, see Fig. 1) indicate that extremes are expected to become more commonplace over many regions. These projected changes in precipitation extremes are in fact representative of the omnipresence of changes in extremes in the future across a broad range of climate and ecosystem variables, which has important implications for future adaptation strategies.

“In addition to large-scale changes in extreme events, our study also identified large-scale changes in the structure of the seasonal cycle over the 21st century, showing an enhanced growing season length over the continental regions north of 50°N”, says Dr. Keith Rodgers from the ICCP, first author of the study and a co-lead of the CESM2 Large Ensemble Project. Largely due to mean state warming and ensuing changes in the timing of the retreat and advance of winter snow cover, by the end of the 21st century growing season length is projected to increase by three weeks.

Taken together, the supercomputer simulations reveal that across our planet we can expect widespread changes in climate variability, ranging in timescales from synoptic storms to seasons to that of El Niño to decades. Dr. Gokhan Danabasoglu, a co-author of the study and a co-lead of the project, says “an important step moving forward will be to identify more fully the potential societal impacts and to communicate the implications for adaptation strategies.” This broader study has already motivated a number of more specialized scientific investigations using the tremendous volume of output from the simulations, spanning topics from marine ecosystem impacts to hydrological changes that affect water supply.

Virginia Tech research makes waves tackling the future of tsunami modeling

The coastal zone is home to over a billion people. Rising sea levels are already impacting coastal residents and aggravating existing coastal hazards, such as flooding during high tides and storm surges.

However, new research by Assistant Professor Tina Dura and Professor Robert Weiss in the College of Science's Department of Geosciences indicates that future sea-level rise will also have impacts on the heights of future tsunamis.

Figure: (a) Map of Alaska showing the sections of the Alaska-Aleutian subduction zone, earthquake boundaries, and approximate historical earthquake extents. (b) Light gray shading shows the U.S. Geological Survey Science Application for Risk Reduction scenario, a magnitude 9.1 Semidi-section earthquake. (c) Map of the ports of Los Angeles and Long Beach showing the locations of gauges that measure water levels at the ports and maximum nearshore tsunami heights. (d) Plot showing modeled earthquake magnitudes in the year 2000 with no tidal variability included (blue histogram), with tidal variability (green histogram), and the combined tsunami heights and tidal variability (red histogram).

“In 50 to 70 years, sea level is going to be significantly higher around the world,” said Dura, who is also an affiliate of the Center for Coastal Studies, an arm of the Fralin Life Sciences Institute. “If a tsunami strikes in that time frame, the impacts that you're estimating for today are going to be greater. I think that coastal geologists and modelers alike need to consider the sea-level rise in future models and hazards assessments.”

Around the so-called Ring of Fire, tectonic plates are colliding with the massive Pacific plate, resulting in seismic and volcanic activity. Because the Ring of Fire encircles the Pacific Ocean, large earthquakes on its boundaries produce regional tsunamis and also distant-source tsunamis that propagate across the Pacific Ocean and affect coastlines thousands of miles away.

Off the coast of Alaska, colliding tectonic plates create a 2,500-mile-long fault known as the Alaska-Aleutian subduction zone. Research shows that the subduction zone can produce distant-source tsunamis that strike the west coast of the United States, and in particular, Southern California.

In 2013, the United States Geological Survey initiated a Science Application for Risk Reduction project focused on a distant-source tsunami originating along the Alaska-Aleutian subduction zone and its impacts in California.

The project found that a magnitude 9.1 earthquake could produce a distant-source tsunami with an amplitude of 3.2 feet at the ports of Los Angeles and Long Beach, larger than any historical distant-source tsunami at the ports, causing losses of up to $4.2 billion.

However, due to rising sea levels, this tsunami scenario at the ports of Los Angeles and Long Beach will not be accurate in the long run.

Observations show that the world's temperatures are rising and sea levels are following suit. It's not a question of whether the sea level will continue to rise but by how much.

Dura and Weiss, along with colleagues from Rowan University, Rutgers University, Durham University, Nanyang Technological University, and the United States Geological Survey, joined forces to combine distant-source tsunami modeling with future sea-level rise projections to see how rising sea levels will influence tsunami heights in Southern California.

The group projected sea-level rise for the ports of Los Angeles and Long Beach based on scenarios that factor in both low and high estimates of greenhouse gas emissions and climate change mitigation strategies.

One scenario includes mitigation strategies to reduce greenhouse gas emissions, resulting in minimal temperature and sea-level rise. Another reflects a future with no mitigation efforts and high emissions, leading to faster-rising temperatures and higher sea levels.

The group found that today, a magnitude 9.1 earthquake can produce tsunami heights that exceed 3.2 feet at the ports. However, by 2100, under high-emissions sea-level rise projections, a much smaller magnitude 8 earthquake will be able to produce a tsunami that exceeds 3.2 feet. 

In other words, higher sea levels will make the ports more vulnerable to tsunamis produced by less powerful earthquakes. The results are especially concerning given the higher frequency of magnitude 8 earthquakes.
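The arithmetic behind this shift is simple: sea-level rise raises the baseline on which a tsunami arrives, so a smaller wave clears the same fixed height. The amplitudes below are hypothetical round numbers for illustration, not the study's modeled values:

```python
def exceeds_flood_level(tsunami_amp_ft, sea_level_rise_ft, threshold_ft=3.2):
    """Toy check: does a tsunami riding on a raised sea level exceed a
    fixed nearshore height threshold? (Illustrative; real modeling also
    accounts for tides and nonlinear nearshore effects.)"""
    return tsunami_amp_ft + sea_level_rise_ft > threshold_ft

# A hypothetical 1.5 ft tsunami from a smaller earthquake: below the
# threshold today, but above it once sea level has risen ~2 ft.
today  = exceeds_flood_level(1.5, 0.0)  # False: 1.5 ft < 3.2 ft
future = exceeds_flood_level(1.5, 2.0)  # True:  3.5 ft > 3.2 ft
```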

“A 9.1 is very, very rare,” said Dura. “So today, the chances of having a tsunami exceeding 3.2 feet at the ports is pretty small because a very rare, very large earthquake would be required. But in 2100, a magnitude 8, which happens around the Pacific Rim quite often, will be able to exceed the same tsunami heights due to higher sea levels.”

“This work really illustrates the potential for future tsunamis to become far more destructive as sea levels rise, especially if we fail to reduce future greenhouse gas emissions,” said co-author Andra Garner, who is an assistant professor studying sea-level rise at Rowan University. “The good news is that the work also illustrates our ability to minimize future hazards if we act to limit future warming and the amount by which future sea levels increase.”

But knowing about these potentially devastating tsunamis entails not just looking ahead, but looking back as well.

The United States Geological Survey Science Application for Risk Reduction project only considered an earthquake that occurred within the Semidi section of the Alaska-Aleutian subduction zone. But since that initial work, Dura and colleagues have published research that suggests other sections of the subduction zone should be considered as well.

The Semidi section and the adjacent Kodiak section of the subduction zone have produced historical earthquakes. In 1938, a magnitude 8.3 earthquake struck the Semidi section. In 1964, a magnitude 9.2 - the largest recorded earthquake to occur on the Alaska-Aleutian subduction zone - struck the Kodiak section and other sections to the east.

Because the earthquakes of 1938 and 1964 did not overlap, seismic hazard maps labeled the area between them as a "persistent earthquake boundary." In other words, the risk of the region's greatest, multi-section earthquakes was thought to be quite low.

“Although the 1964 earthquake rupture did not cross into the rupture area of the 1938 earthquake, it is unclear if this has been the case for earthquakes hundreds to thousands of years in the past. Should this be considered a persistent boundary between earthquakes, or can there be very large, multi-section earthquakes in this region? We wanted to find out,” said Dura.

To learn more about the seismic history of the Alaska-Aleutian subduction zone, Dura and colleagues used 5-centimeter cookie-cutter-like cylinders to collect core samples from wetlands that are peppered across the proposed earthquake boundary.

The group then analyzed the soil layers contained in the cores to identify instances of land-level change and tsunami inundation from past earthquakes. Through radiocarbon, cesium, and lead dating, the group was able to build a timeline of past large earthquakes in the region.

Their research showed that multiple large earthquakes had spanned the proposed earthquake boundary, which means that earthquakes that ruptured both the Semidi and Kodiak sections of the subduction zone had occurred multiple times in the past.

“Our geologic data shows that earthquakes can span the Semidi and Kodiak sections,” said Dura. “For this reason, we incorporated both single and multi-section earthquakes into our distant-source tsunami modeling for the ports. By including multi-section earthquakes in our modeling, we believe the range of tsunami heights we estimate for the ports is a step forward in our understanding of impacts of future tsunamis there.”

The group’s data will be included in hazard maps for southern Alaska to help improve future modeling scenarios for the Alaska-Aleutian subduction zone.

“Collaborations like ours that aim to integrate coastal geology, earthquake modeling, and future projections of sea level are crucial in developing a complete picture of future tsunami impacts at ports,” said Weiss, director of the Center for Coastal Studies. "Increasing interdisciplinary research capacity, meaning the integration of scientific fields with each other that follow different governing paradigms, will be the key to understanding the impacts that the changing Earth has on our well-being and prosperity. Building interdisciplinary research teams is difficult, and Virginia Tech’s Center for Coastal Studies fulfills a pivotal role in bringing such teams together. Fulfilling this team-building role not only enables studies such as ours but also helps Virginia Tech remain true to its motto, Ut Prosim (That I May Serve).”

In future projects, Dura, Weiss, and colleagues plan to incorporate distant-source tsunamis originating from other subduction zones around the Ring of Fire into their modeling of tsunami impacts on other coasts as well as the economic consequences of coastal inundation.

“With our new study, we provide an important framework for incorporating sea-level rise into distant-source tsunami modeling, and we’re excited to continue building on these initial results,” Dura said.

Pandemic protection investment must be maintained against the next COVID

A leading UK scientist in the fight against COVID-19 has issued a rallying cry for investment and support against a potentially more deadly and economically devastating pandemic to come.

Professor Dame Sarah Roberts – a co-creator of the Oxford-AstraZeneca coronavirus vaccine – even suggested that nations should rank health security alongside their defense and intelligence budgets.

Her words at Monday night's (December 6) 44th Richard Dimbleby Lecture struck a chord with many who have championed greater investment in healthcare and pandemic protection and preparedness, citing huge strides made since the COVID-19 outbreak.

Philips’ healthcare division, for instance, believes a decade of normal healthcare progress was made in just three months early in the pandemic.

But Dame Sarah warned that a future pandemic could be more contagious and lethal than COVID-19 and the world should not be complacent; the research pace that made earlier delivery of vaccines and other anti-virus measures must be maintained:

“We cannot allow a situation where we have gone through all we have gone through, and then find that the enormous economic losses we have sustained mean that there is still no funding for pandemic preparedness. The advances we have made, and the knowledge we have gained, must not be lost.”

She used the lecture to highlight the launch of a UK project to create a 100-day vaccine strategy against future pandemics; it is hunting for £3.5 billion of investment to enable pre-prepared vaccines and grow manufacturing capacity.

It aims to develop 100 prototype vaccines for the 25 viral families known to infect humans so that any new virus with pandemic potential could be met with a bespoke vaccine within 100 days.

Dame Sarah’s comments could not be timelier as COVID’s global death toll passes 5.25 million and case numbers reach 266 million, while the new omicron variant surges in numbers and global spread. As she observed: “This pandemic is not done with us.”

Paul Sheedy, Co-founder of the World Nano Foundation – a not-for-profit organization for commercializing nanoscale innovation – said: “It’s a rallying cry that governments and the investor community needed to hear, and from an eminent and well-qualified source.

“We have made so much progress, not only in fighting this pandemic, but others potentially lurking in the shadows, and we cannot afford to now sit back and allow pandemic protection and preparedness or overall healthcare to shrink back to pre-COVID levels of research and investment.

“Over 220 pathogens have emerged in the past 100 years with the potential to impact global healthcare, so we need universal vaccines and therapeutic solutions to stop these viruses finding hosts in the first place.”

This was echoed by Paul Stannard, Chairman of the innovative Luxembourg-based Vector Innovation Fund (VIF), which specializes in identifying and attracting investment into promising healthcare technology.

VIF launched with a Pandemic Protection Sub-Fund as well as a Future Healthcare Sub-Fund and Stannard said:

“We are seeing transformational innovations using nanomedicines as well as computational AI drug delivery.

“These can not only protect us from current and future COVID strains but also deliver a better future for life sciences and a move to a more decentralized point-of-care health model using precision medicines and early intervention diagnostics.

“We are staggered by the speed of innovation coming through our investment pipeline, and this can create a much safer and fairer distribution of healthcare while delivering a more sustainable and economical global health model.”

UNH researchers find future snowmelt could have costly consequences on infrastructure

Climate change and warmer conditions have altered snow-driven extremes, and previous studies predict less and slower snowmelt in the northern United States and Canada. However, mixed-phase precipitation—shifting between snow and rain—is increasing, especially at higher elevations, making it more challenging to predict future snowmelt, a dominant driver of severe flooding. Researchers at the University of New Hampshire took a closer look at previous studies and found that, because geographical regions respond differently to climate change, future snowmelt could vary greatly by the late 21st century. Snowmelt could decrease over the continental U.S. and southern Canada but increase in Alaska and northern Canada, resulting in greater flooding vulnerability and possibly major societal and economic consequences, including costly infrastructure failures.

“Estimation of future floods can be a tricky business and yet it is important information for those planning future infrastructure,” said Jennifer Jacobs, professor of civil and environmental engineering. “For instance, if a region primarily has floods occurring during the winter, then this work could really help build infrastructure that can handle those future conditions. And, if the floods are decreasing, then the design values should also decrease rather than over design.”

Their study, recently published in the journal Geophysical Research Letters, examined previous studies' predictions of changes in snowpack, snowmelt, and runoff to translate them into information useful to water resources managers, engineering designers, and the general public in Northern California, the Pacific Northwest, Alaska, and Canada. The researchers used historical maps and regional climate model (RCM) simulations focused on North America. They found that in West Coast mountain areas, such as Northern California and the Pacific Northwest, there could be a greater risk of rain-on-snow flooding because these areas are predicted to warm and receive more rain. This could increase the melting of any existing snowpack and lead to larger runoff potential, increasing flooding risk. The picture differed in extremely cold regions like Alaska and northern Canada, where the researchers found that warmer temperatures could increase atmospheric moisture, likely leading to more winter precipitation falling as snow.
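The rain-on-snow mechanism the study highlights can be sketched with a toy daily water balance. This is a textbook degree-day melt rule, not the RCM physics used in the study, and the degree-day factor below is a common illustrative value, not one from the paper:

```python
def rain_on_snow_runoff(temp_c, precip_mm, swe_mm, ddf=3.0, t_melt=0.0):
    """Toy daily water balance: precipitation falls as rain when temp > t_melt,
    as snow otherwise; melt follows a degree-day rule (ddf in mm per deg-day).
    swe_mm is the snowpack's snow water equivalent.
    Returns (runoff_mm, new_swe_mm)."""
    if temp_c > t_melt:
        melt = min(swe_mm, ddf * (temp_c - t_melt))
        return precip_mm + melt, swe_mm - melt  # rain plus melt runs off together
    return 0.0, swe_mm + precip_mm              # precipitation accumulates as snow

# A warm, rainy day on a 50 mm snowpack: rain and melt combine into runoff,
# which is why rain-on-snow events amplify flood risk.
runoff, swe = rain_on_snow_runoff(temp_c=5.0, precip_mm=20.0, swe_mm=50.0)
# runoff = 20 + min(50, 3*5) = 35 mm; remaining snowpack = 35 mm

# A cold day instead adds to the pack, deferring the water to later melt.
runoff_cold, swe_cold = rain_on_snow_runoff(temp_c=-3.0, precip_mm=10.0, swe_mm=50.0)
```

The contrast between the two calls mirrors the regional split in the study: warming regions see more of the first case, while very cold regions can accumulate more snow even as they warm.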

“These findings can be important in helping to develop or modify federal and state governments’ long-term policies for climate adaptation,” said Eunsang Cho, a former UNH doctoral student, now a postdoctoral researcher at NASA’s Goddard Space Flight Center, and lead author of the study. “For example, the current U.S. government standards for water-related infrastructure design are based on liquid precipitation data with very limited guidance on snow or snowmelt information.”

The researchers point out that certain infrastructure policies, like the relicensing of dams, depend on information about extreme weather conditions. This information can help engineers design infrastructure based not on past conditions but in anticipation of future ones. In their previous research, Jacobs and Cho created a map that accounts for snowmelt across the continental U.S., and they say this information is already being used by the state of California in its relicensing process.