Virginia Tech's Deshmukh wins grant to study hybrid materials with aid of AI

Sanket Deshmukh, assistant professor of chemical engineering in the Virginia Tech College of Engineering, has received a National Science Foundation CAREER Award to study and develop two-dimensional, hybrid nanosheets that combine thermo-sensitive polymers and metals.

Sanket Deshmukh. Photo by Peter Means

Deshmukh sees wide-ranging potential in these nanosheets. As mechanically stable designs that can tune the distance between the nanoparticles within them, they could be used to make flexible electronics that don’t break when bent, he explained. And in the biomedical field, scientists could use them as nanofilters to isolate and study specific biomolecules.

Deshmukh will integrate emerging methods in machine learning and multiscale modeling into the development of nanosheets. Doing so puts his team in uncharted territory, he said, as these integrated computational techniques have been applied to several research areas within chemical engineering and materials science, but are relatively unexplored for soft materials like those he will develop.

“There is a lot of scope here,” Deshmukh said. “Methods based in machine learning can not only accelerate the development of accurate computer models of these hybrid materials but can also unravel a number of unique phenomena and properties at the molecular level that are inaccessible to conventional analysis methods.”

More broadly, Deshmukh’s lab studies existing materials and biomaterials at the molecular level and works from those insights to design new materials, either by giving materials new architectures or by integrating two different materials to create hybrids. There are countless changes to architectures or combinations that can be made when designing materials, Deshmukh explained, and computational predictions, produced with the aid of machine learning, can guide his lab’s choices.

“We develop and use these new approaches and tools to analyze our simulation trajectories of complex soft materials,” Deshmukh said. “You can use machine learning to predict new materials you might want to try creating. Instead of trying millions of different syntheses, you can narrow it down to hundreds.” 

Deshmukh’s NSF-funded research will enable his team to develop accurate and transferable coarse-grained models of thermo-sensitive polymers, metals, and solvents. The team will use these models in molecular dynamics simulations to study the materials’ self-assembly at the oil-water interface. Machine learning will help the researchers develop and validate the models with more robust and accurate predictions of interactions between the combined materials.
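
To make the idea concrete, the sketch below shows one way machine learning and optimization can assist coarse-grained model development: adjusting the parameters of a simple pair potential until the model's structure matches reference data. The potential form, the stand-in reference curve, and all numbers are illustrative assumptions, not Deshmukh's actual models or data.

```python
# Illustrative sketch only: fitting coarse-grained pair-potential parameters
# (epsilon, sigma) so that a toy model radial distribution function matches
# a stand-in "reference" curve. Not the models or data from the study.
import numpy as np
from scipy.optimize import minimize

r = np.linspace(0.3, 2.0, 200)                           # pair separation (reduced units)
g_ref = 1.0 + 0.8 * np.exp(-((r - 0.95) ** 2) / 0.02)    # hypothetical reference g(r)

def g_model(params, r):
    """Toy closed-form g(r) built from a Lennard-Jones-like potential."""
    eps, sigma = params
    u = 4.0 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)
    u = np.clip(u, -50.0, None)          # guard against overflow for extreme trial parameters
    return np.exp(-u)                    # low-density approximation g(r) ~ exp(-u/kT), kT = 1

def loss(params):
    """Squared error between model and reference structure."""
    return np.sum((g_model(params, r) - g_ref) ** 2)

result = minimize(loss, x0=[1.0, 1.0], method="Nelder-Mead")
print("fitted epsilon, sigma:", result.x)
```

In practice, coarse-grained force fields are fit against far richer targets (structural, thermodynamic, and conformational properties from atomistic simulations or experiments), but the optimization loop follows the same pattern.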

In the course of their work, the researchers aim to specifically exploit the unique properties of thermo-sensitive polymers. These polymers change their conformations, or spatial arrangements, with changes in temperature. A majority of thermo-sensitive polymers are insoluble in water at higher temperatures, which is unusual among existing materials, Deshmukh said. They come out of solution when heated, with their chains collapsing as the temperature rises and expanding as it falls.

“For some of the polymers, the temperature at which they show this change in conformation is close to human body temperature,” Deshmukh said. “That’s why the scientific community has been interested in these polymers. We want to develop transferable computational models of these polymers, using the same set of interaction parameters when we are performing these simulations below and above their transition temperature, to reproduce their conformations.”

The team’s supercomputer experiments will focus on exploiting the way thermo-sensitive polymers, through their own conformational changes, can reposition metal nanoparticles when the two are combined. As the system moves above or below the transition temperature, the researchers anticipate that the nanoparticles will move closer together or farther apart, respectively, following the polymers’ tendency to shrink at higher temperatures and expand at lower ones.
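
The expected qualitative behavior can be captured in a toy calculation: if the polymer collapses above its transition temperature and swells below it, the spacing between the metal nanoparticles it carries should follow suit. The sigmoidal form, transition temperature, and spacing values below are illustrative assumptions, not predictions from the study.

```python
# Toy illustration of the expected trend: nanoparticle spacing tracking the
# polymer's collapse/expansion around a transition temperature. The sigmoid
# form and all numbers are assumptions made for illustration only.
import numpy as np

T_transition = 305.0     # K, roughly near body temperature (illustrative)
d_expanded = 12.0        # nm, spacing with swollen (low-temperature) polymer chains
d_collapsed = 6.0        # nm, spacing with collapsed (high-temperature) polymer chains
width = 2.0              # K, sharpness of the transition

def nanoparticle_spacing(T):
    """Interpolate spacing between the expanded and collapsed limits."""
    f_collapsed = 1.0 / (1.0 + np.exp(-(T - T_transition) / width))
    return d_expanded + (d_collapsed - d_expanded) * f_collapsed

for T in (295, 300, 305, 310, 315):
    print(f"T = {T} K -> spacing ~ {nanoparticle_spacing(T):.1f} nm")
```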

Once the researchers understand how the nanoparticles behave at the molecular level, it becomes possible to form and fine-tune nanosheets with the desired nanoparticle packing — and to control their mechanical properties for different uses, Deshmukh said, like the creation of flexible electronics.

Deshmukh also foresees a biomedical application: using these hybrid materials as nanofilters. Because various biomolecules differ subtly in size, the polymer spacing could be tuned for selective filtration, enabling the nanosheets to act as a kind of “gate,” he explained. That could help scientists in the laboratory as they look to isolate and study certain biomolecules, like a single type of virus or protein.

The CAREER award is the National Science Foundation’s most prestigious award for early-career faculty with the potential to serve as academic role models in research and education and to lead advances in the mission of their organization, as stated by NSF. 

CAREER awardees must demonstrate efforts to integrate education and research and to conduct outreach. For the former, Deshmukh will focus on incorporating machine learning into the chemical engineering curriculum at the undergraduate and graduate levels, where he sees room for the knowledge of artificial intelligence to grow.

“In academia as well as industries, having such skills is important these days,” Deshmukh said. “I think the industry may prefer candidates who have this overall expertise.”

Stanford researchers develop a better way to track methane in the skies

When Stanford University graduate student Jeff Rutherford began his doctorate in 2018, the amount of methane entering the atmosphere from oil and gas extraction operations – mostly due to fracking – had become a major matter of contention. Tracking this harmful greenhouse gas falls to the Environmental Protection Agency.

A drone sniffs for methane leaks in Colorado. Credit: Sean Boggs/Environmental Defense Fund

To help in its accounting, the EPA uses computer models that take a “bottom-up” approach: counting the total number of wellheads, storage tanks, miles of pipeline, and other sources of methane, assigning an average annual release to each component, and totaling everything up. The agency calls the result an “inventory.”
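
A bottom-up inventory of this kind is, at its core, simple arithmetic: multiply the count of each component type by an assumed average emission factor and add the results. The hypothetical counts and emission factors in the sketch below are made up for illustration and are not EPA or study values.

```python
# Minimal sketch of a bottom-up inventory calculation: multiply each component
# count by an assumed average emission factor and sum. All numbers below are
# invented for illustration; they are not EPA or study values.
component_counts = {          # number of components in a hypothetical basin
    "wellheads": 1_000,
    "storage_tanks": 250,
    "pipeline_miles": 500,
}
emission_factors = {          # assumed average methane release, tonnes CH4 per component per year
    "wellheads": 0.8,
    "storage_tanks": 2.5,
    "pipeline_miles": 0.1,
}

total = sum(component_counts[c] * emission_factors[c] for c in component_counts)
print(f"Bottom-up inventory estimate: {total:.0f} tonnes CH4 per year")
```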

The only problem is that other organizations taking a “top-down” approach – using satellite imaging or atmospheric measurement to calculate the actual total methane emissions – were saying that the EPA was missing the mark by as much as half.

“Top-down approaches were finding total emissions double the EPA’s estimates, but the reason why was not clear,” Rutherford said of what motivated him and Adam Brandt, his advisor and a professor of energy resources engineering, to develop a new model. Their model attempts to bridge the gap between top-down and bottom-up approaches.

“If the emissions-based models that we use to make important climate-related decisions are not correct, it is a big problem,” said Brandt, who is also the director of Stanford’s Natural Gas Initiative.

Better data

Like the EPA, Brandt and Rutherford take a bottom-up approach, but they use the very latest component-level data to tabulate the amount of methane more accurately. The data Rutherford and Brandt use in their model have been gathered by directly sampling the components of the oil and gas industry where methane is most likely escaping — connectors, valves, and hatches on wellheads, storage tanks, and the like.

“We use a very similar approach as the EPA, but with different underlying data,” said Rutherford. “The EPA’s inventory and their modeling are actually very good – the best there is. It took me two years of digging through it to understand and try to build on it.”

Results from the new model closely approximate what the top-down modelers have been saying: current estimates are low. The Brandt and Rutherford model comes within the margin of error of the top-down inventories.

One major source of these missing emissions, Rutherford said, is liquid storage tanks. Some emissions are intentional – such as “flashing,” in which dissolved methane under pressure escapes when the pressure is reduced. “It’s like opening a beer,” Rutherford analogized. “It’s liquid as long as there is high enough pressure, but if you release the pressure, the gas quickly escapes.” But much is due to operator errors, such as when a technician accidentally leaves a hatch open or separation equipment malfunctions. The combination of the two leads to very high emissions from storage tanks, although storage is only one component among many where Rutherford and Brandt point the finger.

The upshot of their new methane inventory is twofold, Brandt said. The first is awareness. It highlights a key shortfall in the current modeling that is used to make important environmental decisions and spotlights specific activities that should be targeted for remediation or regulation. Second, he said, the goal is not to replace existing models but to provide a useful baseline tool upon which to base modifications to those models to make future inventories more accurate.

To that end, Rutherford has been making the rounds talking to state and federal regulators as well as oil and gas producers about the findings of the new model and how they can best make use of the lessons learned.

“It is helpful simply to identify that there is a problem,” Rutherford said. “But, beyond that, our model offers up some clear actionable steps to improve our inventories and ways operators can adjust their practices that could really make a difference in reducing the amount of methane entering the skies.”

Stanford-trained machine-learning algorithm on severe weather data leads to better predictions of flooding in the Midwest

A new machine learning approach helps scientists understand why extreme precipitation days in the Midwest are becoming more frequent. It could also help scientists better predict how these and other extreme weather events will change in the future.

Graph shows frequency of U.S. Midwest extreme precipitation days and average U.S. Midwest precipitation during extreme precipitation atmospheric patterns from 1981 to 2019. (Image credit: Adapted from Davenport and Diffenbaugh, Geophysical Research Letters 2021)

From lake-draining drought in California to bridge-breaking floods in China, extreme weather is wreaking havoc. Preparing for weather extremes in a changing climate remains a challenge, however, because their causes are complex and their response to global warming is often not well understood. Now, Stanford researchers have developed a machine learning tool to identify conditions for extreme precipitation events in the Midwest, which account for over half of all major U.S. flood disasters. Published in Geophysical Research Letters, their approach is one of the first examples using AI to analyze causes of long-term changes in extreme events and could help make projections of such events more accurate.

“We know that flooding has been getting worse,” said study lead author Frances Davenport, a Ph.D. student in Earth system science in Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth). “Our goal was to understand why extreme precipitation is increasing, which in turn could lead to better predictions about future flooding.”

Among other impacts, global warming is expected to drive heavier rain and snowfall by creating a warmer atmosphere that can hold more moisture. Scientists hypothesize that climate change may affect precipitation in other ways, too, such as changing when and where storms occur. Revealing these impacts has remained difficult, however, in part because global climate models do not necessarily have the spatial resolution to model these regional extreme events.

“This new approach to leveraging machine learning techniques is opening new avenues in our understanding of the underlying causes of changing extremes,” said study co-author Noah Diffenbaugh, the Kara J Foundation Professor in the School of Earth, Energy & Environmental Sciences. “That could enable communities and decision-makers to better prepare for high-impact events, such as those that are so extreme that they fall outside of our historical experience.”

Davenport and Diffenbaugh focused on the upper Mississippi watershed and the eastern part of the Missouri watershed. The highly flood-prone region, which spans parts of nine states, has seen extreme precipitation days and major floods become more frequent in recent decades. The researchers started by using publicly available climate data to calculate the number of extreme precipitation days in the region from 1981 to 2019. Then they trained a machine-learning algorithm designed for analyzing grid data, such as images, to identify large-scale atmospheric circulation patterns associated with extreme precipitation (above the 95th percentile).
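
A convolutional neural network is one standard algorithm for grid data such as daily atmospheric pressure fields, and the sketch below shows a minimal classifier of that kind, labeling each day as an extreme-precipitation day or not. The grid size, layer choices, and input variables are assumptions for illustration, not the architecture Davenport and Diffenbaugh used.

```python
# Hedged sketch of a small convolutional classifier for gridded circulation
# fields (e.g., daily pressure anomalies on a lat-lon grid), labeling each day
# as extreme-precipitation or not. Grid size, layers, and channels are
# illustrative assumptions, not the study's architecture.
import torch
import torch.nn as nn

class CirculationClassifier(nn.Module):
    def __init__(self, n_channels=1, grid_h=32, grid_w=48):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * (grid_h // 4) * (grid_w // 4), 1)

    def forward(self, x):
        x = self.features(x)
        x = x.flatten(start_dim=1)
        return self.classifier(x)   # raw logit; apply sigmoid for a probability

# One random "day" of gridded data, shape (batch, channels, lat, lon)
model = CirculationClassifier()
day = torch.randn(1, 1, 32, 48)
prob_extreme = torch.sigmoid(model(day))
print(f"Predicted probability of an extreme-precipitation day: {prob_extreme.item():.2f}")
```

In a real application the network would be trained on decades of labeled daily fields before its predictions mean anything; the sketch only shows the shape of the model.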

“The algorithm we use correctly identifies over 90 percent of the extreme precipitation days, which is higher than the performance of traditional statistical methods that we tested,” Davenport said.

The trained machine learning algorithm revealed that multiple factors are responsible for the recent increase in Midwest extreme precipitation. During the 21st century, the atmospheric pressure patterns that lead to extreme Midwest precipitation have become more frequent, increasing at a rate of about one additional day per year, although the researchers note that the changes are much weaker going back further in time to the 1980s.

However, the researchers found that when these atmospheric pressure patterns do occur, the amount of precipitation that results has clearly increased. As a result, days with these conditions are more likely to have extreme precipitation now than they did in the past. Davenport and Diffenbaugh also found that increases in the precipitation intensity on these days were associated with higher atmospheric moisture flow from the Gulf of Mexico into the Midwest, bringing the water necessary for heavy rainfall in the region.
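
The trend analysis described above can be mimicked in a few lines: count the days each year that carry the extreme-precipitation circulation pattern, then fit a linear trend to the annual counts. The counts generated below are synthetic stand-ins, not the study's data.

```python
# Minimal sketch of the trend analysis: count days per year flagged with the
# extreme-precipitation circulation pattern, then fit a linear trend.
# The annual counts here are synthetic stand-ins, not study data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1981, 2020)
# Synthetic annual counts with an upward drift, purely for illustration
pattern_days = rng.poisson(lam=20 + 0.5 * (years - 1981))

slope, intercept = np.polyfit(years, pattern_days, deg=1)
print(f"Trend: ~{slope:.2f} additional pattern days per year")
```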

The researchers hope to extend their approach to look at how these different factors will affect extreme precipitation in the future. They also envision redeploying the tool to focus on other regions and types of extreme events, and to analyze distinct extreme precipitation causes, such as weather fronts or tropical cyclones. These applications will help further parse climate change’s connections to extreme weather.

“While we focused on the Midwest initially, our approach can be applied to other regions and used to understand changes in extreme events more broadly,” said Davenport. “This will help society better prepare for the impacts of climate change.”