Michigan Tech's Ghosh builds new models of cracking in brittle materials

The strength of teeth is told on the scale of millimeters. Porcelain smiles are apt: our teeth are much like ceramics -- except that while china plates shatter when smashed against each other, teeth don't, because they are full of defects.

Those defects are what inspired research led by Susanta Ghosh, assistant professor in the Department of Mechanical Engineering-Engineering Mechanics at Michigan Technological University. The work came out recently in the journal Mechanics of Materials. Along with a team of dedicated graduate students -- Upendra Yadav, Mark Coldren and Praveen Bulusu -- and fellow mechanical engineer Trisha Sain, Ghosh examined what's called the microarchitecture of brittle materials like glass and ceramics.

"Since the time of alchemists people have tried to create new materials," Ghosh said. "What they did was at the chemical level and we work at the microscale. Changing the geometries -- the microarchitecture -- of a material is a new paradigm and opens up many new possibilities because we're working with well-known materials."

Glass is one such material. Making stronger glass brings us back to teeth -- and seashells. On the micro level, the primary hard, brittle components of teeth and shells have weak interfaces, or defects, filled with soft polymers. As teeth gnash and shells bump, the soft spots cushion the hard plates, letting them slide past one another. Under further deformation, the plates interlock like hook-and-loop fasteners -- Velcro -- and carry huge loads. While chewing, though, no one could see the shape of a tooth change with the naked eye: the shifting microarchitecture happens on the scale of microns, and its interlocking structure rebounds until a sticky caramel or rogue popcorn kernel pushes the sliding plates to the breaking point.

Ghosh and his team use several models to study how cracks form in glass; this animation reveals how a crack can be contained within a softer defect until it reaches the breaking point in the more brittle material. Credit: Susanta Ghosh

That breaking point is what Ghosh studies. Researchers in the field have found experimentally that adding small defects to glass can increase the material's strength 200 times over. The soft defects slow down failure, guiding the propagation of cracks and increasing energy absorption in the brittle material.

"The failure process is irreversible and complicated because the architectures that trap the crack through a predetermined path can be curved and complex," Ghosh said. "The models we work with try to describe fracture propagation and the contact mechanics at the interface between two hard-brittle building blocks."

Ghosh's team developed two models. The first uses finite element modeling (FEM) and is detailed and highly accurate, but computationally expensive. The second, an analytical model, is surprisingly accurate -- though less so than FEM -- and far cheaper to run.

FEM is a numerical model that takes apart a complex whole by evaluating separate pieces -- called finite elements -- then puts everything back together again using the calculus of variations. Humpty Dumpty and all the king's men would have liked FEM, but it's no quick roadside trick. To run such complex calculations requires a supercomputer, like Superior at Michigan Tech, and ensuring that the right inputs get plugged in takes diligence, patience and a keen eye for coding detail. Using FEM for super strong glass means modeling all the possible interactions between the material's hard plates and soft spots. Analytical modeling offers an alternative.
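The idea of breaking a complex whole into finite elements and reassembling it can be seen in a toy problem. The sketch below is a minimal, illustrative 1D finite element solution for an elastic bar fixed at one end and pulled at the other -- far simpler than the fracture and contact models in the paper, and with material values chosen only for demonstration -- but it shows the core FEM loop: build element stiffness blocks, assemble them into a global matrix, apply boundary conditions and solve.

```python
import numpy as np

# Minimal 1D FEM sketch (illustrative only, not the paper's model): a bar of
# length L, stiffness E*A, fixed at x=0 and pulled with force F at x=L, split
# into n linear elements. Each element contributes a 2x2 stiffness block
# k = (E*A/h) * [[1, -1], [-1, 1]] to the global matrix.

def bar_displacements(E=70e9, A=1e-4, L=1.0, F=1000.0, n=10):
    h = L / n                        # element length
    k = E * A / h                    # element stiffness
    K = np.zeros((n + 1, n + 1))     # global stiffness matrix
    for e in range(n):               # assemble element contributions
        K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n + 1)
    f[-1] = F                        # point load at the free end
    # apply the fixed boundary condition u(0) = 0 by solving the reduced system
    u = np.zeros(n + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
    return u

u = bar_displacements()
# for this simple case the tip displacement matches the exact value F*L/(E*A)
```

A real microarchitecture model replaces this single line of elements with millions of 3D elements plus fracture and contact laws at every interface -- which is why Ghosh's FEM runs need a supercomputer like Superior.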

"We wanted a simple, approximate model to describe the material," Ghosh said, explaining the team used more basic math equations than the FEM calculations to outline and describe the shapes within the material and how they might interact. "Of course, an experiment is the ultimate test, but more efficient modeling helps us speed up the development process and save money by focusing on materials that work well in the models."

Both the FEM and analytical microarchitecture modeling from Ghosh's lab can help make ceramics, biomedical implants and the glass in buildings as tough as our teeth.

Abbott's AI helps show doctors if patients are having a heart attack

New research from Abbott, published in the journal Circulation, found that its algorithm could help doctors in hospital emergency rooms more accurately determine whether someone is having a heart attack, so patients can receive faster treatment or be safely discharged.

In this study, researchers from the U.S., Germany, the U.K., Switzerland, Australia and New Zealand looked at more than 11,000 patients to determine whether Abbott's technology, developed using artificial intelligence (AI), could provide a faster, more accurate determination of whether someone is having a heart attack. The study found that the algorithm gave doctors a more comprehensive analysis of the probability that a patient was having a heart attack, particularly for those who entered the hospital within the first three hours after their symptoms started.

"With machine learning technology, you can go from a one-size-fits-all approach for diagnosing heart attacks to an individualized and more precise risk assessment that looks at how all the variables interact at that moment in time," said Fred Apple, Ph.D., Hennepin HealthCare/ Hennepin County Medical Center, professor of Laboratory Medicine and Pathology at the University of Minnesota, and one of the study authors. "This could give doctors in the ER more personalized, timely and accurate information to determine if their patient is having a heart attack or not."

Removing the barriers for determining the presence of a heart attack

A team of physicians and statisticians at Abbott developed the algorithm using AI tools to analyze extensive data sets and identify the variables most predictive of a cardiac event, such as age, sex, a person's specific troponin levels (measured with a high-sensitivity troponin-I blood test) and blood sample timing.

Today, when a person enters the emergency room with symptoms of a heart attack, doctors often use a clinical assessment, an electrocardiogram (EKG) and troponin blood tests at set intervals to determine whether the patient is having a heart attack. The algorithm is designed to help address two barriers that doctors face today when they look for more individualized information to diagnose heart attacks:

  • International guidelines for using high-sensitivity troponin tests currently do not always account for personal factors, such as age and sex, which can affect test results. For instance, women may not produce as much troponin protein as men, and their heart attacks could go undiagnosed.
  • The guidelines also recommend that doctors carry out troponin testing at fixed times over a period of up to 12 hours. However, these time periods do not take a person's age or sex into consideration, putting patients into a one-size-fits-all algorithm rather than one that accounts for factors specific to each person.

The algorithm used in the study takes into consideration the patient's age, sex and the dynamics of the troponin blood test results over time. Researchers found that when this information is combined through the power of computation, the algorithm has the potential to give doctors more confidence in the results, helping them rule out a heart attack and safely discharge the person, or diagnose that a heart attack has occurred.
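Abbott's actual algorithm is proprietary and not described in this article, but the general idea of combining age, sex, troponin level and troponin change over time into a single probability can be sketched with a simple logistic model. Everything below -- the function name, the weights, the thresholds -- is invented for illustration and has no clinical validity.

```python
import math

# Hypothetical illustration only: one common way to combine several patient
# variables into a probability is a logistic model. All weights here are
# made up for demonstration; they are NOT Abbott's algorithm or clinical values.

def heart_attack_probability(age, is_female, troponin_ng_l,
                             delta_troponin, hours_between_draws):
    rate = delta_troponin / max(hours_between_draws, 0.5)  # troponin rise per hour
    score = (-6.0
             + 0.03 * age
             - 0.4 * (1 if is_female else 0)   # sex-specific baseline (illustrative)
             + 0.05 * troponin_ng_l            # absolute troponin level
             + 0.8 * rate)                     # a rapid rise weighs heavily
    return 1.0 / (1.0 + math.exp(-score))      # logistic link -> probability in (0, 1)

# A stable, low troponin yields a low probability; a rapid rise yields a high one.
low = heart_attack_probability(55, True, 5.0, 0.5, 2.0)
high = heart_attack_probability(55, True, 80.0, 60.0, 2.0)
```

The point of the sketch is the structure, not the numbers: instead of one fixed troponin cutoff for everyone, each variable shifts the score, so two patients with the same troponin value can receive different probabilities.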

"As doctors are bombarded with data and information, this new algorithm takes several of these variables and uses computational power to more accurately provide a probability of that person having a heart attack," said Agim Beshiri, M.D., one of the inventors of the algorithm and senior medical director, global medical and scientific affairs, Diagnostics, Abbott. "In the future, you could imagine using this technology to develop algorithms that help doctors not only better determine if their patient is having a heart attack or not, but potentially before a heart attack occurs."

NASA Goddard scientist shows climate change's impact on fires worldwide

"Hot and dry" are the watchwords for large fires. In just seconds, a spark in hot and dry conditions can set off an inferno consuming thick, dried-out vegetation and almost everything else in its path. While every fire needs a spark to ignite and fuel to burn, hot and dry conditions in the atmosphere play a significant role in determining the likelihood of a fire starting, its intensity and the speed at which it spreads. Over the past several decades, as the world has increasingly warmed, so has its potential to burn.

Since 1880, the world has warmed by 1.9 degrees Fahrenheit, with the five warmest years on record occurring in the last five years. Since the 1980s, the wildfire season has lengthened across a quarter of the world's vegetated surface, and in some places, like California, fire has become a nearly year-round risk. 2018 was California's worst wildfire season on record, on the heels of a devastating 2017 fire season. In 2019, wildfires have already burned 2.5 million acres in Alaska in an extreme fire season driven by high temperatures, which have also led to massive fires in Siberia.

Whether started naturally or by people, fires worldwide and the resulting smoke emissions and burned areas have been observed by NASA satellites from space for two decades. Combined with data collected and analyzed by scientists and forest managers on the ground, researchers at NASA, other U.S. agencies and universities are beginning to draw into focus the interplay between fires, climate and humans.

"Our ability to track fires in a concerted way over the last 20 years with satellite data has captured large-scale trends, such as increased fire activity, consistent with a warming climate in places like the western U.S., Canada and other parts of Northern Hemisphere forests where fuels are abundant," said Doug Morton, chief of the Biospheric Sciences Laboratory at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Where warming and drying climate has increased the risk of fires, we've seen an increase in burning."

This visualization shows carbon emissions from fires from Jan. 1, 2003, through Dec. 31, 2018. The color bar reflects the quantity of carbon emitted.

A Hotter, Drier World

High temperatures and low humidity are two essential factors behind the rise in fire risk and activity, affecting fire behavior from ignition to spread. Even before a fire starts, they set the stage, said Jim Randerson, an Earth system scientist at the University of California, Irvine, who studies fires both in the field and with satellite data.

He and his colleagues studied the abundance of lightning strikes in the 2015 Alaskan fire season, which burned a record 5.1 million acres. Lightning strikes are the main natural cause of fires. The researchers found an unusually high number of strikes that year, generated by warmer temperatures that drive the atmosphere to produce more convective systems -- thunderstorms -- which ultimately contributed to more burned area.

Hotter and drier conditions also set the stage for human-ignited fires. "In the Western U.S., people are accidentally igniting fires all the time," Randerson said. "But when we have a period of extreme weather, high temperatures, low humidity, then it's more likely that typical outdoor activity might lead to an accidental fire that quickly gets out of control and becomes a large wildfire."

For example, in 2018, sparks from hammering a concrete stake into the ground in 100-degree Fahrenheit heat and sparks from a car's tire rim scraping the asphalt after a flat caused California's devastating Ranch and Carr Fires, respectively. The sparks quickly ignited vegetation that the same extreme heat and low humidity had dried out and made extremely flammable -- conditions that research shows also contribute to a fire's rapid and uncontrollable spread, said Randerson. The same conditions make it more likely for agricultural fires to get out of control.

A warming world also has another consequence that may be contributing to fire conditions persisting over multiple days where they otherwise might not have in the past: higher nighttime temperatures.

"Warmer nighttime temperatures allow fires to burn through the night and burn more intensely, and that allows fires to spread over multiple days where previously, cooler nighttime temperatures might have weakened or extinguished the fire after only one day," Morton said.

Climate Systems at Work

Hot and dry conditions that precede fires can be tempered by rain and moisture circulating in the atmosphere. On time scales of months to years, broader climate patterns move moisture and heat around the planet. Monitoring these systems with satellite observations lets researchers begin to develop supercomputer models for predicting whether an upcoming fire season in a given region will be light, average or extreme. The most important of these indicators are sea surface temperatures in the Pacific Ocean, which govern the El Niño-Southern Oscillation (ENSO).

"ENSO is a major driver of fire activity across multiple continents," said Randerson, who along with Morton and other researchers has studied the relationship between El Niño events and fire seasons in South America, Central America, parts of North America, Indonesia, Southeast Asia and equatorial Asia. "The precipitation both before the fire season and during the fire season can be predicted using sea surface temperatures that are measured by NASA and NOAA satellites."
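The statistical backbone of this kind of seasonal prediction is simple to illustrate: relate a sea surface temperature anomaly index to later fire-season severity and measure the strength of the link. The sketch below uses entirely synthetic data (the coefficients and noise levels are invented, and real forecasts use far more sophisticated models), but it shows the basic step of recovering an SST-fire relationship with a Pearson correlation.

```python
import math
import random

# Illustrative sketch only: generate 30 synthetic "years" in which warm SST
# anomalies (an ENSO-like index) drive larger burned area, plus noise, then
# recover the relationship with a Pearson correlation. All numbers are made up.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(42)
sst_anomaly = [random.gauss(0.0, 1.0) for _ in range(30)]   # synthetic index values
burned_area = [2.0 + 1.5 * s + random.gauss(0.0, 0.5)       # warm years burn more
               for s in sst_anomaly]

r = pearson(sst_anomaly, burned_area)   # strong positive correlation expected
```

In practice, researchers fit relationships like this between observed SST indices and pre-season precipitation or burned-area records, then use forecast SSTs to anticipate the coming season.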

An ongoing project, Randerson said, is to extend that prediction capability globally to regions affected by other ocean-climate temperature changes and indicators.

The Human Factor

In studying the long-term trends of fires, human land management is as important to consider as any other factor. Globally, someplace on Earth is always on fire -- and most of those fires are set by people, either accidentally in wildlands, or on purpose, for example, to clear land or burn agricultural fields after the harvest to remove crop residues.

But not all fires behave the same way. Their behavior depends on the fuel type and on how people are changing the landscape. While fire activity has gotten worse in northern-latitude forests, research by Randerson and Morton has shown that despite climate conditions that favor fires, the number of fires in grassland and savanna ecosystems worldwide is declining, contributing to an overall decline in global burned area. The decline stems from an increased human presence creating new cropland and roads, which serve as firebreaks and motivate the local population to fight these smaller fires, said Morton.

"Humans and climate together are really the dual factors that are shaping the fires around the world. It's not one or the other," Randerson said.

Fire Feedbacks

Fires impact humans and climate in return. For people, beyond the immediate loss of life and property, smoke is a serious health hazard when small soot particles enter the lungs. Long-term exposure has been linked to higher rates of respiratory and heart problems. Smoke plumes can travel for thousands of miles, affecting air quality for people far downwind of the original fire. Fires also threaten local water quality, and the loss of vegetation can lead to erosion and mudslides afterward, which have been particularly bad in California, Randerson said.

For the climate, fires can directly and indirectly increase carbon emissions to the atmosphere. While they burn, fires release carbon stored in trees or in the soil. In some places, like California or Alaska, additional carbon may be released as dead trees decompose, a process that can take decades because dead trees stand like ghosts in the forest, decaying slowly, said Morton. In addition to releasing carbon as they decompose, the dead trees no longer act as a carbon sink, pulling carbon dioxide out of the atmosphere. In some areas, like Indonesia, Randerson and his colleagues have found that the radiocarbon age of carbon emitted by peat fires is about 800 years; that carbon adds to the greenhouse gases in the atmosphere that drive global warming. In Arctic and boreal forest ecosystems, fires burn organic carbon stored in the soils and hasten the melting of permafrost, which releases methane, another greenhouse gas, when it thaws.

Another area of active research is the mixed effect that particulates, or aerosols, lofted by fires have on regional climates, Randerson said. Dark aerosols such as soot, often called black carbon, absorb heat from sunlight while in the air; when they land on snow and darken it, they accelerate its melt, which affects both local temperatures -- raising them, since snow reflects sunlight away -- and the water cycle. Other aerosol particles are light-colored, reflecting sunlight and potentially having a cooling effect while they remain in the atmosphere. Whether dark or light, according to Randerson, aerosols from fires may also affect clouds in the tropics, making it harder for water droplets to form and thus reducing rainfall -- and increasing drying.

Fires of all types reshape the landscape and the atmosphere in ways that can resonate for decades. Understanding both their immediate and long-term effects requires long-term global data sets that follow fires from their detection to mapping the scale of their burned area, to tracing smoke through the atmosphere and monitoring changes to rainfall patterns.

"As climate warms, we have an increasing frequency of extreme events. It's critical to monitor and understand extreme fires using satellite data so that we have the tools to successfully manage them in a warmer world," Randerson said.