Dutch researchers produce the first image of hydrogen at the metal-to-metal hydride interface

University of Groningen physicists have visualized hydrogen at the titanium/titanium hydride interface using a transmission electron microscope. Using a new technique, they succeeded in visualizing both the metal and the hydrogen atoms in a single image, allowing them to test different theoretical models that describe the interface structure. The results were published on 31 January in the journal Science Advances.

To understand the properties of materials, it is often vital to observe their structure at atomic resolution. Visualizing atoms with a transmission electron microscope (TEM) is possible; however, until now, no one had succeeded in producing proper images of heavy atoms together with the lightest of all, hydrogen. This is exactly what University of Groningen Professor of Nanostructured Materials Bart Kooi and his colleagues have done. They used a new TEM with capabilities that made it possible to image both titanium and hydrogen atoms at the titanium/titanium hydride interface.

CAPTION The control room of the new TEM by Thermo Fisher Scientific at the University of Groningen, with Prof. Dr. Bart Kooi in the background.

Hydrogen atoms

The resulting images show how columns of hydrogen atoms fill the spaces between the titanium atoms, distorting the crystal structure. The hydrogen occupies half of the available spaces, as had been predicted earlier. 'In the 1980s, three different models were proposed for the position of hydrogen at the metal/metal hydride interface,' says Kooi. 'We were now able to see for ourselves which model was correct.'

To create the metal/metal hydride interface, Kooi and his colleagues started out with titanium crystals. Atomic hydrogen was then infused and penetrated the titanium in very thin wedges, forming tiny metal hydride crystals. 'In these wedges, the numbers of hydrogen and titanium atoms are the same,' Kooi explains. 'The penetration of hydrogen creates a high pressure inside the crystal. The very thin hydride plates cause hydrogen embrittlement in metals, for example inside nuclear reactors.' The pressure at the interface prevents the hydrogen from escaping.

Innovations

Producing images of the heavy titanium and the light hydrogen atoms at the interface was quite a challenge. First, the sample was loaded with hydrogen. It then had to be viewed in a specific orientation, looking along the interface. This was achieved by cutting properly aligned crystals from the titanium with an ion beam and then thinning the samples, again with an ion beam, to a thickness of no more than 50 nm.

The visualization of both titanium and hydrogen atoms was made possible by several innovations included in the TEM. Heavy atoms can be visualized through the way they scatter the electrons in the microscope beam; these scattered electrons are best detected with high-angle detectors. 'Hydrogen is too light to cause this scattering, so for these atoms, we have to rely on constructing the image from low-angle scattering, which includes electron waves.' However, the material causes interference of these waves, which has so far made identifying hydrogen atoms almost impossible.

Supercomputer simulations

The waves are detected by a low-angle bright-field detector. The new microscope has a circular bright-field detector that is divided into four segments. By analyzing differences in the wavefronts detected in opposing segments, and tracking the changes that occur as the scanning beam crosses the material, it is possible to filter out the interference and visualize the very light hydrogen atoms.

'The first requirement is to have a microscope that can scan with an electron beam that is smaller than the distance between the atoms. It is subsequently the combination of the segmented bright-field detector and the analytical software that makes visualization possible,' explains Kooi, who worked in close collaboration with scientists from the microscope's manufacturer, Thermo Fisher Scientific, two of whom are co-authors on the paper. Kooi's group added various noise filters to the software and tested them. They also performed extensive supercomputer simulations, against which they compared the experimental images.

Nanomaterials

The study shows the interaction between the hydrogen and the metal, which is useful knowledge for the study of materials capable of storing hydrogen. 'Metal hydrides can store more hydrogen per volume than liquid hydrogen.' Furthermore, the techniques used to visualize the hydrogen could also be applied to other light atoms, such as oxygen, nitrogen or boron, which are important in many nanomaterials. 'Being able to see light atoms next to heavy ones opens up all kinds of opportunities.'
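The volumetric-storage claim can be checked with back-of-the-envelope numbers. The densities below are textbook approximations, not figures from the paper: titanium hydride (TiH2) at roughly 3.75 g/cm³ carries about 4% hydrogen by mass, which already exceeds the density of liquefied hydrogen.

```python
M_TI = 47.87          # molar mass of titanium, g/mol
M_H = 1.008           # molar mass of hydrogen, g/mol
RHO_TIH2 = 3.75       # approximate density of TiH2, g/cm^3
RHO_LIQUID_H2 = 70.8  # density of liquid hydrogen at ~20 K, kg/m^3

# Mass fraction of hydrogen in TiH2, then the mass of hydrogen
# packed into one cubic meter of the hydride.
h_fraction = 2 * M_H / (M_TI + 2 * M_H)   # ~0.040
h_per_m3 = RHO_TIH2 * h_fraction * 1000   # ~150 kg/m^3

# The hydride stores roughly twice as much hydrogen per volume
# as liquid hydrogen itself.
ratio = h_per_m3 / RHO_LIQUID_H2          # ~2.1
```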

Ultra-high energy events key to study of ghost particles

With 'Zee burst,' physicists propose new resonance beyond the standard model

Physicists at Washington University in St. Louis have proposed a way to use data from ultra-high energy neutrinos to study interactions beyond the standard model of particle physics. The 'Zee burst' model leverages new data from large neutrino telescopes such as the IceCube Neutrino Observatory in Antarctica and its future extensions.

"Neutrinos continue to intrigue us and stretch our imagination. These 'ghost particles' are the least understood in the standard model, but they hold the key to what lies beyond," said Bhupal Dev, assistant professor of physics in Arts & Sciences and author of a new study in Physical Review Letters.

"So far, all nonstandard interaction studies at IceCube have focused only on the low-energy atmospheric neutrino data," said Dev, who is part of Washington University's McDonnell Center for the Space Sciences. "The 'Zee burst' mechanism provides a new tool to probe nonstandard interactions using the ultra-high energy neutrinos at IceCube."

CAPTION Physicists in Arts & Sciences have proposed a new way to leverage data from large neutrino telescopes such as the IceCube Neutrino Observatory in Antarctica. CREDIT Felipe Pedreros/IceCube and National Science Foundation

Ultra-high energy events

Since the discovery of neutrino oscillations two decades ago, which earned the 2015 Nobel Prize in physics, scientists have made significant progress in understanding neutrino properties -- but a lot of questions remain unanswered.

For example, the fact that neutrinos have such a tiny mass already requires scientists to consider theories beyond the standard model. In such theories, "neutrinos could have new nonstandard interactions with matter as they propagate through it, which will crucially affect future precision measurements," Dev said.

In 2012, the IceCube collaboration reported the first observation of ultra-high energy neutrinos from extraterrestrial sources, which opened a new window to study neutrino properties at the highest possible energies. Since that discovery, IceCube has reported about 100 such ultra-high energy neutrino events.

"We immediately realized that this could give us a new way to look for exotic particles, like supersymmetric partners and heavy decaying dark matter," Dev said. For several years, he had been looking for ways to find signals of new physics at different energy scales and had co-authored half a dozen papers exploring the possibilities.

"The common strategy I followed in all these works was to look for anomalous features in the observed event spectrum, which could then be interpreted as a possible sign of new physics," he said.

The most spectacular feature would be a resonance: what physicists observe as a dramatic enhancement of events in a narrow energy window. Dev devoted his time to thinking about new scenarios that could give rise to such a resonance feature. That is where the idea for the current work came from.

In the standard model, ultra-high energy neutrinos can produce a W-boson at resonance. This process, known as the Glashow resonance, has already been seen at IceCube, according to preliminary results presented at the Neutrino 2018 conference.
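The energy at which such a resonance sits follows from simple kinematics: a neutrino striking an electron at rest produces a particle of mass m on resonance when the neutrino energy equals m²/(2mₑ). The sketch below uses standard particle masses and, as an illustration, a hypothetical charged scalar of 100 proton masses; it is a back-of-the-envelope check, not the authors' calculation.

```python
M_E = 0.511e6   # electron mass, eV
M_W = 80.38e9   # W boson mass, eV
M_P = 0.938e9   # proton mass, eV

def resonance_energy_ev(m_x):
    """Neutrino energy (eV) at which a particle of mass m_x (eV) is
    produced on resonance off an electron at rest: E = m_x^2 / (2 m_e)."""
    return m_x**2 / (2 * M_E)

glashow_pev = resonance_energy_ev(M_W) / 1e15        # ~6.3 PeV
scalar_pev = resonance_energy_ev(100 * M_P) / 1e15   # ~8.6 PeV
```

The ~6.3 PeV value is the classic Glashow resonance energy; a lighter or heavier new particle would shift its own resonance bump accordingly.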

"We propose that similar resonance features can be induced due to new light, charged particles, which provides a new way to probe nonstandard neutrino interactions," Dev said.

Bursting onto the neutrino scene

Dev and his co-author Kaladi Babu at Oklahoma State University considered the Zee model, a popular model of radiative neutrino mass generation, as a prototype for their study. This model allows for charged scalars to be as light as 100 times the proton mass.

"These light charged Zee-scalars could give rise to a Glashow-like resonance feature in the ultra-high energy neutrino event spectrum at the IceCube Neutrino Observatory," Dev said.

Because the new resonance involves charged scalars in the Zee model, they decided to call it the 'Zee burst.'

Yicong Sui at Washington University and Sudip Jana at Oklahoma State, both graduate students in physics and co-authors of this study, performed extensive event simulations and data analysis showing that it is possible to detect such a new resonance using IceCube data.

"We need an effective exposure time of at least four times the current exposure to be sensitive enough to detect the new resonance -- so that would be about 30 years with the current IceCube design, but only three years with IceCube-Gen2," Dev said, referring to the proposed next-generation extension of IceCube with 10 km³ of detector volume.
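Dev's estimate can be reproduced with simple scaling, under the assumption that the event rate grows linearly with instrumented volume. The accumulated-exposure figure is inferred from the quote (four times the current exposure corresponding to about 30 years), not stated directly in the article.

```python
CURRENT_VOLUME_KM3 = 1.0  # approximate instrumented volume of IceCube
GEN2_VOLUME_KM3 = 10.0    # proposed IceCube-Gen2 volume (from the article)

# "Four times the current exposure" equals "about 30 years" of running,
# which implies roughly 7.5 years of exposure accumulated so far.
accumulated_years = 30 / 4
needed_exposure = 4 * accumulated_years * CURRENT_VOLUME_KM3  # km^3 * yr

years_with_icecube = needed_exposure / CURRENT_VOLUME_KM3  # 30.0
years_with_gen2 = needed_exposure / GEN2_VOLUME_KM3        # 3.0
```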

"This is an effective way to look for the new charged scalars at IceCube, complementary to direct searches for these particles at the Large Hadron Collider."

FSU researchers use engineering, supercomputing, forestry to understand prescribed burns

In the effort to mitigate destructive wildfires, wildland managers often fight those uncontrolled fires with prescribed fire -- carefully controlled burns to safely eliminate the vegetation that piles up on forest floors and adds to potential fuel.

Prescribed fires are an important tool for managing fire-prone landscapes, but they come with a cost. Fire makes smoke, which carries tiny, unburnt particles through the air, lowering air quality and making breathing more difficult.

A $2.2 million Department of Defense grant will fund an FSU investigation into the dynamics of smoke from prescribed burns, giving land managers a better understanding of when and how to best use the technique.

"When we understand how plumes are affected by key controls, such as the ignition pattern, this is one way that fire managers will be able to engineer plumes that have a less significant impact on communities," said Assistant Professor of Scientific Computing Bryan Quaife.

CAPTION From left: Bryan Quaife, assistant professor of scientific computing; Rod Linn of Los Alamos National Laboratory; Neda Yaghoobian, assistant professor of mechanical engineering at the FAMU-FSU College of Engineering; and Kevin Hiers of the Tall Timbers Research Station & Land Conservancy. The team won a $2.2 million grant from the U.S. Department of Defense to study smoke plumes from prescribed fire. CREDIT Mark Wallheiser / FAMU-FSU College of Engineering

Understanding how smoke plumes develop and travel is an interdisciplinary problem. FSU researchers from the Department of Scientific Computing, the FAMU-FSU College of Engineering's Department of Mechanical Engineering and the Geophysical Fluid Dynamics Institute are partnering with the forest research station Tall Timbers, Los Alamos National Laboratory and others to understand the complexities of wildland fires.

Partnering with investigators who have fire management experience helps researchers take what they discover at an academic level and transition it into practical application. Existing knowledge about how fires burn informs their model. They refine it with new parameters -- such as the topography and distribution of the vegetation that acts as fuel in a burn plot, the way wind moves through the plot, the fuel moisture and the heat radiated from the fire -- and then collect data from an actual fire to build a more accurate model of how smoke plumes rise from a prescribed burn.

"We want our models to capture the true physics and our simulations to be as close to what really happens in the field as possible," said Neda Yaghoobian, an assistant professor of mechanical engineering at the FAMU-FSU College of Engineering. "This requires parameters that can take input from fire managers and other researchers to refine our model."

Fire managers using prescribed burning must balance competing interests when they decide when and where to use the tool. Smoke from fires lowers air quality. When a prescribed fire burns, managers must sometimes field complaints from people downwind, and they must be careful to ensure that smoke plumes don't carry embers that can create undesired spot fires.

But these deliberately set fires have benefits, including making uncontrolled wildfires less likely and removing invasive species. A small prescribed fire that creates little smoke is less likely to bother nearby residents, but it leaves a plot with a greater risk of wildfire in the future.

As more people live closer to woodlands, the stakes for controlling fires rise, and it becomes more challenging for fire managers to find a window to run a prescribed burn, Quaife said.

Many aspects of the procedure are out of the control of land managers, but one thing they can control is how they light the fire. For example, they can light a fire in a long line, or they can create several spot fires that burn toward each other. Researchers will examine how the different patterns of burning affect the smoke plume that is created, giving land managers a better understanding of where smoke might go.

One of the big challenges for this research is that a lot of the physics happens at very small scales. Researchers may be interested in how fire moves around a tree that is only a few inches across in a much larger plot of land. Multiply that by the many parameters they track, and the task becomes very complex. The most advanced fire simulations run on thousands of computer processors for several hours to predict a few minutes of a fire's behavior.

"From an operational point of view, that doesn't make sense," Quaife said. "Part of the work we are doing is hopefully to be able to not only develop better models but also to develop simplified models that can run in more of an operational setting, rather than something that requires a supercomputer. Obviously, when you're a prescribed fire manager, you cannot remotely log into the Los Alamos supercomputer to run a simulation and wait around for a half hour to figure out what's going to happen in the next 30 seconds."

Kevin Speer, director of the Geophysical Fluid Dynamics Institute at FSU, is contributing to this research.