An artist's impression of the ultra-sensitive spin detection device. Image: Tony Melov

Australian engineer Jarryd Pla invents an ultra-sensitive spin-measuring device

This new spin-measuring device could help scientists, particularly in chemistry and biology, better understand the structure and purpose of materials.

In a paper published over the weekend in the journal Science Advances, Associate Professor Jarryd Pla and his team from the UNSW School of Electrical Engineering and Telecommunications, together with colleague Scientia Professor Andrea Morello, described a new device that can measure the spins in materials with high precision. The University of New South Wales, also known as UNSW Sydney, is a public research university based in Sydney, New South Wales, Australia. It is one of the founding members of the Group of Eight, a coalition of Australian research-intensive universities.

“The spin of an electron – whether it points up or down – is a fundamental property of nature,” says A/Prof. Pla. “It is used in magnetic hard disks to store information, MRI machines use the spins of water molecules to create images of the inside of our bodies, and spins are even being used to build quantum supercomputers.

“Being able to detect the spins inside materials is therefore important for a whole range of applications, particularly in chemistry and biology where it can be used to understand the structure and purpose of materials, allowing us to design better chemicals, drugs, and so on.”

In fields of research such as chemistry, biology, physics, and medicine, the tool used to measure spins is called a spin resonance spectrometer. Commercially produced spectrometers normally require billions to trillions of spins to get an accurate reading, but A/Prof. Pla and his colleagues were able to measure ensembles on the order of thousands of electron spins, meaning the new tool is about a million times more sensitive.
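
The factor of a million follows directly from the spin counts quoted above. A trivial sanity check (using the lower, billion-spin end of the commercial range):

```python
# Order-of-magnitude check of the sensitivity gain quoted in the article.
commercial_spins = 1e9   # billions to trillions of spins needed commercially
unsw_spins = 1e3         # thousands of spins measured by the new device

print(f"Improvement: ~{commercial_spins / unsw_spins:,.0f}x")  # ~1,000,000x
```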

This is quite a feat: a whole range of systems, such as microscopic samples, two-dimensional materials, and high-quality solar cells, simply have too few spins to create a measurable signal and so cannot be measured with commercial tools.

The breakthrough came about almost by chance, as the team was developing a quantum memory element for a superconducting quantum computer. The objective of the memory element was to transfer quantum information from a superconducting electrical circuit to an ensemble of spins placed beneath the circuit.

“We noticed that while the device didn’t quite work as planned as a memory element, it was extremely good at measuring the spin ensemble,” says Wyatt Vine, a lead author on the study. “We found that by sending microwave power into the device as the spins emitted their signals, we could amplify the signals before they left the device. What’s more, this amplification could be performed with very little added noise, almost reaching the limit set by quantum mechanics.”

While other highly sensitive spectrometers using superconducting circuits had been developed in the past, they required multiple components, were incompatible with magnetic fields, and had to be operated in very cold environments using expensive “dilution refrigerators”, which reach temperatures down to 0.01 Kelvin.

In this new development, A/Prof. Pla says he and the team managed to integrate the components on a single chip.

“Our new technology integrates several important parts of the spectrometer into one device and is compatible with relatively large magnetic fields. This is important since, to measure the spins, they need to be placed in a field of about 0.5 Tesla, which is ten thousand times stronger than the Earth’s magnetic field.

“Further, our device operated at a temperature more than 10 times higher than previous demonstrations, meaning we don’t need to use a dilution refrigerator.”
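
Both quoted figures check out on the back of an envelope; the Earth-field value of roughly 50 microtesla used below is our assumption, not from the article:

```python
# Sanity check of the quoted field and temperature figures.
earth_field = 50e-6      # tesla; assumed typical surface field of the Earth
device_field = 0.5       # tesla, from the article
print(f"Field ratio: ~{device_field / earth_field:,.0f}x")    # ~10,000x

dilution_fridge = 0.01   # kelvin, from the article
print(f"Operating temperature (>10x higher): > {dilution_fridge * 10} K")
```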

A/Prof. Pla says the UNSW team has patented the technology with a view to potentially commercializing it, but stresses that there is still work to be done.

“There is potential to package this thing up and commercialize it, which would allow other researchers to plug it into their existing commercial systems to give them a sensitivity gain.

“If this new technology were properly developed, it could help chemists, biologists, and medical researchers who currently rely on tools made by large tech companies that work, but which could be orders of magnitude better.”

Local stress in the bridge structure just as lateral displacements get drastically large (deformations have been magnified five times). CREDIT Tokyo Metropolitan University

Japanese researchers use pushover analyses with nonlinear FEM to study damage to bridges during earthquakes

Detailed model highlights how important girder end design is for improving resilience

Researchers from Tokyo Metropolitan University have carried out a detailed simulation showing how a common type of bridge fails during large-scale earthquakes. They modeled “I-shaped girder” bridges, looking at the step-by-step mechanism by which they yield and deform under lateral forces, starting at the ends. Reinforcing ribs were shown to be effective against lateral forces and improve load-bearing capacity. Their work points bridge engineers to rational design strategies to make more resilient infrastructure.

Major earthquakes can have a devastating impact on infrastructure. The effects of a severely damaged bridge, for example, are not limited to the tragedy that befalls people on it, but extend to how the loss of access affects emergency services, evacuation efforts, and the transport of crucial supplies. Understanding how seismic activity impacts common bridge structures is therefore crucial, not only to build bridges that can withstand strong quakes, but also to prevent the failure of existing ones through effective reinforcement. Though numerical models exist to assess the resilience of bridge superstructures, there are very few examples that examine how each part of the whole bridge structure behaves during large-scale earthquakes.

As different parts of the structure yield, displacement in the lateral direction grows more quickly with increased force. CREDIT Tokyo Metropolitan University

A team led by Professor Jun Murakoshi of Tokyo Metropolitan University has been studying detailed models which accurately reflect the actual behavior of entire structures, with a focus on how they might inform new design strategies. They looked at the failure process and the impact on load-bearing capacity caused by lateral shaking of an “I-shaped girder” bridge, a common bridge type with a span length of 30 m; steel girders with a cross-section shaped like a capital “I” carry a flat “deck slab” over which cars and people pass. They subjected their model bridge to the lateral forces commonly seen during quakes, considering the response when the force was applied in the longitudinal and transverse directions relative to the girders.
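
The pushover idea itself can be sketched in a few lines. The toy model below, a single lateral degree of freedom with an elastic-perfectly-plastic spring standing in for the girder-end components, is purely illustrative (the stiffness and yield values are assumptions, and it is in no way the team's detailed nonlinear FEM). It shows the qualitative behaviour the simulations trace component by component: once the end components yield, lateral displacement grows with almost no additional resistance.

```python
# Toy pushover curve: elastic-perfectly-plastic lateral response.
# All numbers are illustrative assumptions, not from the study.
import numpy as np

k = 5.0e7    # elastic lateral stiffness of the girder end, N/m (assumed)
f_y = 2.0e6  # lateral force at which the end components yield, N (assumed)

u = np.linspace(0.0, 0.20, 5)   # imposed lateral displacements, m
f = np.minimum(k * u, f_y)      # force is capped once yielding starts

# Past the yield point the resisting force plateaus, so displacement
# increases rapidly for very little extra load.
for ui, fi in zip(u, f):
    print(f"u = {ui:0.3f} m -> F = {fi / 1e6:0.2f} MN")
```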

The model revealed a detailed picture of how the bridge yields and deforms. For example, when the force was applied in the transverse direction, the first place to be affected was the lower part of the vertical stiffeners on the support, followed by the yielding of the diagonal members of the end cross frame. The vertical stiffeners then go on to yield until finally the gusset plate (a steel plate that connects lateral members) starts to deform. Though this does not lead to bridges failing, there are already reports of such deformations impeding the passage of emergency vehicles after large-scale earthquakes.

The numerical model constructed by the team includes the girders, lateral members, supports, and the deck on top. CREDIT Tokyo Metropolitan University

The question now becomes how we might prevent this from happening. The team went on to study the effect of reinforcing ribs on the structure: a model with reinforcing ribs showed how stress acting on the girders and the end cross frame connecting them was reduced. The team’s work thus provides rational insight into how bridge structures may be designed and reinforced to make our infrastructure safer, as well as better strategies to assess their safety.

This work was supported by the Japan Iron and Steel Federation.


Japanese scientists reveal distribution of dark matter around galaxies 12 billion years ago, further back in time than ever before

A collaboration led by scientists at Nagoya University in Japan has investigated the nature of dark matter surrounding galaxies seen as they were 12 billion years ago, billions of years further back in time than ever before. Their findings, published in Physical Review Letters, offer the tantalizing possibility that the fundamental rules of cosmology may differ when examining the early history of our universe. 

Seeing something that happened such a long time ago is difficult. Because of the finite speed of light, we see distant galaxies not as they are today, but as they were billions of years ago. Even more challenging is observing dark matter, which does not emit light.

Consider a distant source galaxy, even further away than the galaxy whose dark matter one wants to investigate. The gravitational pull of the foreground galaxy, including its dark matter, distorts the surrounding space and time, as predicted by Einstein’s theory of general relativity. As the light from the source galaxy travels through this distortion, it bends, changing the apparent shape of the galaxy. The greater the amount of dark matter, the greater the distortion. Thus, scientists can measure the amount of dark matter around the foreground galaxy (the “lens” galaxy) from the distortion.    

However, beyond a certain point, scientists encounter a problem. The galaxies in the deepest reaches of the universe are incredibly faint. As a result, the further away from Earth we look, the less effective this technique becomes. The lensing distortion is subtle and difficult to detect in most cases, so many background galaxies are necessary to detect the signal.  
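
To get a feel for the statistics, here is an illustrative sketch (all numbers are assumptions chosen for the illustration) of why many sources are needed: the intrinsic "shape noise" of individual galaxies swamps the tiny lensing distortion, and only averaging over N sources beats it down, roughly as 1/sqrt(N):

```python
# Illustrative only: averaging noisy per-source distortion measurements.
# true_shear and shape_noise are assumed values, not from the study.
import numpy as np

rng = np.random.default_rng(0)
true_shear = 0.01    # assumed lensing distortion per source
shape_noise = 0.3    # assumed per-galaxy intrinsic ellipticity scatter

for n_sources in (100, 10_000, 1_000_000):
    measured = true_shear + rng.normal(0.0, shape_noise, n_sources)
    print(f"N = {n_sources:>9,}: mean = {measured.mean():+.4f} "
          f"(expected error ~ {shape_noise / np.sqrt(n_sources):.4f})")
```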

Most previous studies have remained stuck at the same limits. Unable to detect enough distant source galaxies to measure the distortion, they could only analyze the dark matter from no more than 8-10 billion years ago. These limitations left open the question of the distribution of dark matter between this time and 13.7 billion years ago, around the beginning of our universe. 

To overcome these challenges and observe dark matter from the furthest reaches of the universe, a research team led by Hironao Miyatake from Nagoya University, in collaboration with the University of Tokyo, the National Astronomical Observatory of Japan, and Princeton University, used a different source of background light, the microwaves released from the Big Bang itself.  

First, using data from the observations of the Subaru Hyper Suprime-Cam Survey (HSC), the team identified 1.5 million lens galaxies using visible light, selected to be seen 12 billion years ago.  

Next, to overcome the lack of galaxy light even further away, they employed microwaves from the cosmic microwave background (CMB), the radiation residue from the Big Bang. Using microwaves observed by the European Space Agency’s Planck satellite, the team measured how the dark matter around the lens galaxies distorted the microwaves.   
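
A rough illustration of the geometry involved: the critical surface density Sigma_crit, the mass surface density a lens needs in order to produce a given distortion (kappa = Sigma / Sigma_crit), can be computed with standard cosmology tools. This is a minimal sketch, not the collaboration's pipeline; the lens redshift of z ≈ 4 (roughly 12 billion years of lookback time) and z ≈ 1100 for the CMB last-scattering surface are our assumptions:

```python
# Critical surface density for CMB lensing of a distant lens galaxy.
# Redshifts are assumed illustrative values, not the paper's exact sample.
from math import pi

import astropy.units as u
from astropy.constants import c, G
from astropy.cosmology import Planck18 as cosmo

z_lens, z_src = 4.0, 1100.0   # assumed lens galaxy and CMB redshifts

D_s = cosmo.angular_diameter_distance(z_src)                 # to the CMB
D_l = cosmo.angular_diameter_distance(z_lens)                # to the lens
D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)   # lens to CMB

# Sigma_crit = c^2 / (4 pi G) * D_s / (D_l * D_ls); the larger it is,
# the more mass is needed to produce a given distortion.
sigma_crit = (c**2 / (4 * pi * G)) * D_s / (D_l * D_ls)
print(sigma_crit.to(u.Msun / u.pc**2))
```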

“Look at dark matter around distant galaxies?” asked Professor Masami Ouchi of the University of Tokyo, who made many of the observations. “It was a crazy idea. No one realized we could do this. But after I gave a talk about a large distant galaxy sample, Hironao came to me and said it may be possible to look at dark matter around these galaxies with the CMB.”  

“Most researchers use source galaxies to measure dark matter distribution from the present to eight billion years ago,” added Assistant Professor Yuichi Harikane of the Institute for Cosmic Ray Research, University of Tokyo. “However, we could look further back into the past because we used the more distant CMB to measure dark matter. For the first time, we were measuring dark matter from almost the earliest moments of the universe.”

After a preliminary analysis, the researchers soon realized that they had a large enough sample to detect the distribution of dark matter. Combining the large distant galaxy sample and the lensing distortions in CMB, they detected dark matter even further back in time, from 12 billion years ago. This is only 1.7 billion years after the beginning of the universe, and thus these galaxies are seen soon after they first formed. 

“I was happy that we opened a new window into that era,” Miyatake said. “12 billion years ago, things were very different. You see more galaxies that are in the process of formation than at the present; the first galaxy clusters are starting to form as well.” Galaxy clusters comprise 100-1000 galaxies bound by gravity, with large amounts of dark matter.

“This result gives a very consistent picture of galaxies and their evolution, as well as the dark matter in and around galaxies, and how this picture evolves with time,” said Neta Bahcall, Eugene Higgins Professor of Astronomy, professor of astrophysical sciences, and director of undergraduate studies at Princeton University.

The radiation residue from the Big Bang, distorted by dark matter 12 billion years ago. CREDIT Reiko Matsushita

One of the most exciting findings was related to the clumpiness of dark matter. According to the standard theory of cosmology, the Lambda-CDM model, subtle fluctuations in the CMB form pools of densely packed matter by attracting surrounding matter through gravity. This creates inhomogeneous clumps that form stars and galaxies in these dense regions. The group found that their measurement of this clumpiness was lower than the Lambda-CDM model predicts.

Miyatake is enthusiastic about the possibilities. “Our finding is still uncertain”, he said. “But if it is true, it would suggest that the entire model is flawed as you go further back in time. This is exciting because if the result holds after the uncertainties are reduced, it could suggest an improvement of the model that may provide insight into the nature of dark matter itself.” 

“At this point, we will try to get better data to see if the Lambda-CDM model is actually able to explain the observations that we have in the universe,” said Andrés Plazas Malagón, associate research scholar at Princeton University. “And the consequence may be that we need to revisit the assumptions that went into this model.” 

“One of the strengths of looking at the universe using large-scale surveys, such as the ones used in this research, is that you can study everything that you see in the resulting images, from nearby asteroids in our solar system to the most distant galaxies from the early universe. You can use the same data to explore a lot of new questions,” said Michael Strauss, professor and chair of the Department of Astrophysical Sciences at Princeton University.

This study used data available from existing telescopes, including Planck and Subaru. The group has only reviewed a third of the Subaru Hyper Suprime-Cam Survey data. The next step will be to analyze the entire data set, which should allow for a more precise measurement of the dark matter distribution. In the future, the team expects to use an advanced data set like the Vera C. Rubin Observatory's Legacy Survey of Space and Time (LSST) to explore more of the earliest parts of space. “LSST will allow us to observe half the sky,” Harikane said. “I don’t see any reason we couldn’t see the dark matter distribution 13 billion years ago next.”