HKU physicists discover signatures of highly entangled quantum matter

A research team from the Department of Physics, The University of Hong Kong (HKU), has discovered clear evidence characterizing a highly entangled form of quantum matter, the quantum spin liquid (QSL), a phase of matter that remains disordered even at very low temperatures, through large-scale simulations on supercomputers. This pivotal research work has recently been published in one of the leading journals in quantum materials, npj Quantum Materials.

Photo: part of the research team. From the left: Dr Zheng Yan, Mr Jiarui Zhao, and Dr Bin-Bin Chen.

QSLs were proposed in 1973 by P. W. Anderson, the 1977 Nobel Laureate in Physics. They have potential applications in topological quantum computing, which could bring computing power to a new stage, and could help explain the mechanism of high-temperature superconductors, which would greatly reduce the energy lost during electricity transport owing to the absence of electrical resistance in superconductors.

The QSL is termed a liquid because it lacks conventional order. QSLs possess a topological order that originates from long-range, strong quantum entanglement. Detecting this topological order, however, is a very tough task: no known material perfectly realizes the many model systems that scientists have proposed for finding the topological order of a QSL and proving its existence. As a result, there has been no concrete evidence that QSLs exist in nature.

Against this backdrop, Mr Jiarui ZHAO, Dr Bin-Bin CHEN, Dr Zheng YAN, and Dr Zi Yang MENG from the HKU Department of Physics successfully probed this topological order in a phase of the kagome lattice quantum spin model, a two-dimensional lattice model with intrinsic quantum entanglement that is proposed to host Z2 (a cyclic group of order 2) topological order, via a carefully designed numerical experiment on supercomputers. Their unambiguous results for the topological entanglement entropy strongly suggest, from a numerical perspective, the existence of QSLs in highly entangled quantum models.

‘Our work takes advantage of the superior computing power of modern supercomputers, and we use them to simulate a very complicated model which is thought to possess topological order. With our findings, physicists are more confident that QSLs should exist in nature,’ said Mr. Jiarui Zhao, the first author of the journal paper and a Ph.D. student at the Department of Physics.

"Numerical simulations have been an important trend in scientific research of quantum materials. Our algorithms and computations could find more interesting and novel quantum matter and such efforts will surely contribute to the development of both practical quantum technology and the new paradigm in fundamental research." Dr. Zi Yang Meng, Associate Professor in the Department of Physics remarked.

The research
The team designed a numerical experiment on the kagome spin model (kagome is a two-dimensional lattice structure resembling the hexagonal pattern of traditional Japanese woven bamboo) in the proposed QSL phase; the schematic plot of the experiment is illustrated in Figure 2. The entanglement entropy (S) of a system can be obtained by measuring the change in the free energy of the model during a carefully designed nonequilibrium process. The topological entropy (γ), which characterizes long-range topological order, is extracted by subtracting from the total entanglement entropy the short-range contribution, which is proportional to the length of the entanglement boundary (l). This is done by fitting the entanglement entropy at different boundary lengths to a straight line, S = al - γ.

The team experimented on two kinds of lattices with different aspect ratios to ensure the reliability of the results. A straight line was fitted to the relation between the entanglement entropy and the length of the entanglement boundary, so that the topological entropy γ is given by minus the intercept of the line. The results give a topological entropy of 1.4(2), consistent with the predicted value for a Z2 quantum spin liquid, 2 ln 2 ≈ 1.386. These findings confirm the existence of QSLs from a numerical perspective.
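The fitting step described above can be sketched in a few lines. The entropy values below are made up for illustration only (chosen so the intercept reproduces the reported 1.4); they are not the team's actual data:

```python
import numpy as np

# Mock measurements: entanglement entropy S at several boundary lengths l,
# to be fitted to the area-law form S = a*l - gamma described in the text.
l = np.array([8, 12, 16, 20])        # entanglement boundary lengths (illustrative)
S = np.array([3.2, 5.5, 7.8, 10.1])  # mock entanglement entropies

a, intercept = np.polyfit(l, S, 1)   # least-squares linear fit S = a*l + intercept
gamma = -intercept                   # topological entropy is minus the intercept

print(f"area-law coefficient a = {a:.3f}")
print(f"topological entropy gamma = {gamma:.3f}")
print(f"Z2 prediction 2*ln(2) = {2 * np.log(2):.3f}")
```

With these mock values the fit yields γ = 1.4, illustrating how an intercept consistent with 2 ln 2 ≈ 1.386 signals Z2 topological order.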

Extensive training on virtual universes from supercomputer simulations produces AI-assisted analysis of three-dimensional galaxy distribution in our Universe

By applying a machine-learning technique, a neural network method, to gigantic amounts of simulation data about the formation of cosmic structures in the universe, a team of researchers has developed a very fast and highly efficient software program that can make theoretical predictions about structure formation. By comparing model predictions to actual observational datasets, the team succeeded in accurately measuring cosmological parameters, reports a study in Physical Review D.

Figure: Distribution of about 1 million galaxies observed by the Sloan Digital Sky Survey (top left) and a zoom-in of the thin rectangular region (bottom left). This can be compared to the distribution of invisible dark matter predicted by a supercomputer simulation assuming the cosmological model the team's AI derives (top right). The bottom right shows the distribution of mock galaxies formed in regions of high dark matter density; the predicted galaxy distribution shares the characteristic patterns, such as galaxy clusters, filaments, and voids, seen in the actual SDSS data. (Credit: Takahiro Nishimichi)
 
When the Sloan Digital Sky Survey (SDSS), the biggest galaxy survey in the world to date, created a three-dimensional map of the universe from the observed distribution of galaxies, it became clear that galaxies have certain characteristics: some clump together or spread out in filaments, and in some places there are voids where no galaxies exist at all. All of this shows that galaxies did not evolve uniformly; they formed as a result of their local environment.
 
In general, researchers agree that this non-uniform distribution of galaxies is due to gravity acting on the distribution of "invisible" dark matter, the mysterious matter that no one has yet directly observed. By studying the three-dimensional map of galaxies in detail, researchers can uncover fundamental quantities such as the amount of dark matter in the universe.
 
In recent years, N-body simulations have been widely used to recreate the formation of cosmic structures in the universe. These simulations represent the initial inhomogeneities at high redshift with a large number of N-body particles that effectively stand in for dark matter, and then follow how the dark matter distribution evolves by computing the gravitational forces between particles in an expanding universe. However, such simulations are expensive, typically taking tens of hours on a supercomputer even for a single cosmological model.
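As a rough illustration of what one step of such a simulation involves, the toy sketch below computes softened pairwise gravitational accelerations and advances the particles with a leapfrog step. This is a deliberately minimal sketch in arbitrary units, not the production codes (or the expanding-universe terms) used in cosmology:

```python
import numpy as np

# Toy N-body step: every particle feels the gravitational pull of every
# other particle, and positions/velocities are advanced by a small time step.
rng = np.random.default_rng(0)
n, G, dt, soft = 64, 1.0, 1e-3, 1e-2      # particle count, units, step, softening
pos = rng.uniform(-1.0, 1.0, (n, 3))      # random initial positions
vel = np.zeros((n, 3))                    # start from rest
mass = np.full(n, 1.0 / n)                # equal-mass particles

def accelerations(pos):
    # pairwise separation vectors d[i, j] = pos[j] - pos[i]
    d = pos[None, :, :] - pos[:, None, :]
    r2 = (d ** 2).sum(-1) + soft ** 2     # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)         # exclude self-interaction
    # a_i = G * sum_j m_j * d_ij / r_ij^3
    return G * (d * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

# one leapfrog (kick-drift-kick) step
acc = accelerations(pos)
vel += 0.5 * dt * acc
pos += dt * vel
vel += 0.5 * dt * accelerations(pos)
```

Because the pairwise forces are equal and opposite, total momentum is conserved to floating-point precision, a standard sanity check for such integrators. Real codes replace the O(N²) force loop with tree or particle-mesh methods.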

A team of researchers, led by former Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) Project Researcher Yosuke Kobayashi (currently Postdoctoral Research Associate at The University of Arizona), and including Kavli IPMU Professor Masahiro Takada and Kavli IPMU Visiting Scientists Takahiro Nishimichi and Hironao Miyatake, combined machine learning with numerical simulation data from the supercomputer "ATERUI II" at the National Astronomical Observatory of Japan (NAOJ) to generate theoretical calculations of the power spectrum, the most fundamental quantity measured from galaxy surveys, which tells researchers statistically how galaxies are distributed in the universe.
 
Ordinarily, millions of N-body simulations would need to be run, but Kobayashi's team used machine learning to teach their program to calculate the power spectrum at the same level of accuracy as a simulation, even for cosmological models for which no simulation had been run. This technology, called an emulator, is already used in computer science fields outside of astronomy.
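The emulator idea can be illustrated with a deliberately tiny toy: treat a cheap analytic function as the "expensive" simulation, train a surrogate on a few parameter values, then predict at a parameter that was never simulated. The model form, parameter values, and wavenumber grid below are all hypothetical stand-ins, not the team's actual pipeline:

```python
import numpy as np

def expensive_model(omega_m, k):
    # stand-in for an N-body simulation's power spectrum (hypothetical form)
    return omega_m * k ** -1.5 * np.exp(-k)

k = np.linspace(0.05, 1.0, 50)                # wavenumber grid
train_params = np.array([0.25, 0.30, 0.35])   # "simulated" training cosmologies
train_spectra = np.array([expensive_model(om, k) for om in train_params])

# surrogate: at each k, fit P(k) as a linear function of the parameter
coeffs = np.polyfit(train_params, train_spectra, 1)   # shape (2, 50)

def emulate(omega_m):
    # evaluate the per-k linear fit: P(k) ≈ c1(k)*omega_m + c0(k)
    return coeffs[0] * omega_m + coeffs[1]

# predict at a parameter value that was never "simulated"
pred = emulate(0.32)
truth = expensive_model(0.32, k)
print("max relative error:", np.abs(pred / truth - 1).max())
```

Because this toy model is exactly linear in the parameter, the surrogate reproduces it essentially perfectly; real emulators use flexible regressors (such as neural networks) trained on many simulations across a higher-dimensional parameter space.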
 
"By combining machine learning with numerical simulations, which cost a lot, we have been able to analyze data from astronomical observations with high precision. These emulators have been used in cosmology studies before, but hardly anyone has been able to take into account the numerous other effects, which would compromise cosmological parameter results using real galaxy survey data. Our emulator does and has been able to analyze real observation data. This study has opened up a new frontier to large-scale structural data analysis," said the lead author Kobayashi.
 
However, to apply the emulator to actual galaxy survey data, the team had to account for "galaxy bias" uncertainty, which reflects the fact that researchers cannot accurately predict where galaxies form in the universe because of the complicated physics inherent in galaxy formation. To overcome this difficulty, the team focused on simulating the distribution of dark matter "halos", regions of high dark matter density where galaxies are likely to form. By introducing a sufficient number of "nuisance" parameters to absorb the galaxy bias uncertainty, the team succeeded in making a flexible model prediction for a given cosmological model.
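A minimal sketch of the nuisance-parameter idea, under the simplest textbook assumption that the observed galaxy power spectrum scales as P_gal(k) = b²·P_matter(k) with an unknown linear bias b (a hypothetical model, not the team's actual likelihood):

```python
import numpy as np

k = np.linspace(0.05, 0.5, 30)
p_matter = 1e4 * k ** -1.5 * np.exp(-k)   # stand-in matter power spectrum

# synthetic "observation" with a true bias and 1% noise
b_true = 1.8
rng = np.random.default_rng(1)
p_gal_obs = b_true ** 2 * p_matter * (1 + 0.01 * rng.standard_normal(k.size))

# least-squares fit of b^2: minimize sum_k (p_gal_obs - b2 * p_matter)^2
b2_fit = (p_gal_obs * p_matter).sum() / (p_matter ** 2).sum()
print("fitted bias b =", np.sqrt(b2_fit))
```

In a real analysis, bias parameters like b are fitted jointly with the cosmological parameters and then marginalized over, so that ignorance about where galaxies form does not contaminate the cosmology.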

The team then compared the model prediction to an actual SDSS dataset and successfully measured cosmological parameters to high precision. The analysis independently confirms that only about 30 percent of the universe's energy content comes from matter (mainly dark matter), and that the remaining 70 percent is dark energy driving the accelerated expansion of the universe. The team also succeeded in measuring the clumpiness of matter in our universe, whereas the conventional method of analyzing galaxy 3D maps could not determine these two parameters simultaneously. The precision of the parameter measurements exceeds that of previous galaxy-survey analyses, demonstrating the effectiveness of the emulator developed in this study.
 
The research team's next step will be to continue studying the dark matter mass and the nature of dark energy by applying the emulator to galaxy maps that will be captured by the Prime Focus Spectrograph, an instrument under development led by the Kavli IPMU, to be mounted on NAOJ's Subaru Telescope.
 

Datadobi added to the IT Enterprise Solutions Software 2 (ITES-SW2) Contract for U.S. Army Computer Hardware Enterprise Software and Solutions (CHESS)

The ITES-SW2 Contract distinction will allow Datadobi to support all federal agencies’ unstructured data management goals

Datadobi has been added as a manufacturer on Iron Bow Technologies' IT Enterprise Solutions Software 2 (ITES-SW2) contract for the U.S. Army Computer Hardware Enterprise Software and Solutions (CHESS).

ITES-SW2 is a firm-fixed-price, indefinite-delivery/indefinite-quantity contract vehicle for commercial off-the-shelf software products and related services and hardware. The contract has no fees and ordering is open to all Army, DoD, Federal agencies, and authorized systems integrators on a worldwide basis. Under this contract, Iron Bow provides software, software maintenance, and ancillary services from Datadobi to support federal agencies’ enterprise infrastructure goals.

The ITES-SW2 contract is the latest step Datadobi has taken in recent years to expand its offerings in the federal space. The Datadobi product suite is also featured on Climb Channel Solutions' GSA IT 70 Contract, and the company is on the U.S. federal government's list of Solutions for Enterprise-Wide Procurement (SEWP) vendors. All three distinctions give Datadobi the opportunity to address the four most critical concerns for public sector enterprises (cost control, carbon footprint, risk reduction, and getting more value from unstructured data) via its StorageMAP and DobiMigrate solutions.

Through the ITES-SW2 contract, Datadobi will bring order to the public sector's heterogeneous unstructured storage environments both in the cloud and the data center, allowing government organizations to realize the true value of their extensive unstructured data.

"Through our partnership with Iron Bow Technologies we’re excited to continue our support of all the Federal Government Agencies through the ITES-SW2 vehicle,” said Jeff Abbott, Director of US Federal Sales at Datadobi. “Our software is used widely across the DoD and the broader U.S. Government to discover, analyze and migrate unstructured data footprints, this now makes it easier for our customers to acquire StorageMAP and DobiMigrate software across all agencies. As the global leader in unstructured data intelligence and mobility, we’re excited to provide government agencies with the most comprehensive platform for understanding the scope, nature, and cost impact of their data while providing the ability to manage and relocate that data at unparalleled speed and scale.”

University of Seville uses CFD to design a new bio-inspired PEM fuel cell

A team from the Department of Energy Engineering at the University of Seville has carried out experimental research on the design of a bio-inspired PEM fuel cell. The model they obtained reaches a maximum power up to 6.0% higher than that of the design they took as a reference.

Proton Exchange Membrane Fuel Cells (PEMFC) are electrochemical devices that directly convert the chemical energy of fuels such as hydrogen into electricity, with high efficiency and creating only water as a by-product. The geometry of the channels in the bipolar plate through which the reactants are distributed has a considerable impact on the performance of the fuel cell.

Bipolar plate designs based on nature-inspired structures such as leaves, lungs or sponges have been successfully explored so far but have not yet reached their full potential.

With the aim of investigating new designs with improved performance, this work presents an experimental analysis of a novel bio-inspired channel design for a PEMFC. Starting from a Computational Fluid Dynamics (CFD) analysis of the flow in several initial biomimetic designs, the one with the best performance in terms of reactant distribution, which replaces the channels in the central area of the plate with insertions of porous material, was selected, manufactured, and experimentally tested.

The results of the new biomimetic design were analyzed and compared with a parallel coil model taken as a reference, indicating that the proposed design is especially suitable for improving fuel cell water management under conditions of high reactant humidity, achieving up to 6.0% higher peak power than the reference design.

University of Limerick, Ireland research could reduce time required to bring new medicines to market

Researchers at the University of Limerick in Ireland have developed a new modeling approach to pharmaceutical manufacturing that could reduce the time required to bring medicines to market.

Professor Gavin Walker at the University of Limerick's Bernal Institute has published a world-first paper applying molecular engineering methodologies to continuous pharmaceutical manufacturing.

The study addresses a significant public health issue: reducing the time required to bring new medicines to market for the benefit of patients and society.

The pharmaceutical industry has recently been increasing research into continuous manufacturing techniques to decrease the cost of manufacturing medical products, making them more affordable and available to more patients more rapidly, with a reduced carbon and environmental footprint.

The UL research emphasizes the increasing significance of combining process engineering, modeling, and data science to better understand processes at a molecular scale for the optimization of pharmaceutical manufacturing.

Professor Gavin Walker, Bernal Chair of Pharmaceutical Powder Engineering and project lead, explained that: “This contribution offers a ‘proof of concept’ to make it achievable to model specific co-crystals at a molecular scale within a continuous pharmaceutical manufacturing process.

“Molecular interactions can be altered to optimize drug properties and this process can be crucial to the performance of a dosage form which links to the preparation of the safe delivery of the content of the drug product for the ultimate benefits of patients and society.

“There is huge value in improving the productivity of the drug development process. This study expands the possibilities for future development of more supportive mechanisms in the pharmaceutical manufacturing space, improving processing and reducing time to market for new medicines,” he added.

The study was led by Professor Walker and funded through CONFIRM, the SFI Research Centre for Smart Manufacturing, and SSPC, the SFI Research Centre for Pharmaceuticals, which are both based at UL, and European funding through the MSCA ‘Process’ Co-Fund.

Professor Walker said of the research: “It will aid the current pharmaceutical development processes of exhaustive empirical experimentation, in that time and cost can be reduced through this more controlled and targeted approach via Smart Manufacturing techniques.

“The paper represents a significant bridge by adapting mathematical modeling developed in the discrete manufacturing sector into effective techniques for improving continuous manufacturing within the pharma-biopharma sector.

“This is critical to achieving the UN Sustainable Development grand challenge of good health and well-being, ensuring healthy lives and promoting well-being for all at all ages, while optimizing biopharma processing and reducing time to market for new medicines,” he added.

Speaking about the research output, SSPC Director, Professor Damien Thompson, said: “The paper represents a significant deployment of data-driven molecular modeling for improving continuous manufacturing within the pharma-biopharma sector. It is great to see such impact from collaborative work bridging two SFI research centers hosted at the University of Limerick.”

Dr. Niall Keely, CONFIRM Strategic Research Manager, added: “The research presented in this paper highlights the significant impact of combining multiple sciences and engineering disciplines to advance knowledge of processes at the molecular level that ultimately can lead to benefits at the industrial scale and improved business processes such as faster time-to-market of products.”