York astrophysicist Kannan takes step forward in supercomputer simulations of cosmology

York University and an international team of astrophysicists have made an ambitious attempt to simulate the formation of galaxies and the cosmic large-scale structure throughout staggeringly large swaths of space.

The first results of their new calculations for the “MillenniumTNG” project help to subject the standard cosmological model to precision tests and to unravel the full power of upcoming cosmological observations, say the researchers, including York Assistant Professor Rahul Kannan of the Faculty of Science.

Figure 1: Projections of gas (top left), dark matter (top right), and stellar light (bottom center) for a slice through the largest hydrodynamical simulation of MillenniumTNG at the present epoch. The slice is about 35 million light-years thick. The projections illustrate the vast range of physical scales in the simulation, from the full box, about 2,400 million light-years across, down to an individual spiral galaxy (final round inset) with a radius of roughly 150,000 light-years. The underlying calculation is presently the largest high-resolution hydrodynamical simulation of galaxy formation, containing more than 160 billion resolution elements. © MPA

Over the past decades, cosmologists have grown accustomed to the perplexing conjecture that the universe’s matter content is dominated by enigmatic dark matter and that an even stranger dark energy field, which acts as a kind of anti-gravity, accelerates the expansion of today’s cosmos. Ordinary baryonic matter makes up less than five percent of the cosmic mix, but this source material forms the basis for the stars and planets of galaxies like our own Milky Way.

This seemingly strange cosmological model is known as LCDM. It provides a stubbornly successful description of a wide range of observational data, from the cosmic microwave background radiation – the residual heat left behind by the Big Bang – to the “cosmic web,” in which galaxies are arranged along an intricate network of dark matter filaments. However, the real physical nature of dark matter and dark energy is still not understood, prompting astrophysicists to search for cracks in the LCDM theory. Identifying tensions with observational data could lead to a better understanding of these fundamental puzzles about the universe. Such sensitive tests require two things: powerful new observational data and more detailed predictions of what the LCDM model actually implies.

An international team of researchers led by the Max Planck Institute for Astrophysics (MPA) in Germany, Harvard University in the U.S., Durham University in the U.K., and the Donostia International Physics Center in Spain, along with York University, have now managed to take a decisive step forward on the latter challenge. Building upon their previous successes with the “Millennium” and “IllustrisTNG” projects, they developed a new suite of simulation models dubbed “MillenniumTNG,” which trace the physics of cosmic structure formation with considerably higher statistical accuracy than what was possible with previous calculations.

Large simulations including new physical details

The team utilized the advanced cosmological code GADGET-4, custom-built for this purpose, to compute the largest high-resolution dark matter simulations to date, covering a region nearly 10 billion light-years across. In addition, they employed the moving-mesh hydrodynamical code AREPO to follow the processes of galaxy formation directly, throughout volumes still so large that they can be considered representative of the universe as a whole. Comparing both types of simulations allows a precise assessment of the impact of baryonic processes related to supernova explosions and supermassive black holes on the total matter distribution. Accurate knowledge of this distribution is key for interpreting upcoming observations correctly, such as so-called weak gravitational lensing effects, which respond to matter irrespective of whether it is of dark or baryonic type.

Furthermore, the team included massive neutrinos in their simulations – for the first time in simulations big enough to allow meaningful cosmological mock observations. Previous cosmological simulations had usually omitted neutrinos for simplicity, because they make up at most one to two percent of the dark matter mass and because their nearly relativistic velocities largely prevent them from clumping together. Now, however, upcoming cosmological surveys (such as those of the recently launched Euclid satellite of the European Space Agency) will reach a precision that allows detection of the associated percent-level effects. This raises the tantalizing prospect of constraining the neutrino mass itself – a profound open question in particle physics – so the stakes are high.

For their groundbreaking MillenniumTNG simulations, the researchers made efficient use of two extremely powerful supercomputers: the SuperMUC-NG machine at the Leibniz Supercomputing Center in Garching, and the Cosma8 machine at Durham University. More than 120,000 computer cores toiled away for nearly two months at SuperMUC-NG, using computing time awarded by the German Gauss Centre for Supercomputing, to produce the most comprehensive hydrodynamical simulation model to date. MillenniumTNG tracks the formation of about 100 million galaxies in a region of the universe around 2,400 million light-years across (see Figure 1). This calculation is about 15 times bigger than the previous record holder in this category, the TNG300 model of the IllustrisTNG project.

Using Cosma8, the team computed an even bigger volume of the universe, filled with more than a trillion dark matter particles and more than 10 billion particles for tracking massive neutrinos. Even though this simulation did not follow the baryonic matter directly, its galaxy content can be accurately predicted with a semi-analytic model that is calibrated against the full physical calculation of the project. This procedure yields a detailed distribution of galaxies and matter in a volume that, for the first time, is large enough to be representative of the universe as a whole, allowing comparisons with upcoming observational surveys on a sound statistical basis.

Mount Sinai researchers build AI model to predict which drugs may cause birth defects

Data harnessed to identify previously unknown associations between genes, congenital disabilities, and drugs

Data scientists at the Icahn School of Medicine at Mount Sinai in New York and colleagues have created an artificial intelligence model that may more accurately predict which existing medicines, not currently classified as harmful, may lead to congenital disabilities.

The model, or “knowledge graph,” also has the potential to predict the involvement of pre-clinical compounds that may harm the developing fetus. The study is the first known of its kind to use knowledge graphs to integrate various data types to investigate the causes of congenital disabilities.

Birth defects are abnormalities that affect about 1 in 33 births in the United States. They can be functional or structural and are believed to result from various factors, including genetics. However, the causes of most of these disabilities remain unknown. Certain substances found in medicines, cosmetics, food, and environmental pollutants can potentially lead to birth defects if a developing fetus is exposed to them during pregnancy.

“We wanted to improve our understanding of reproductive health and fetal development, and importantly, warn about the potential of new drugs to cause birth defects before these drugs are widely marketed and distributed,” says Avi Ma’ayan, Ph.D., Professor, Pharmacological Sciences, and Director of the Mount Sinai Center for Bioinformatics at Icahn Mount Sinai, and senior author of the paper. “Although identifying the underlying causes is a complicated task, we offer hope that through complex data analysis like this that integrates evidence from multiple sources, we will be able, in some cases, to better predict, regulate, and protect against the significant harm that congenital disabilities could cause.”

The researchers gathered knowledge across several datasets on birth-defect associations noted in published work, including those produced by NIH Common Fund programs, to demonstrate how integrating data from these resources can lead to synergistic discoveries. Particularly, the combined data is from the known genetics of reproductive health, the classification of medicines based on their risk during pregnancy, and how drugs and pre-clinical compounds affect the biological mechanisms inside human cells.

Specifically, the data included studies on genetic associations, drug- and preclinical-compound-induced gene expression changes in cell lines, known drug targets, genetic burden scores for human genes, and placental crossing scores for small molecule drugs.

Importantly, using ReproTox-KG with semi-supervised learning (SSL), the research team prioritized 30,000 preclinical small-molecule drugs by their potential to cross the placenta and induce birth defects. SSL is a branch of machine learning that uses a small amount of labeled data to guide predictions for much larger amounts of unlabeled data. In addition, by analyzing the topology of the ReproTox-KG, the team identified more than 500 birth-defect/gene/drug cliques that could explain the molecular mechanisms underlying drug-induced birth defects. In graph-theory terms, a clique is a subset of a graph's nodes in which every node is directly connected to every other node.
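To make the clique idea concrete, here is a minimal, illustrative sketch of finding three-node cliques (triangles) in a toy knowledge graph. The node names and edges below are entirely hypothetical and are not drawn from ReproTox-KG; the sketch only demonstrates the graph-theory concept the study relies on.

```python
from itertools import combinations

# Hypothetical toy graph: an edge means a reported association
# between a birth defect, a gene, or a drug (names are made up).
edges = {
    ("defect:cleft_palate", "gene:GENE_A"),
    ("defect:cleft_palate", "drug:drug_X"),
    ("gene:GENE_A", "drug:drug_X"),
    ("defect:cleft_palate", "gene:GENE_B"),
}

def is_clique(nodes, edges):
    """A clique requires every pair of nodes to share an edge."""
    return all(
        (a, b) in edges or (b, a) in edges
        for a, b in combinations(nodes, 2)
    )

all_nodes = {n for e in edges for n in e}

# Enumerate all 3-node cliques: each one ties a defect, a gene,
# and a drug into a mutually connected triangle.
triangles = [
    trio for trio in combinations(sorted(all_nodes), 3)
    if is_clique(trio, edges)
]
print(triangles)
```

In this toy graph only one triangle exists (the defect, GENE_A, and drug_X, which are all pairwise connected); at the scale of a real knowledge graph, specialized clique-enumeration algorithms would replace this brute-force pass over all node triples.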

The investigators caution that the study's findings are preliminary and that further experiments are needed for validation.

Next, the investigators plan to use a similar graph-based approach for other projects focusing on the relationship between genes, drugs, and diseases. They also aim to use the processed dataset as training materials for courses and workshops on bioinformatics analysis. In addition, they plan to extend the study to consider more complex data, such as gene expression from specific tissues and cell types collected at multiple stages of development.

“We hope that our collaborative work will lead to a new global framework to assess potential toxicity for new drugs and explain the biological mechanisms by which some drugs, known to cause birth defects, may operate. It’s possible that at some point in the future, regulatory agencies such as the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency may use this approach to evaluate the risk of new drugs or other chemical applications,” says Dr. Ma’ayan.

The project was supported by National Institutes of Health grants OT2OD030160, OT2OD030546, OT2OD032619, and OT2OD030162. 

Scanning tunnelling microscope image of two of the superconducting structures created, which consist of individual chromium atoms.

University of Zurich prof Neupert designs superconductors one atom at a time

The future of electronics will be based on novel kinds of materials. Sometimes, however, the naturally occurring topology of atoms makes it difficult for new physical effects to be created. To tackle this problem, researchers at the University of Zurich have now successfully designed superconductors one atom at a time, creating new states of matter.

What will the computer of the future look like? How will it work? The search for answers to these questions is a major driver of basic physical research. There are several possible scenarios, ranging from the further development of classical electronics to neuromorphic supercomputing and quantum supercomputers. The common element in all these approaches is that they are based on novel physical effects, some of which have so far only been predicted in theory. Researchers go to great lengths and use state-of-the-art equipment in their quest for new quantum materials that will enable them to create such effects. But what if there are no suitable materials that occur naturally?

A novel approach to superconductivity

In a recent study published in Nature Physics, the research group of UZH Professor Titus Neupert, working closely with physicists at the Max Planck Institute of Microstructure Physics in Halle (Germany), presented a possible solution. The researchers made the required materials themselves – one atom at a time. They focused on novel types of superconductors, which are particularly interesting because they conduct electricity with zero resistance at low temperatures. Sometimes referred to as “ideal diamagnets,” superconductors are used in many quantum computers due to their extraordinary interactions with magnetic fields. Theoretical physicists have spent years researching and predicting various superconducting states. “However, only a small number have so far been conclusively demonstrated in materials,” says Professor Neupert.

Two new types of superconductivity

In their collaboration, the UZH researchers predicted in theory how the atoms should be arranged to create a new superconductive phase, and the team in Germany then conducted experiments to realize the relevant topology. Using a scanning tunneling microscope, they moved and deposited the atoms in the right place with atomic precision. The same method was also used to measure the system’s magnetic and superconductive properties. By depositing chromium atoms on the surface of superconducting niobium, the researchers were able to create two new types of superconductivity. Similar methods had previously been used to manipulate metal atoms and molecules, but until now it had never been possible to make two-dimensional superconductors with this approach.

The results not only confirm the physicists’ theoretical predictions, but also give them reason to speculate about what other new states of matter might be created in this way, and how they could be used in the quantum supercomputers of the future.