Left to right: Prof. Shlomi Reuveni, Ph.D. student Ofir Blumer & Dr. Barak Hirshberg

Restarting chemical simulations accelerates scientific discovery

In the fast-paced world of chemical research, a groundbreaking study from Tel Aviv University in Israel has unveiled a game-changing technique that could potentially reshape the landscape of scientific exploration and accelerate valuable discoveries. By drawing inspiration from the world of information technology, researchers have successfully demonstrated how the simple act of "restarting" can vastly enhance the sampling in chemical simulations, pushing the boundaries of what is possible in this field. This remarkable achievement not only showcases the power of supercomputing but also highlights the importance of embracing diverse perspectives in advancing scientific knowledge.

Conducted by a team led by Ph.D. student Ofir Blumer in collaboration with Professor Shlomi Reuveni and Dr. Barak Hirshberg from the Sackler School of Chemistry, this study holds promising implications for molecular dynamics simulations. These simulations, often referred to as virtual microscopes, track the intricate motion of atoms in various chemical, physical, and biological systems. They provide valuable insights into processes that range from protein folding to crystal nucleation and hold immense potential in fields like drug design.

However, a significant challenge called the "timescale problem" has long hampered these simulations. They typically cannot capture processes that take longer than about one millionth of a second, restricting their ability to represent essential phenomena. In a stroke of innovative thinking, the researchers harnessed the concept of "stochastic resetting," commonly employed in information technology, and applied it to chemical simulations.
It may initially seem counterintuitive that restarting simulations can yield faster results. However, the study revealed that reaction times vary significantly across simulations. Some simulations become trapped in intermediate states for extended periods, while others experience rapid reactions. Resetting prevents simulations from getting stuck in these intermediates, ultimately shortening the average simulation time and overcoming the timescale problem.
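
To see in miniature why restarting helps, consider a toy first-passage simulation, sketched below under arbitrary assumptions (a symmetric one-dimensional random walk, a made-up target distance, and a made-up per-step reset probability; this illustrates stochastic resetting in general, not the molecular dynamics protocol used in the study). A plain walk occasionally wanders far from the target and takes an extremely long time to arrive, while a walk that occasionally restarts from its starting point cuts those long excursions short, lowering the average completion time.

```python
import random

TARGET = 10          # distance the walker must reach (hypothetical)
RESET_PROB = 0.01    # per-step probability of restarting from the origin (hypothetical)
N_RUNS = 500
MAX_STEPS = 200_000  # safety cap: without resetting, some walks run extremely long

def first_passage_time(reset_prob):
    """Steps until a symmetric +/-1 random walk starting at 0 first reaches TARGET,
    restarting from the origin with probability reset_prob at each step."""
    x, steps = 0, 0
    while x < TARGET and steps < MAX_STEPS:
        if random.random() < reset_prob:
            x = 0                      # stochastic resetting: jump back to the start
        x += random.choice((-1, 1))    # ordinary diffusive step
        steps += 1
    return steps

for p in (0.0, RESET_PROB):
    times = [first_passage_time(p) for _ in range(N_RUNS)]
    print(f"reset probability {p}: mean first-passage time ~ {sum(times) / len(times):,.0f} steps")
```

The same logic carries over to simulations trapped in long-lived intermediate states: each restart sacrifices a little progress in exchange for a much smaller chance of an extremely long run, so the average time to observe the reaction drops.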

The researchers further integrated stochastic resetting with Metadynamics, a popular method for expediting the simulations of slow chemical processes. This powerful combination produced more substantial acceleration than each method alone, reducing reliance on prior knowledge and drastically saving time for practitioners. Importantly, the researchers showcased the effectiveness of this combined approach in accurately predicting the rate of slow processes, as validated by successful protein folding simulations.

Diverse perspectives have played a pivotal role in shaping this groundbreaking research. Collaborative efforts between passionate individuals from various backgrounds exemplify the inclusive nature of scientific exploration. By embracing different viewpoints and insights, this project has successfully pushed the boundaries of what can be achieved, sparking excitement and inspiration within the scientific community.

At a time when the world faces complex challenges that demand innovative solutions, the potential impact of this research is immense. Not only does it open new avenues for understanding fundamental chemical processes, but it also presents opportunities for groundbreaking advancements in drug development, materials science, and numerous other fields.

Through the tireless efforts of researchers at Tel Aviv University, the boundaries of what can be achieved with supercomputing and diverse perspectives have been stretched. This achievement reminds us that true scientific progress often arises from the unification of seemingly unrelated disciplines. By thinking beyond traditional boundaries and being open to novel approaches, we can unlock a world of possibilities and accelerate the pace of discovery.

As this exciting research takes its place in history, it serves as a testimony to the potential for groundbreaking scientific breakthroughs when driven by collaboration, innovation, and an unwavering commitment to pushing the boundaries of knowledge. The lessons learned from restarting simulations will undoubtedly pave the way for a new era of research and inspire future generations of scientific explorers to embrace diverse perspectives and never stop pushing the limits of human understanding.

C. “Sesh” Seshadhri

Study casts doubts on the reliability of machine learning methods

In today's digital era, machine learning plays a vital role in our lives by driving social media expansion and shaping various scientific research fields. However, a recent study by UC Santa Cruz has raised concerns about the reliability of widespread machine learning methods behind link prediction.

Link prediction is a popular machine learning task that evaluates the links in a network and predicts future connections. From suggesting friends on social media to predicting the interaction between genes and proteins, link prediction has become a benchmark for testing the performance of machine learning algorithms. But is it trustworthy?

The study by UC Santa Cruz Professor of Computer Science and Engineering C. "Sesh" Seshadhri, in collaboration with Nicolas Menand, reveals the flaws in evaluating the accuracy of link prediction. The commonly used metric for measuring link prediction performance, known as AUC, fails to capture crucial information, thereby giving an exaggerated sense of success.

Seshadhri, a respected figure in theoretical computer science and data mining, discovered mathematical limitations hindering the performance of machine learning algorithms. His investigation into link prediction revealed that the seemingly impressive results may not reflect reality. According to Seshadhri, "It feels like if you measured things differently, maybe you wouldn't see such great results."

Link prediction typically relies on low-dimensional vector embeddings, which represent the individuals in a network as mathematical vectors in space. However, the study finds that AUC, the most commonly used metric, fails to account for fundamental mathematical limitations of these embeddings, ultimately giving an inaccurate measure of link prediction performance.
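
As a rough sketch of the kind of pipeline being evaluated, and of what AUC does and does not measure, the snippet below uses made-up embeddings and placeholder edges (not the networks, models, or scoring functions from the study): candidate links are scored with a dot product between node vectors, and AUC then summarizes only how often true links outrank non-links overall.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical illustration: random low-dimensional node embeddings and
# placeholder edge lists stand in for a trained model and a real network.
rng = np.random.default_rng(0)
n_nodes, dim = 100, 8
embeddings = rng.normal(size=(n_nodes, dim))   # one low-dimensional vector per node

def link_score(u, v):
    """Higher score = the embedding model considers an edge (u, v) more likely."""
    return embeddings[u] @ embeddings[v]

# Positive examples: pairs treated as known links; negatives: sampled non-links.
pos_edges = [(i, (i + 1) % n_nodes) for i in range(n_nodes)]
neg_edges = [tuple(rng.choice(n_nodes, size=2, replace=False)) for _ in range(len(pos_edges))]

scores = [link_score(u, v) for u, v in pos_edges + neg_edges]
labels = [1] * len(pos_edges) + [0] * len(neg_edges)

# AUC only asks whether positives tend to be ranked above negatives in aggregate.
print("AUC:", roc_auc_score(labels, scores))
```

Because AUC rewards this aggregate ranking, a method can post an impressive-looking score while still struggling on the specific predictions that matter in practice, which is the kind of blind spot the study argues a different metric is needed to expose.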

The study's findings cast doubt on the widespread use of low-dimensional vector embeddings in machine learning, challenging the notion that these methods are as effective as previously thought. Seshadhri and Menand introduced a new metric, VCMPR, to capture the limitations more comprehensively. Interestingly, when using VCMPR, most leading methods in the field performed poorly. This calls into question the reliability of these algorithms.

Beyond the immediate concern for machine learning accuracy, this research has broader implications for trustworthiness and decision-making in machine learning. Using flawed metrics to assess performance could lead to flawed decision-making in real-world machine-learning applications. Seshadhri asks, "If you have the wrong way of measuring, how can you trust the results?"

While some may argue that these findings are not surprising to those deeply entrenched in the field, the wider community of machine learning researchers needs to take note of this skepticism. The study challenges the dominant philosophy within machine learning, urging researchers to question the validity of metrics and strive for a more comprehensive understanding of their experiments.

In a world where machine learning extends beyond its domain and significantly impacts various fields such as biology, accuracy and trustworthiness are paramount. Biologists utilizing link prediction to identify potential protein interactions in drug discovery, for instance, heavily rely on the expertise of machine learning practitioners to produce reliable tools.

This study, funded by the National Science Foundation and the Army Research Office, serves as a cautionary tale for the machine learning community. It reminds us of the need to approach research with skepticism and constantly question the accuracy of our methodologies. True progress lies in the pursuit of a deeper understanding rather than just chasing higher scores on flawed metrics.

As the field of machine learning continues to evolve, researchers and practitioners must consider diverse perspectives, challenge conventional wisdom, and prioritize the development of more accurate and trustworthy methods. Only then can we fully harness the potential of machine learning while ensuring its reliability and impact on society.

The following information pertains to the drought status of the Amazon River basin from June to November 2023. The classification system used is the U.S. Drought Monitor. According to the analysis conducted by World Weather Attribution and presented by Ben Clarke, a large portion of the eastern half of the basin along with certain areas in the western half are experiencing extreme or exceptional drought conditions. The image used in this report is provided by NOAA Climate.gov.

Climate models inspire hope for our planet's future

According to a recent analysis by the World Weather Attribution project, human-caused global warming played a much larger role than El Niño in intensifying the 2023 Amazon drought. This drought has resulted in many communities being cut off from food supplies, markets for their crops, and health services, causing electricity blackouts and water rationing in some urban areas.

Through observations and supercomputer model simulations, a team of experts found that global warming had doubled the precipitation deficits from El Niño alone. Rising temperatures have amplified water stress, turning the 2023 drought into an "exceptional" one that has become the worst on record. While the research has not yet been peer-reviewed, the team used methods that have previously passed peer-review. Rapid response analyses using these methods have been published in scientific journals, such as their analysis of the 2021 heatwave in the Pacific Northwest and their analysis of record-setting flooding in Louisiana in 2016.

The findings of this analysis underscore the critical importance of addressing the climate crisis to prevent future disasters from happening. This includes curbing deforestation and reforesting cleared and degraded areas in the Amazon to restore the region's moisture-recycling capacity. Reforestation would act as a buffer to global warming until the world can achieve net-zero greenhouse gas emissions.

The supercomputer model simulations have brought to light the impact of global warming and call for immediate action toward mitigating its impact. As the world works to reduce greenhouse gas emissions and stave off further warming, we must take collective action and make the world a better place for future generations. Although the path may seem long, it is possible with the right measures and collective action. The time for action is now!

This image, captured by the Hubble Space Telescope, shows how the powerful gravity of a galaxy embedded in a massive cluster of galaxies forms multiple images of a single distant supernova behind it. The galaxy lies within a large cluster of galaxies called MACS J1149.6+2223, which is more than 5 billion light-years away from Earth. In the zoomed-in view of the galaxy, the multiple copies of an exploding star named Supernova Refsdal are indicated by arrows. This supernova is located 9.3 billion light-years away from Earth. Image credit: NASA, ESA, and S. Rodney (JHU) and the FrontierSN team; T. Treu (UCLA), P. Kelly (UC Berkeley), and the GLASS team; J. Lotz (STScI) and the Frontier Fields team; M. Postman (STScI) and the CLASH team; and Z. Levay (STScI).

Unlock the secrets of the Universe with the help of cutting-edge data mining tools designed for use with the Roman Space Telescope

Researchers delving into one of the biggest enigmas of the universe - the speed at which it is expanding - are preparing to tackle this question in a novel manner through NASA's Nancy Grace Roman Space Telescope.

After the telescope launches by May 2027, astronomers will sift through Roman's vast collection of images in search of gravitationally lensed supernovae. These observations can then be used to calculate the rate at which the universe is expanding.

Astronomers have various methods to determine the current expansion rate of the universe, which is also known as the Hubble constant. However, these different techniques have resulted in varying values, causing what is referred to as the "Hubble tension."

Roman's main focus will be on studying the enigmatic dark energy and its impact on the expansion of the universe. One of the mission's key techniques will involve comparing the inherent brightness of objects such as type Ia supernovae with their observed brightness to calculate their distances. Another approach will use Roman to analyze gravitationally lensed supernovae, which offers a distinct way of determining the Hubble constant based on geometry rather than brightness comparisons alone.
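
For reference, the brightness-comparison route rests on two textbook relations from standard cosmology (nothing specific to Roman is assumed here): the distance modulus converts the gap between observed and intrinsic brightness into a distance, and Hubble's law ties distance to recession velocity at low redshift.

```latex
% Distance modulus: apparent magnitude m minus absolute magnitude M fixes the
% luminosity distance d_L; Hubble's law then relates distance to recession velocity v.
\mu \;=\; m - M \;=\; 5\log_{10}\!\left(\frac{d_L}{10\,\mathrm{pc}}\right),
\qquad
v \;\approx\; H_0\, d \quad \text{(low redshift)}
```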

According to Lou Strolger from the Space Telescope Science Institute (STScI) in Baltimore, who co-leads the team preparing for Roman's study of gravitationally lensed supernovae, "Roman is the perfect tool to advance our understanding of these objects. These supernovae are not only difficult to find, but also rare. We have had to rely on luck in detecting a few of them early enough. However, with Roman's wide field of view and high-resolution imaging capabilities, these chances will greatly improve."

Using advanced tools such as NASA's Hubble Space Telescope and James Webb Space Telescope, scientists have so far identified only eight gravitationally lensed supernovae in the entire universe. Of those eight, only two have been suitable for accurately measuring the Hubble constant, owing to their specific type and the time it takes for their images to reach us. This bending of light by the strong gravitational fields of galaxies or clusters is known as gravitational lensing.

This illustration, created from Hubble Space Telescope pictures of Supernova Refsdal, shows how the gravity of a massive galaxy cluster, known as MACS J1149.6+2223, bends and focuses the light from the supernova behind it, forming multiple images of the exploding star. When the star explodes, its light travels through space and encounters the foreground galaxy cluster. The cluster's gravity bends the light paths and redirects them onto new paths that point toward Earth. Astronomers observe multiple images of the exploding star, each corresponding to one of those altered light paths. Each image takes a different route through the cluster and arrives at a different time. The redirected light then passes through a giant elliptical galaxy within the cluster, which adds another layer of lensing. Illustration credit: NASA, ESA, A. Fields (STScI), and J. DePasquale (STScI). Science credit: NASA, ESA, S. Rodney (JHU) and the FrontierSN team; T. Treu (UCLA), P. Kelly (UC Berkeley), and the GLASS team; J. Lotz (STScI) and the Frontier Fields team; M. Postman (STScI) and the CLASH team; and Z. Levay (STScI).

As the light from the supernova travels along various paths, it creates multiple images of itself in different locations in the sky. Due to differences in these paths, the images may appear delayed by varying amounts of time - anywhere from hours to months, or even years. By precisely measuring these differences in arrival times, we can determine a combination of distances that helps us understand the Hubble constant.
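
In the standard formulation of time-delay cosmography (textbook relations rather than the team's specific analysis), the measured delay between two lensed images is set by a "time-delay distance" that scales inversely with the Hubble constant, which is why timing the images constrains H0.

```latex
% Delay between two lensed images, where \Delta\phi is the difference in Fermat
% potential between their light paths through the lens; all angular-diameter
% distances scale as 1/H_0, so measuring \Delta t constrains the Hubble constant.
\Delta t \;=\; \frac{D_{\Delta t}}{c}\,\Delta\phi,
\qquad
D_{\Delta t} \;\equiv\; (1 + z_{\mathrm{lens}})\,
\frac{D_{\mathrm{lens}}\, D_{\mathrm{source}}}{D_{\mathrm{lens\text{-}source}}}
\;\propto\; \frac{1}{H_0}
```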

"Using this unique method with the same observatory allows us to gain new insights into why different techniques have produced conflicting results," explained Justin Pierel, co-lead on the program alongside Strolger; both are at STScI.

Roman's thorough surveys will accurately map the universe at a much faster rate than Hubble, as the new telescope can capture over 100 times more area in a single image. "Instead of taking multiple pictures of individual trees, this advanced technology allows us to view the entire forest in one snapshot," Pierel explained enthusiastically.

Under the High Latitude Time Domain Survey, astronomers will repeatedly observe the same area of the sky, providing unique opportunities to study objects that change over time. This will result in an immense amount of data – more than 5 billion pixels in each observation – which must be carefully analyzed to identify rare events. Dr. Strolger and Dr. Pierel at STScI are leading a team funded by NASA's ROSES program to develop methods for detecting gravitationally lensed supernovae in data collected by the Nancy Grace Roman Space Telescope.

Pierel explained that the full potential of gravitationally lensed supernovae can only be realized with careful preparation: "We must have all the necessary tools in place beforehand so that we do not squander valuable time sifting through large amounts of data."

A group of researchers from different NASA centers and universities across the nation will work together to complete this project. The preparation process will consist of multiple phases. First, the team will develop data reduction systems specifically for identifying gravitationally lensed supernovae in images captured by Roman. To train these systems effectively, the researchers will also generate simulated images, since only about 10,000 known lenses are currently available for testing while roughly 50,000 are required.
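
In broad strokes, a detection system of this kind is a classifier trained on labeled image cutouts, most of which have to be simulated because real lensed supernovae are so scarce. The sketch below is purely illustrative, using synthetic arrays and a simple scikit-learn classifier as stand-ins; it is not the team's actual Roman pipeline, and the cutout sizes and counts are made up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def simulated_cutout(lensed):
    """Fake 16x16 cutout: 'lensed' examples get an extra bright, point-like blob."""
    img = rng.normal(0.0, 1.0, size=(16, 16))   # background noise only
    if lensed:
        img[4:7, 9:12] += 5.0                    # injected extra image of the source
    return img.ravel()

# Build a balanced, fully synthetic training set of lensed / not-lensed cutouts.
X = np.array([simulated_cutout(lensed=(i % 2 == 0)) for i in range(2000)])
y = np.array([i % 2 == 0 for i in range(2000)], dtype=int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy on simulated cutouts:", clf.score(X_test, y_test))
```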

The data reduction pipelines developed by the team led by Strolger and Pierel will supplement existing pipelines designed to research dark energy using Type Ia supernovae. According to Strolger, "Roman presents a unique opportunity to create a high-quality collection of gravitationally lensed supernovae. All our preparations leading up to this point will provide us with the necessary components to fully utilize the immense potential for cosmological studies."

The management of the Nancy Grace Roman Space Telescope falls under the responsibility of NASA's Goddard Space Flight Center in Greenbelt, Maryland. Other key players involved include NASA's Jet Propulsion Laboratory and Caltech/IPAC in Southern California, as well as the Space Telescope Science Institute in Baltimore. A team of scientists from different research institutions also contributes to the project. The primary companies involved in its development are Ball Aerospace and Technologies Corporation based in Boulder, Colorado; L3Harris Technologies located in Melbourne, Florida; and Teledyne Scientific & Imaging headquartered in Thousand Oaks, California.

Unveiling the wind farm conundrum: Supercomputer simulations cast doubt

Wind farms have been hailed as a promising source of renewable energy. However, new research by the University of British Columbia Okanagan (UBCO) and Delft University of Technology (TU Delft) in the Netherlands has raised concerns about their effectiveness. The researchers used supercomputer simulations to study the impact of wind farms on air patterns. Their findings have implications for wind farm productivity and the environment.

The researchers developed a modeling framework called the Toolbox for Stratified Convective Atmospheres (TOSCA) to study how wind farms affect the movement of air. They aimed to improve wind energy forecasts and increase productivity. However, when they examined how large wind farms impact natural wind patterns, they found that the results were not as positive as expected.

Dr. Joshua Brinkerhoff, an Associate Professor in UBCO's School of Engineering, explains that wind farms can alter the structure of incoming wind. This structure, known as the atmospheric boundary layer, determines how the wind's speed, temperature, and pressure vary with altitude. The researchers argue that wind farms' alteration of this layer has significant implications for their power output.
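
One reason changes to the incoming flow matter so much is that a turbine's power scales with the cube of the wind speed reaching its rotor, so even a modest slowdown of the boundary-layer flow translates into a disproportionately large loss of power. The numbers below are generic placeholders used only to illustrate that textbook scaling, not results from the UBCO/TU Delft simulations.

```python
import math

# Illustrative actuator-disk estimate: P = 0.5 * rho * A * Cp * v**3.
RHO = 1.225             # air density, kg/m^3
ROTOR_DIAMETER = 120.0  # rotor diameter in metres (hypothetical turbine)
CP = 0.45               # power coefficient (the Betz limit is about 0.593)
AREA = math.pi * (ROTOR_DIAMETER / 2) ** 2

def power_mw(wind_speed):
    """Mechanical power extracted from the wind, in megawatts."""
    return 0.5 * RHO * AREA * CP * wind_speed ** 3 / 1e6

for v in (10.0, 9.0):   # a 10% drop in incoming wind speed...
    print(f"{v:.1f} m/s -> {power_mw(v):.2f} MW")
# ...removes roughly 27% of the power, since 0.9 ** 3 is about 0.73.
```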

Dr. Brinkerhoff emphasizes the importance of proper wind farm design. Poorly designed wind farms can generate less power than expected, making them economically unviable. While software assists in the placement of turbines to maximize output, the researchers argue that their modeling framework is a valuable tool for engineers to design more effective wind farms.

However, skeptics argue that computer modeling may not accurately capture the complex interactions between wind farms and the environment. The lack of precision in estimating power production has significant financial repercussions for wind farm operators: overestimating energy output, a common issue that current models do not adequately capture, can be financially disastrous.

The research team acknowledges that their modeling framework, TOSCA, can help forecast the efficiency of wind farms during their establishment. Yet, critics contend that relying solely on simulated data to determine power estimates may not provide an accurate representation of real-world conditions. Skepticism remains regarding the translation of simulation results into practical outcomes.

Although supercomputer simulations represent an advancement in our understanding of wind farm dynamics, it is crucial to consider diverse perspectives on their effectiveness. The interaction between wind farms and the atmosphere is a complex phenomenon that requires a multidisciplinary approach, combining computational models with empirical studies and real-world data.

This research, which aimed to address the challenges facing wind energy, was supported by Mitacs Globalink, UL Renewables, and the Natural Sciences and Engineering Research Council of Canada. Computational resources were provided by the Digital Research Alliance of Canada and Advanced Research Computing at the University of British Columbia.

As the debate surrounding wind farms and their impact on the environment and energy production continues, it is clear that further research and a holistic understanding of these complex systems are required. Only through careful consideration of the limitations and uncertainties of supercomputer simulations can we arrive at truly sustainable solutions for our energy needs.