AI reveals hidden patterns of Yellowstone’s supervolcano

Beneath the stunning geysers and expansive landscapes of Yellowstone lies a hidden world of seismic activity, now revealed through advanced machine learning techniques.

A groundbreaking study led by Professor Bing Li at Western University in Canada, in collaboration with the Universidad Industrial de Santander in Colombia and the U.S. Geological Survey, has applied advanced machine learning to 15 years of seismic data from the Yellowstone caldera (2008–2022). The outcome? A seismic catalog of 86,276 earthquakes, nearly ten times more than previously recorded.

Machine Learning: The Seismic Detective 🔍

Historically, detecting earthquakes was a labor-intensive process of manual review, with researchers spending hours sifting through waveform data to identify seismic events. In this study, AI-powered algorithms scanned the entire dataset, automatically identifying previously overlooked small earthquakes and determining their magnitudes.
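
The article does not detail the detection pipeline, but the general shape of automated, window-by-window scanning can be sketched briefly. In the hypothetical Python sketch below, `model` stands in for a trained event classifier (the study's deep-learning picker is not shown); the windowing, normalization, and thresholding are generic.

```python
# Hypothetical sketch of automated event scanning over continuous waveform
# data; `model` is a stand-in for a trained classifier, not the study's code.
import numpy as np

def detect_events(waveform, sample_rate, model, win_sec=4.0, step_sec=1.0, threshold=0.9):
    """Slide a window across a 1-D waveform and return onset times (seconds)
    of windows the classifier scores above `threshold`."""
    win = int(win_sec * sample_rate)
    step = int(step_sec * sample_rate)
    onsets = []
    for start in range(0, len(waveform) - win, step):
        segment = waveform[start:start + win]
        # Normalize so the classifier sees waveform shape rather than raw amplitude.
        segment = (segment - segment.mean()) / (segment.std() + 1e-12)
        score = model.predict_proba(segment[np.newaxis, :])[0, 1]  # P(event)
        if score >= threshold:
            onsets.append(start / sample_rate)
    return onsets
```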

Professor Li explains, “If we had to rely on traditional methods, where someone manually clicks through all this data, it’s not scalable.” Machine learning has not only accelerated detection but has also fundamentally transformed our understanding of the seismic patterns beneath Yellowstone.

Revealing Earthquake Swarms and Young Faults

More than half of the detected events were part of “earthquake swarms,” bursts of closely spaced small tremors. These swarms trace fractal-like fault structures: rough, immature fractures beneath the caldera. By mapping these features, scientists are gaining insights into how subsurface fluids trigger cascades of tremors.

This detailed seismic view enables researchers to apply robust statistical methods to analyze swarm dynamics and the interactions between fluids and faults in unprecedented detail.
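
The article does not say which statistics were used, but one standard example of what a ten-fold larger catalog sharpens is the maximum-likelihood estimate of the Gutenberg-Richter b-value (Aki, 1965). The Python sketch below uses synthetic magnitudes, not Yellowstone data.

```python
# Illustrative only: maximum-likelihood b-value for the Gutenberg-Richter
# relation, a routine catalog statistic; synthetic magnitudes, not real data.
import numpy as np

def b_value(magnitudes, completeness_mag):
    """Aki (1965) estimator: b = log10(e) / (mean(M) - Mc) for M >= Mc."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness_mag]
    if m.size == 0:
        raise ValueError("no events above the completeness magnitude")
    return np.log10(np.e) / (m.mean() - completeness_mag)

# Synthetic catalog drawn from a b = 1 distribution above magnitude 1.0;
# a larger catalog gives a tighter estimate.
rng = np.random.default_rng(0)
mags = 1.0 + rng.exponential(scale=1 / np.log(10), size=10_000)
print(round(b_value(mags, completeness_mag=1.0), 2))  # close to 1.0
```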

As Li noted, these methods are not limited to Yellowstone; they have the potential to revolutionize monitoring at volcanoes worldwide.

In summary, machine learning is transforming Yellowstone from a breathtaking surface spectacle into a finely tuned seismic symphony. With AI guiding the way, scientists are now better equipped than ever to understand the hidden rhythms of our planet’s most famous supervolcano, turning once-silent tremors into enlightening discoveries.

Woolpert acquires Dawood Engineering, enhancing infrastructure, geospatial capabilities

Woolpert has made a significant move to expand its global presence and enhance its engineering and geospatial capabilities by acquiring Dawood Engineering Inc., a respected infrastructure and technology firm based in Pennsylvania. This acquisition brings over 150 engineers, surveyors, and geospatial professionals into Woolpert, bolstering its expertise in the transportation, utilities, and energy sectors across North America, Europe, and the Middle East.

Founded in 1992 by civil engineer Bony Dawood, Dawood Engineering has earned recognition for its multidisciplinary work in infrastructure development, particularly in transportation, utilities, and advanced geospatial solutions. Headquartered in Harrisburg, the firm has served as a lead consultant on major projects for state transportation departments and municipal agencies, including PennDOT, the Pennsylvania Turnpike Commission, and the cities of Boston and Philadelphia.

“We are incredibly proud to welcome the Dawood team to Woolpert,” said Woolpert President and CEO Neil Churman. “Their innovative and entrepreneurial approach aligns perfectly with our mission. This acquisition also strengthens our presence in Pennsylvania, a state at the forefront of AI, data center growth, and critical infrastructure development.”

Dawood Engineering has extensive experience in the energy and utilities sector, with expertise in oil and natural gas, electric utilities, pipeline design, and alternative energy sources. These capabilities enhance Woolpert’s existing portfolio and reinforce its position as a leader in integrated architecture, engineering, and geospatial (AEG) solutions.

“This next chapter, as part of the world’s leading AEG firm, creates an environment where all of our professionals can thrive,” said Dawood CEO Bony Dawood. “Together, we are well-positioned to help our clients shape the future of both digital and physical infrastructure.”

The acquisition also strengthens Woolpert with cutting-edge geospatial technologies, including Dawood’s work in 3D laser scanning, GIS, building information modeling, and its pioneering Twin Track mobile application for building management. Notable projects by Dawood include the $10 million Riverlands Safety Improvements Project and the digitization of Poland’s historic Royal Łazienki Museum.

Woolpert Infrastructure Sector Leader Bryan Dickerson emphasized the strategic value of the acquisition: “Dawood brings a depth of technical expertise that complements and strengthens our team. This partnership is founded on shared values and a common vision for innovation and excellence. For our clients and staff, it’s a transformative step forward.”

With the integration of Dawood’s team and services, Woolpert continues to build on its legacy as a global leader in infrastructure and geospatial services, now with an even stronger foundation rooted in Pennsylvania.

Birmingham modeling illuminates giants of the cosmos

How Cutting-Edge Simulations Helped Decode the Universe’s Heaviest Black Holes 🌌

In a landmark scientific achievement, astrophysicists at the University of Birmingham in the UK have played a pivotal role in unraveling the most massive black hole merger ever detected. Weighing in at an astonishing 240 solar masses, the binary system observed on November 23, 2023, defied expectations and set a new record in gravitational-wave astronomy.

Behind every gravitational-wave signal lies an intricate dance of colossal forces. To decode this phenomenon, Birmingham researchers relied on supercomputer-powered modeling that advanced the field of computational astrophysics.

Precision Modeling: Turning Whispering Waves into Cosmic Stories

When the gravitational-wave signal arrived, raw data alone could not reveal its whole story. Enter the unsung heroes: weeks-long supercomputer simulations that captured every detail of two black holes spinning at near-light speeds, tracing their spiraling embrace through the fabric of space-time.

These intensive simulations served a dual purpose: to generate theoretical templates of how black hole mergers ought to appear and to compare them with real signals to confirm the identity of the binary system. This intricate detective work unveiled the mass, spin, and orbital characteristics of these cosmic giants.
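
As a rough illustration of that template-versus-signal comparison (and not the collaboration's actual Bayesian parameter-estimation pipeline, which works with noise-weighted, frequency-domain inner products), a minimal time-domain matched-filter sketch in Python might look like this:

```python
# Conceptual sketch only: slide a simulated waveform template across noisy
# "detector" data and record the best normalized correlation and its offset.
import numpy as np

def match_template(data, template):
    """Return (peak normalized cross-correlation, sample offset)."""
    t = (template - template.mean()) / (np.linalg.norm(template) + 1e-12)
    best_score, best_lag = -np.inf, 0
    for lag in range(len(data) - len(t) + 1):
        seg = data[lag:lag + len(t)]
        seg = (seg - seg.mean()) / (np.linalg.norm(seg) + 1e-12)
        score = float(np.dot(seg, t))
        if score > best_score:
            best_score, best_lag = score, lag
    return best_score, best_lag

# Toy usage: a chirp-like template buried in Gaussian noise.
fs = 4096
ts = np.arange(fs) / fs
template = np.sin(2 * np.pi * (30 * ts + 80 * ts**2)) * np.exp(-4 * (1.0 - ts))
rng = np.random.default_rng(1)
data = rng.normal(size=3 * fs)
data[fs:2 * fs] += 2.0 * template          # inject the signal at a known offset
score, lag = match_template(data, template)
print(score, lag)                           # lag should land near fs
```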

Birmingham’s Role: Expertise Meets Computational Power

A team of brilliant minds, including Dr. Amit Singh Ubhi, Dr. Debnandini Mukherjee, Dr. Panagiota Kolitsidou, and others, translated signature wave patterns into astrophysical revelations. Dr. Gregorio Carullo emphasized the importance of these models in uncovering layers of complexity, noting that it will take years for the community to unravel this intricate signal pattern fully.

By combining advanced numerical relativity, machine learning, and extensive computational time, Birmingham scientists confirmed a collision that challenges existing models of stellar physics and opens new avenues for understanding how black holes grow—and potentially collide again to form even larger entities.

Why It Matters: Simulating the Universe’s Most Violent Collisions

- Shattering Cosmic Records: The observed binary system outweighs the previous heavyweight by nearly 100 solar masses, prompting scientists to reconsider how such massive black holes form.
- Testing Einstein’s Legacy: Only through high-resolution simulations can researchers effectively probe general relativity under such extreme conditions.
- Fueling the Next Wave of Discoveries: Birmingham’s modeling framework will support future searches for intermediate-mass black holes, enigmatic objects that lie between stellar and supermassive scales.

Looking Ahead: Modeling the Future of Gravitational-Wave Astronomy

The supercomputer workflows developed around this groundbreaking observation are not just a one-time achievement; they represent a blueprint for future cosmic explorations. As gravitational-wave detectors evolve and become more sensitive, the modeling capabilities must also advance to interpret these signals. Birmingham’s team is at the forefront of this progression, combining computational strength with scientific insight.

🏅 Inspiring the Next Generation

What began as faint ripples arriving from deep space has become, through skill and computational power, a vivid chapter in the history of the cosmos. The supercomputer modeling done in Birmingham does not just process numbers; it brings them to life, showcasing humanity’s ability to simulate, understand, and appreciate the universe’s most dramatic events.

In doing so, these scientists remind us that we are not mere observers of the cosmos; we are its narrators, equipped with technology, intellect, and unwavering determination to tell its grandest stories.

Diamonds are hijacked: AI-powered simulations reveal surprising twist in crystal formation

In a stunning revelation about one of Earth's most iconic natural transformations, researchers at UC Davis have discovered that diamonds may owe their crystalline beauty to an unexpected detour involving graphite. This intriguing finding comes from cutting-edge molecular simulations powered by machine learning.

For decades, scientists have understood the basics: carbon atoms under immense pressure and heat eventually crystallize into diamonds. However, a new AI-assisted perspective has made this transformation story much more interesting.

Using advanced molecular dynamics simulations, the UC Davis team trained machine learning algorithms to model the atomic rearrangement that carbon undergoes deep within the Earth. Their results overturned previous assumptions: instead of carbon atoms seamlessly aligning into diamond form, they first transition into a more chaotic, graphite-like state. In other words, graphite — the same soft material found in pencils — serves as an unexpected intermediary in the creation of diamonds.

The simulations, which demanded extraordinary precision and computational power, revealed that this graphite-like layer "hijacks" the usual path to diamond formation. It creates a kind of atomic jam session that may appear messy on the surface but ultimately lays the foundation for the perfect diamond lattice.

"Without machine learning, we’d never have caught this," said UC Davis physicist and study co-author Subramanian Sankaranarayanan. "The simulations require immense computational complexity — we’re tracking the quantum behavior of thousands of atoms over time."

Traditional physics-based models would have taken years to run, but the team's AI-driven approach dramatically reduced that timeline. Their neural networks were trained on quantum-level data, enabling them to predict how atoms interact, bond, and break apart — all at unprecedented speeds and scales.
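
As a toy stand-in for that workflow (not the study's model), the Python sketch below fits a small neural network to energies from a cheap analytic pair potential playing the role of quantum reference data; the real work uses far richer descriptors, forces, and training sets.

```python
# Toy surrogate potential: learn energies from a cheap analytic reference
# standing in for quantum-mechanical training data. Illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor

def reference_energy(r):
    """Lennard-Jones pair energy, used here as mock 'quantum' training data."""
    return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

rng = np.random.default_rng(0)
r_train = rng.uniform(0.95, 3.0, size=5000)

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
surrogate.fit(r_train.reshape(-1, 1), reference_energy(r_train))

# Once trained, the surrogate is far cheaper to evaluate than the reference
# method, which is what makes long simulations of many atoms feasible.
r_test = np.array([[1.0], [1.12], [1.5], [2.5]])
print(surrogate.predict(r_test))          # should roughly track the line below
print(reference_energy(r_test.ravel()))
```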

This discovery isn't just a scientific curiosity; it could lead to advancements in synthetic diamond technologies, providing cleaner, faster, and potentially cheaper methods for producing gem-quality diamonds or materials for advanced electronics.

As for the diamonds themselves? They may still be everlasting, but we now know that their journey includes a detour through pencil lead. Science is often full of surprises, and with the help of machine learning, we now understand that diamonds are born not only from pressure but also from a touch of chaos.

Supercomputer models may help prevent the next catastrophe, an expert says: AI simulations aim to improve flood warnings as the Central Texas tragedy deepens

The death toll from a devastating flash flood in Central Texas rose above 100 as of Monday evening, with officials reporting at least 104 confirmed fatalities and several dozen people still unaccounted for, including 11 from a single summer camp where 27 campers and staff are known to have died. Among the missing are children from Camp Mystic in Kerr County, where heavy rain and flash flooding washed away cabins and swept young lives into a raging river that surged 26 feet in just 45 minutes.

As grief-stricken communities search for answers and survivors, scientists at Penn State University are warning that without faster, more accurate flood forecasting systems, tragedies like these may repeat. In a breakthrough announced just days ago, a team led by Penn State civil and environmental engineers unveiled an AI-powered supercomputer model that significantly improves predictions of flood severity, location, and timing across the continental United States. The system, which its creators describe as a high-resolution differentiable hydrologic and routing model, combines decades of river-gauge data, basin parameters, and weather observations with neural networks guided by physical hydrology. Traditional models, such as NOAA's National Water Model (NWM), require tedious calibration at each site, a process that is slow and inefficient across thousands of river basins.

In contrast, the Penn State team's approach trains once on 15 years of streamflow data from 2,800 USGS stations, then deploys its learned network broadly, yielding 30 percent greater accuracy in streamflow forecasts across approximately 4,000 gauge stations, including those outside the training set. The model is exceptionally skilled at handling extreme rainfall events, avoiding the underestimation that pure machine learning models risk when encountering rare outliers.
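
The paper's equations and metrics are not given in this article, so the Python sketch below is only a hypothetical illustration of the general idea: a toy "bucket" water-balance model with one learnable recession parameter, scored with the Nash-Sutcliffe efficiency commonly used to grade streamflow forecasts (whether the study used this metric is not stated).

```python
# Hypothetical sketch of a physics-guided streamflow setup: a linear-reservoir
# bucket supplies the structure, a single parameter plays the role of the
# quantity a neural network would learn, and NSE grades the simulated flow.
import numpy as np

def bucket_model(precip, evap, k_recession, storage0=0.0):
    """Daily water balance: storage gains rain minus evaporation and drains
    in proportion to the recession parameter; returns simulated flow."""
    storage, flows = storage0, []
    for p, e in zip(precip, evap):
        storage = max(storage + p - e, 0.0)
        q = k_recession * storage
        storage -= q
        flows.append(q)
    return np.array(flows)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1.0 is perfect, 0.0 is no better than
    predicting the mean of the observations."""
    obs = np.asarray(obs, dtype=float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic year of forcing data, with noisy "observations" for the check.
rng = np.random.default_rng(0)
precip = rng.gamma(shape=0.8, scale=5.0, size=365)
evap = np.full(365, 1.5)
obs = bucket_model(precip, evap, k_recession=0.3) + rng.normal(0.0, 0.2, 365)
print(round(nse(bucket_model(precip, evap, k_recession=0.3), obs), 2))
```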

The payoff is dramatic: simulating 40 years of high-resolution flow data, a task that once required weeks and multiple supercomputers, can now be completed in hours on a single system, potentially giving emergency managers crucial lead time before a flash flood strikes.

Hurdles remain: integrating neural networks into operational systems, such as NOAA's NWM, demands independent validation and confidence in the AI's decision logic. Yet the researchers emphasize that their "physics-informed" hybrid design offers both superior speed and interpretability, a rare combination in flood forecasting technology.

A Nation Stunned by Swift Destruction

On the morning of July 4, Central Texas was struck by one of the deadliest floods in the state's history. Torrential storms dropped more than a foot of rain in less than 12 hours, saturating the western Guadalupe River basin. Overnight, the river rose at an alarming speed, sweeping away homes, cabins, vehicles, and lives in its path, particularly at Camp Mystic near Hunt, Texas.

Search and rescue teams deployed helicopters, boats, and drones in a desperate effort to find survivors, but time passed painfully as the death toll climbed past 100. Officials warned that the chance of finding more survivors was quickly fading. Grief and anger spread among families demanding better early warning systems—systems that might have prevented people from being in harm's way altogether.

Meeting the Moment with Supercomputing Power

The Penn State modeling initiative, supported by its Institute for Computational and Data Sciences (ICDS) and backed by leading universities and agencies (including NOAA and the Department of Energy), showcases how cutting‑edge supercomputing can accelerate flood risk understanding and preparedness across broad regions.

Chaopeng Shen and Yalan Song, the Penn State researchers co‑leading the effort, emphasize that beyond flood forecasting, their tool can help predict drought, soil moisture, groundwater recharge, and other hydrologic metrics vital for water resource management and agricultural resilience. Their model's ability to generalize across geographic regions makes it a promising candidate for integration into next-generation iterations of the National Water Model, potentially enhancing lead time and clarity in emergency alerts.

From Tragedy to Transformation

Central Texas mourns deeply as communities grapple with colossal loss: Camp Mystic staff and campers alone accounted for 27 deaths, with 11 individuals still missing as of late Monday. Local families, responders, and officials are buckling under the emotional and operational strain of a disaster that moved too fast for conventional warning systems.

The Penn State model offers a glimmer of hope: a future where supercomputers and AI combine to give people time to evacuate or prepare, not just minutes but possibly hours or days of advance warning before floodwaters rise.

As disaster response continues in Texas, this dual narrative—of human tragedy and scientific promise—should prompt policymakers, funders, and technologists to ask: How can we accelerate the deployment of tools that could help prevent another flood from unfolding at such devastating speed?

The Road Ahead

The Penn State team is already in conversation with NOAA and other stakeholders to explore pilot deployments. However, widespread adoption will depend on validating performance in diverse geographies and demonstrating reliability under stress. The urgency to act has never been more apparent. As flood fatalities climb and the nation watches, harnessing the power of AI and supercomputing to predict and mitigate disaster is no longer hypothetical; it is imperative.