Supercomputers unlock the chemistry of gecko binding: Vienna team breaks new ground in modeling large molecules

Scientists at the Vienna University of Technology (TU Wien) have developed a high-precision computational approach to enhance the understanding of how large molecules interact, specifically through the weak but pervasive van der Waals forces that enable geckos to adhere to surfaces. This breakthrough is anticipated to drive advancements in materials science, pharmaceuticals, and energy storage by providing greater reliability in predicting molecular behavior.

A puzzle solved

For many years, researchers in quantum chemistry have relied on two prominent computational methods: the "gold standard" coupled-cluster theory, specifically CCSD(T), and the stochastic diffusion quantum Monte Carlo (DMC) method. While both methods have provided near-benchmark accuracy for small molecules, discrepancies in predicted interaction energies emerged when applied to large, highly polarizable molecular systems.
 
The TU Wien team, led by Prof. Andreas Grüneis, along with Tobias Schäfer, Andreas Irmler, and Alejandro Gallo, investigated this divergence. They identified that CCSD(T) systematically overestimated binding energies in large molecular complexes, predicting stronger molecular interactions than were actually present.
 
Their new computational variant, designated CCSD(cT), incorporates selected higher-order corrections to the treatment of triple particle-hole excitations, which are significant for large, polarizable systems. This refinement effectively mitigates over-binding and aligns the computed values with the DMC results. The authors demonstrate in their study that CCSD(cT) achieves "chemical accuracy" (within approximately 1 kcal/mol) even for complexes comprising over 100 atoms.
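"Chemical accuracy" is conventionally quoted in kcal/mol, while quantum-chemistry codes typically report total energies in hartree. A minimal bookkeeping sketch of the supermolecular interaction energy and the 1 kcal/mol criterion, using made-up energy values rather than anything from the paper, looks like this:

```python
HARTREE_TO_KCAL = 627.509  # 1 hartree expressed in kcal/mol

def interaction_energy(e_dimer, e_mono_a, e_mono_b):
    """Supermolecular interaction energy E_int = E_AB - E_A - E_B (hartree)."""
    return e_dimer - e_mono_a - e_mono_b

def within_chemical_accuracy(e_int, e_ref, tol_kcal=1.0):
    """True if two interaction energies agree to ~1 kcal/mol."""
    return abs(e_int - e_ref) * HARTREE_TO_KCAL < tol_kcal

# Hypothetical numbers for illustration only (not from the study):
e_int_ccsd_ct = interaction_energy(-600.4210, -300.2050, -300.2050)  # -0.0110 Ha
e_int_dmc = -0.0105  # hartree, pretend DMC reference
print(within_chemical_accuracy(e_int_ccsd_ct, e_int_dmc))  # True: diff ≈ 0.31 kcal/mol
```

The 1 kcal/mol threshold is the conventional definition of chemical accuracy that the article cites.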

The super-computational method: what makes it special

The key to the breakthrough isn’t simply more powerful hardware, but a clever adaptation of computational techniques and basis sets that fully exploits today’s supercomputing infrastructure. The authors report three major enablers:
  1. Massive parallelization – The workflow was implemented on high-performance computing (HPC) clusters using up to 50 compute nodes (each with 128 cores) for their largest tasks. The ability to distribute the workload allowed the team to avoid many of the local‐correlation approximations that earlier coupled-cluster calculations used to save time but at the cost of accuracy.
  2. Plane-wave basis sets – Instead of the conventional Gaussian-type atom-centered orbitals, the team employed a plane-wave basis set (commonly used in solid-state physics) for large molecular complexes, along with natural‐orbital truncation and singular‐value decomposed Coulomb integral factorization. These choices allowed unbiased and systematically improvable estimates for the interaction energies and reduced basis‐set error.
  3. Refined triple-excitation correction (cT) – The heart of the improvement is a correction to CCSD(T)’s perturbative (T) approximation. In short, (T) neglects certain diagrams, specifically commutator terms of the form $[[\hat{V}, \hat{T}_2], \hat{T}_2]$, which are small for small, weakly polarizable molecules but become significant when molecules are large and highly polarizable. By including these terms, CCSD(cT) corrects the systematic over-binding of CCSD(T).
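The singular-value decomposition mentioned in point 2 is a generic low-rank compression technique. A toy numpy sketch, using a random nearly-low-rank matrix as a stand-in for a Coulomb integral matrix (this is not the actual machinery used in the study), illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a Coulomb integral matrix: low-rank structure plus small noise.
low_rank = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 200))
v = low_rank + 1e-6 * rng.standard_normal((200, 200))

# Factorize and keep only the dominant singular vectors.
u, s, vt = np.linalg.svd(v, full_matrices=False)
k = int(np.sum(s > 1e-3 * s[0]))          # effective rank after truncation
v_approx = u[:, :k] * s[:k] @ vt[:k, :]   # compressed representation

rel_err = np.linalg.norm(v - v_approx) / np.linalg.norm(v)
print(k, rel_err)  # small rank, tiny relative error
```

Keeping only the dominant singular vectors shrinks the storage and work per integral dramatically while introducing a controllable, systematically improvable error, which is the spirit of the factorization the authors describe.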
The method combines computational power with refined theory, effectively merging supercomputing and quantum chemistry. This "super-computational method" enables the reliable analysis of molecular systems that were previously too complex for theoretical models.

Why this matters: optimistic outlook

The implications are far-reaching:
  • Materials science & energy: Many next-generation materials (hydrogen storage media, novel catalysts, 2D materials, surfaces) rely on noncovalent interactions between large molecular or extended systems. Having accurate benchmark interaction energies means better design of materials from first principles. The TU Wien team notes the importance for predicting hydrogen binding energies, drug crystallization, and more.
  • Pharmaceuticals & biomolecules: Large molecules with many atoms—think proteins, drug–target systems, and crystals—are now becoming accessible to reliable computational modeling. That means faster, smarter virtual screening, better understanding of how drugs bind, how crystals form, and more.
  • AI and machine learning models: Accurate benchmark data is the lifeblood of machine learning in chemical and materials modelling. The new method generates high-quality reference data for large molecules, which can then train ML models for faster predictions down the line. (“Our results show that even well-established methods must be continuously re-examined to keep pace…,” notes the TU Wien release.)
  • Science advancing: Perhaps most exciting is the idea that this demonstrates a new frontier: we are expanding the domain of accuracy in many-electron theory to ever larger systems. As the authors put it, “we are witnessing an unremitting expansion of the frontiers of accurate electronic structure theories to ever larger systems … which … has the potential to transform the paradigm of modern computational materials science.”
In short, the method opens doors. With ever-growing computational power and clever theoretical innovation, the old boundary of “accurate only for small molecules” is being lifted. That means more realistic modelling of real‐world systems, faster innovation in materials and biotech, and a hopeful horizon for computational science.

Looking ahead

Of course, challenges remain. The computations reported still required significant supercomputing resources (e.g., ~100k CPU hours for the benchmark coronene dimer), and the authors note that full canonical CCSD(cT) for still larger systems is not yet feasible—they use a fitted approximation (CCSD(cT)-fit) for the largest complexes they studied.
 
But the path forward is clear: local correlation approaches and low-scaling methods can inherit the improvements of CCSD(cT), bringing accuracy to more systems at lower cost. As the paper states, “The more accurate CCSD(cT) approximation can directly be transferred to computationally efficient low-scaling and local correlation approaches, which will substantially advance…”
 
On an optimistic note, the “gold standard” itself has been improved. The TU Wien team shows that even widely trusted methods must evolve—and by making that evolution, they are advancing the entire field. As we explore ever more complex molecular systems, from new energy materials to advanced drugs, having reliable computational methods is not just helpful; it is essential. With this breakthrough, the future of computational chemistry and materials science looks brighter than ever.
A new theoretical study led by University of Delaware engineers reveals that magnons, a type of magnetic spin wave, can produce detectable electric signals. Pictured, Matt Doty, professor in the Department of Materials Science and Engineering, and postdoctoral researcher D. Quang To discuss their findings.

Harnessing magnetism for faster computing

Envision a future where data transmission within computers occurs not only through electrons traversing wires, but also through waves that shimmer through the magnetic properties of materials. These waves carry information with far less wasted energy and heat, and with greater speed potential. This intriguing prospect stems from the University of Delaware (UD) labs, where engineers have developed a novel method to detect and utilize magnetic waves for the next generation of high-speed computing.
 
Contemporary supercomputers, whose extensive infrastructure processes climate models, genomic data, AI algorithms, and cryptographic tasks, are constrained by a prevalent bottleneck: the movement of electrons through wires, which generates resistance, heat, and ultimately, physical limitations. As the UD researchers explain, a significant portion of this delay arises from the constant hand-off between electric and magnetic subsystems: data is stored magnetically but conveyed electrically, a back-and-forth process.
 
A recent theoretical study demonstrates that magnetic waves, specifically magnons, which are collective oscillations of electron spin, can generate measurable electrical signals in antiferromagnetic materials. The key finding is that in these materials, the electron spins alternate direction (resulting in zero total magnetization); however, the wave-like fluctuations or wobbling of these spins can induce electric polarization. In essence, altering the magnetic properties results in an electrical response.
 
The significance of this research for supercomputing lies in the pursuit of ultra-fast, energy-efficient computing, exemplified by supercomputers and future quantum-hybrid systems, where the ability to transfer and process information with minimal heat generation and maximal speed is paramount. The UD findings present three key advantages:
  • Reduced energy waste – transmitting spin orientation via magnons avoids the resistance and heat losses inherent in conventional wiring.
  • Ultra-fast propagation – magnons in antiferromagnetic materials reach terahertz frequencies, significantly faster than in ferromagnets, providing substantial speed enhancements within processors and between components.
  • Direct magneto-electric coupling – a magnon’s orbital angular momentum interacts with the material’s atoms, inducing electric polarization; this enables control of magnetic waves through electric or optical fields, creating faster, reconfigurable logic channels based on spin waves.
In essence, electron-based wired systems could be replaced with “spin waves” transmitted via magnetic channels, resulting in faster, cooler, and more compact designs. For supercomputing, this could lead to denser rack configurations, increased computational capacity per watt, and novel architectures that integrate logic and memory more seamlessly.
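The terahertz claim for antiferromagnets can be sanity-checked with the standard textbook antiferromagnetic-resonance estimate, ω ≈ γ√(H_A(2H_E + H_A)), versus ω ≈ γH_A for a simple ferromagnet. The field values below are illustrative textbook magnitudes, not numbers from the UD study:

```python
import math

GAMMA_GHZ_PER_T = 28.0  # electron gyromagnetic ratio, roughly 28 GHz per tesla

def afm_resonance_ghz(h_exchange_t, h_anisotropy_t):
    """Textbook antiferromagnetic resonance frequency in GHz."""
    return GAMMA_GHZ_PER_T * math.sqrt(h_anisotropy_t * (2 * h_exchange_t + h_anisotropy_t))

def fm_resonance_ghz(h_anisotropy_t):
    """Ferromagnetic resonance from the anisotropy field alone, in GHz."""
    return GAMMA_GHZ_PER_T * h_anisotropy_t

# Illustrative fields: strong exchange (~1000 T effective), modest anisotropy (~1 T).
f_afm = afm_resonance_ghz(1000.0, 1.0)  # ~1250 GHz, i.e. into the terahertz range
f_fm = fm_resonance_ghz(1.0)            # 28 GHz
print(f_afm / f_fm)                     # exchange enhancement, roughly sqrt(2*H_E/H_A)
```

The huge effective exchange field of an antiferromagnet is exactly why its magnons sit orders of magnitude above ferromagnetic (gigahertz) frequencies.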
 
The study utilized computer simulations, led by Matthew Doty from the University of Delaware's Materials Science & Engineering Department, to investigate magnon behavior in antiferromagnets under a temperature gradient. The research examined how the orbital angular momentum (a circular spin-wave motion) of magnons interacts with the atomic structure, generating electric polarization.
 
The model demonstrates that when a temperature difference exists across the material, causing magnons to flow, the orbital angular momentum of these magnons interacts with the material's atoms, producing a measurable voltage. This voltage represents the electrical signal generated by pure spin-wave propagation. Future research will focus on experimental validation of the simulations and exploration of the potential for light or electric fields to control magnon transport. This work is also being integrated within the Center for Hybrid, Active and Responsive Materials (CHARM) at UD, with the aim of developing hybrid quantum materials for terahertz applications.

Looking Ahead: Implications for Supercomputers

While currently in the theoretical and simulation stages, this research presents intriguing questions regarding the potential evolution of supercomputers:
  • Could future computational nodes transmit information via magnon waveguides, instead of copper or optical wires? This could lead to reduced cooling requirements and simplified wiring.
  • Could logic and memory become more intimately integrated, with magnetic channels performing computation and data storage simultaneously?
  • Might this facilitate terahertz-clocked compute fabrics, where internal signaling occurs at orders of magnitude greater speeds than current gigahertz semiconductor circuits?
  • How will manufacturing challenges be addressed, such as creating antiferromagnetic materials, integrating spin-wave channels with conventional electronics, and scaling to millions of such channels?
 
For the supercomputing field, where every fraction of a second and every watt of power is critical, this research is akin to discovering a new data highway, one that could bypass current congested routes. This does not imply that the current "silicon-electron wire" paradigm will disappear overnight, but it does suggest that a paradigm shift may be forthcoming.

Final Thoughts

There is a compelling metaphor in the research: that a magnon is "just like that: a wave" traveling down a slinky of spins. It is both playful and imaginative, yet rooted in rigorous simulation and physics. In high-end computing, where imagination often precedes engineering, the question now is: how rapidly can this playfulness be translated into prototypes, chips, and novel architectures?
 
If engineers successfully transform magnons into usable signal carriers within supercomputers, we may soon discuss "spin-wave supercomputing" with the same level of confidence as we currently use the term "silicon chip." The bottleneck between magnetic storage and electrical processing may finally begin to diminish.
 
This research warrants attention; it is both intriguing and innovative, and it may revolutionize the way we compute.

Climate whiplash approaches: A warning from supercomputer simulations

Recent research conducted by the Institute for Basic Science (IBS), a Korean government-funded research institute, and its collaborators has raised significant concerns regarding the behavior of the global climate system in the forthcoming decades. The study indicates that the fluctuations within the recurring cycles of the El Niño–Southern Oscillation (ENSO) may intensify in amplitude and become more regular and interconnected with other significant climate patterns. The implications of these high-resolution supercomputer simulation outcomes are noteworthy, suggesting a transition from irregular, loosely linked climate oscillations to a more synchronized and amplified system. This shift represents not merely an increase in extreme weather events but a fundamental alteration of the climate patterns to which humanity has become accustomed.

What the simulations show

The research team employed a state-of-the-art climate model (AWI‑CM3) with a horizontal resolution of approximately 31 km in the atmosphere and 4–25 km in the ocean. Under a high-greenhouse-gas scenario, the model projected that around mid-century (the 2060s) the ENSO cycle will undergo an abrupt transition: the amplitude of sea-surface temperature fluctuations in the tropical eastern Pacific will increase markedly.
 
The cycles will become more regular instead of erratic, in effect, the “irregular rhythm” of El Niño/La Niña will give way to a more predictable but much stronger oscillation. Other major climate modes, such as the Indian Ocean Dipole (IOD), the North Atlantic Oscillation (NAO), and the tropical North Atlantic mode (TNA), are projected to synchronize their behavior with ENSO, a kind of "resonance" between climate subsystems.
 
In simple terms, the study suggests a potential shift towards a climate regime in which the tropical Pacific, Indian Ocean, and Atlantic oscillations all begin to ‘swing in step,’ amplifying rainfall and temperature extremes in connected regions around the world.
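Mode synchronization of this kind can be illustrated with the simplest phase-oscillator toy model, a two-oscillator Kuramoto sketch. This is purely conceptual and has nothing to do with AWI-CM3 itself: when the coupling K exceeds the frequency mismatch Δω, the phase difference between two oscillators locks instead of drifting, and they "swing in step."

```python
import math

def phase_difference_drift(delta_omega, coupling, dt=0.001, steps=200_000):
    """Euler-integrate d(phi)/dt = delta_omega - coupling*sin(phi); return final phi.

    If coupling > |delta_omega|, phi settles near arcsin(delta_omega/coupling)
    (phase locking); otherwise it grows without bound (drift)."""
    phi = 0.0
    for _ in range(steps):
        phi += (delta_omega - coupling * math.sin(phi)) * dt
    return phi

locked = phase_difference_drift(delta_omega=0.5, coupling=1.0)    # locks near arcsin(0.5)
drifting = phase_difference_drift(delta_omega=1.5, coupling=1.0)  # keeps growing
print(locked, drifting)
```

The locked regime is the toy-model analogue of the projected behavior: once the effective coupling between climate modes is strong enough relative to their natural frequency differences, their oscillations synchronize rather than wandering independently.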

Why we should worry:

The shift projected by the simulations is not merely academic. The researchers highlight that this amplified and synchronized behavior could create “hydroclimate whiplash”: rapid transitions between flood and drought, with intense storms followed by extended dry spells in vulnerable regions such as Southern California and the Iberian Peninsula. Such whiplash events challenge existing adaptation strategies, infrastructure planning, agriculture, and water resource management.
 
The study’s authors emphasize that while a more regular oscillation might, in principle, facilitate forecasting, the magnitude of the impacts will demand far more robust preparedness.

Key takeaway: a scientific red flag

The distinguishing characteristic of this research lies in the clarity and specificity of its supercomputer simulation outcomes. These findings represent a departure from general end-of-century projections, as the model indicates an impending shift occurring within the next few decades. The authors characterize this change as an "abrupt transition." Consequently, rather than a gradual deterioration, we may be confronted with a tipping-point scenario: transitioning from a period of moderate, irregular ENSO fluctuations to one characterized by robust, regular, synchronized oscillations and heightened impacts.
 
Immediate action is required from policymakers and infrastructure planners, given the prospect of increasingly extreme and predictable climate fluctuations. Adaptation strategies must evolve from addressing isolated extreme events to proactively anticipating a fundamentally altered climatic pattern. International collaboration is paramount, as the synchronization of these climate modes will result in global repercussions, extending beyond regional impacts. Furthermore, sustained high-resolution modeling is essential to enhance predictive accuracy, particularly concerning regional effects. The study itself acknowledges ongoing advancements in high-resolution simulations at IBS's supercomputing facility.
 
In conclusion, the simulations presented by IBS and its collaborators introduce a concerning possibility: a transformation of the Earth's climate system into a new "swing mode," characterized by accelerated, more frequent, and synchronized extreme variations across ocean basins. The window for preparedness is diminishing. Should the model's projections materialize, the world will confront not merely intensified weather events, but a fundamentally altered tempo of climate variability, representing a challenge of potentially unprecedented magnitude.