If humans could harness nuclear fusion, the process that powers stars like our Sun, the world could have an inexhaustible energy source. In theory, scientists could produce a steady stream of fusion energy on Earth by heating two isotopes of hydrogen, deuterium and tritium, to more than 100 million degrees centigrade until they become a gaseous stew of electrically charged particles called plasma, then using powerful magnets to compress these particles until they fuse together, releasing energy in the process.

This image shows the initial stage of a large edge instability in the DIII-D tokamak. The top row shows contours of the plasma temperature in a cross section of the torus, with its central axis to the left; the bottom row shows the corresponding density. The vacuum region between the plasma and the wall is grey; the plasma expands rapidly into this region and hits the outer wall, then gradually subsides back to its original shape.
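
For reference, the deuterium-tritium reaction described above releases energy in a well-known way (the numbers below are standard textbook values rather than figures from the article):

$$ {}^{2}\mathrm{H} + {}^{3}\mathrm{H} \;\rightarrow\; {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + \mathrm{n}\,(14.1\ \mathrm{MeV}) $$

About 17.6 MeV is released per reaction, most of it carried away by the energetic neutron.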

Although magnetic fusion has been achieved on Earth, researchers still do not understand the behavior of plasma well enough to effectively confine it to generate a sustainable flow of energy. That's where the U.S. Department of Energy (DOE) comes in. Over the past 50 years the Energy Department has invested significantly in fusion research and even established a supercomputing center to simulate the behavior of plasma in a fusion reactor. This facility eventually evolved into the National Energy Research Scientific Computing Center (NERSC), which is currently managed by the Lawrence Berkeley National Laboratory and serves about 3,000 researchers annually.

The DOE's Scientific Discovery through Advanced Computing (SciDAC) Center for Extended Magnetohydrodynamic Modeling has also played a vital role in developing computer codes for fusion modeling. Recently, the collaboration created an extension to the Multilevel 3D (M3D) computer code that allows researchers to simulate what happens when charged particles are ejected from the hot plasma stew and splatter on the walls of a "tokamak," the doughnut-shaped "pot" used to magnetically contain plasmas. These simulations are vital for ensuring the safety of future plasma confinement devices designed to achieve sustainable fusion for electricity.

Using NERSC computers, the team applied the M3D extension to simulate the development of the magnetic fields in a tokamak fusion experiment; identify and dissect points where the plasma becomes chaotic; and map the trajectory of high-energy plasma eruptions from these so-called X-points to determine how these particles affect the tokamak walls.

According to Linda Sugiyama of the Massachusetts Institute of Technology, who led the code development, these findings will be critical for ensuring the safety and success of upcoming magnetic confinement fusion experiments like ITER, which is currently under construction in southern France and will attempt to create 500 megawatts of fusion power for several-minute stretches.


In this image, the torus of plasma is tilted upward to show the bottom. The magnetic fields within the plasma are represented as white lines and approximately follow a contour of constant plasma temperature as it circles the torus. The temperature is shown as a blue/yellow surface.

Modeling Fusion to Ensure Safety

Because hot plasma is extremely dangerous, it is imperative that researchers understand plasma instability so that they can confine the plasma properly. For fusion to occur, plasmas must be heated to more than 100 million degrees centigrade. At these searing temperatures the material is too hot for most earthly materials to contain; the charged particles can only be held in place by magnetic fields generated by devices like the tokamak.

In the tokamak, plasma is confined to a torus, or doughnut shape. However, Sugiyama's NERSC simulations show that instabilities at the plasma edge can alter the confining magnetic fields generated by the tokamak, allowing blobs of plasma to break out of the torus shape. Instead of a single crash, this phenomenon can cause multiple pulses of instability, each ejecting large blobs of plasma onto the tokamak walls. The number of unstable pulses affects how much plasma is thrown onto the walls, which in turn determines the extent of wall damage.

"Studies of nonlinear dynamics show that if you have X-points on the plasma boundary, which is standard for high temperature fusion plasmas, particles can fly out and hit the walls. But no one had ever seen what actually happens if you let the plasma go," says Sugiyama. "This is the first time that we have seen a stochastic magnetic tangle, a structure well known from Hamiltonian chaos theory, generated by the plasma itself. Its existence also means that we will have to rethink some of our basic ideas about confined plasmas."

She notes that the primary causes of plasma instability vary in the different plasma regions of the tokamak, including the core, edge and surrounding vacuum. To understand the most dangerous of these instabilities, computational physicists model each plasma region on short time-and-space scales. The plasma regions are strongly coupled through the magnetic and electric fields, and researchers must carry out an integrated simulation that shows how all the components interact in the big picture, over long time scales.

"Modeling plasma instabilities is computationally challenging because these processes occur on widely differing scales and have unique complexities," says Sugiyama. "This phenomenon couldn't be investigated without computers. Compared to other research areas, these simulations are not computationally large. The beauty of using NERSC is that I can run my medium-size jobs for a long time until I generate all the time-steps I need to see the entire process accurately."

She notes that the longest job ran on 360 processors for 300 hours on NERSC's "Franklin" system. However, she also ran numerous other jobs on the facility's Franklin and DaVinci systems using anywhere from 432 to 768 processors for about 200 CPU hours.

"I greatly appreciate the NERSC policy of supporting the work required to scale up from small jobs to large jobs, such as generating input files and visualizing the results. The center's user consultants and analytics group were crucial to getting these results," says Sugiyama.

A paper based on her NERSC results was recently accepted by Physics of Plasmas and will appear in print later this year. Sugiyama will also present her findings at the SciDAC 2010 conference in Chattanooga, Tenn. in July. Steve Jardin of the Princeton Plasma Physics Laboratory is the principal investigator of the SciDAC Center for Extended Magnetohydrodynamic Modeling, which supported this work.

For more information about NERSC's legacy of supporting fusion research, read:
The Fusion Two-Step
Modeling Microturbulence In Fusion Plasmas
Hail Storms in Hell
ITER Design Basis Plasma Disruption Simulations
Oil crisis stalled cars, but jumpstarted a supercomputing revolution
2004 NERSC Annual Report - Fusion Energy Sciences
2003 NERSC Annual Report - Fusion Energy Sciences
2002 NERSC Annual Report - Fusion Energy Sciences
2001 NERSC Annual Report - Fusion Energy Sciences
2000 NERSC Annual Report - Fusion Energy Sciences
1999 NERSC Annual Report - Fusion Energy Sciences
1998 NERSC Annual Report - Fusion Energy Sciences

With a network of more than 5,000 sensors that monitor weather conditions, seismic activity, traffic, bacteria on beaches, water levels and much more, Sensorpedia is a substantial and still-growing resource for online sensor data.

Sensorpedia, developed three years ago by Oak Ridge National Laboratory's Bryan Gorman and David Resseguie, connects first responders, individuals and communities with online sensor data in the United States and beyond.

"Sensorpedia combines the best of Facebook and YouTube and continues to expand and evolve to meet the demands and needs of users," Resseguie said.

Sneak peeks of Sensorpedia, which is in beta testing, are available at http://www.sensorpedia.com. Funding for Sensorpedia is provided by the Department of Homeland Security-sponsored Southeast Region Research Initiative.

A theoretical technique developed at the Department of Energy's Oak Ridge National Laboratory is bringing supercomputer simulations and experimental results closer together by identifying common "fingerprints." 

ORNL's Jeremy Smith collaborated on devising a method, called dynamical fingerprints, that reconciles the differing signals produced by experiments and computer simulations to strengthen analyses of molecules in motion. The research will be published in the Proceedings of the National Academy of Sciences.


"Experiments tend to produce relatively simple and smooth-looking signals, as they only 'see' a molecule's motions at low resolution," said Smith, who directs ORNL's Center for Molecular Biophysics and holds a Governor's Chair at the University of Tennessee. "In contrast, data from a supercomputer simulation are complex and difficult to analyze, as the atoms move around in the simulation in a multitude of jumps, wiggles and jiggles. How to reconcile these different views of the same phenomenon has been a long-standing problem." 

The new method solves the problem by calculating peaks within the simulated and experimental data, creating distinct "dynamical fingerprints." The technique, conceived by Smith's former graduate student Frank Noe, now at the Free University of Berlin, can then link the two datasets. 
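
As a loose illustration of the underlying idea (not the method published by Noe and Smith), the sketch below extracts a single relaxation timescale, one ingredient of such a fingerprint, from a synthetic two-state trajectory; the transition rates and observable values are invented for the example:

```python
# Minimal sketch (not the published ORNL / FU Berlin method): illustrate the idea of
# a "dynamical fingerprint" as a relaxation timescale extracted from a fluctuating
# molecular observable. A two-state Markov jump process stands in for a molecule
# hopping between structural states; all rates and values are made up.
import numpy as np

rng = np.random.default_rng(0)

p01, p10 = 0.02, 0.05          # assumed per-step transition probabilities
n_steps = 100_000

# Generate the state trajectory (states 0 and 1).
states = np.empty(n_steps, dtype=int)
states[0] = 0
for t in range(1, n_steps):
    if states[t - 1] == 0:
        states[t] = 1 if rng.random() < p01 else 0
    else:
        states[t] = 0 if rng.random() < p10 else 1

# Observable that differs between the two structural states (e.g., a distance).
signal = np.where(states == 0, 1.0, 2.5)
signal -= signal.mean()

# Autocorrelation function of the observable; its decay constant is the
# relaxation timescale that contributes to the fingerprint.
max_lag = 500
acf = np.array([np.dot(signal[:-lag or None], signal[lag:]) for lag in range(max_lag)])
acf /= acf[0]

# For a two-state process the ACF decays roughly as exp(-t / tau)
# with tau ~ 1 / (p01 + p10); compare the fit with that value.
tau_true = 1.0 / (p01 + p10)
slope = np.polyfit(np.arange(50), np.log(np.clip(acf[:50], 1e-12, None)), 1)[0]
tau_est = -1.0 / slope
print(f"relaxation time: expected ~{tau_true:.1f} steps, estimated ~{tau_est:.1f} steps")
```

In the full method, many such timescales and their associated amplitudes are extracted from both the simulation and the measured spectra, which is roughly what the "peaks" mentioned above correspond to.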

Supercomputer simulations and modeling can capture a level of molecular detail that many types of experiments cannot resolve on their own.

"When we started the research, we had hoped to find a way to use computer simulation to tell us which molecular motions the experiment actually sees," Smith said. "When we were finished we got much more -- a method that could also tell us which other experiments should be done to see all the other motions present in the simulation. This method should allow major facilities like the ORNL's Spallation Neutron Source to be used more efficiently." 

Combining the power of simulations and experiments will help researchers tackle scientific challenges in areas like biofuels, drug development, materials design and fundamental biological processes, which require a thorough understanding of how molecules move and interact. 

"Many important things in science depend on atoms and molecules moving," Smith said. "We want to create movies of molecules in motion and check experimentally if these motions are actually happening." 

View a supercomputer simulation of a protein in motion here: http://www.ornl.gov/ornlhome/hg_mer.htm 

"The aim is to seamlessly integrate supercomputing with the Spallation Neutron Source so as to make full use of the major facilities we have here at ORNL for bioenergy and materials science development," Smith said. 

The collaborative work included researchers from L'Aquila, Italy, Wuerzburg and Bielefeld, Germany, and the University of California at Berkeley. The research was funded in part by a Scientific Discovery through Advanced Computing grant from the DOE Office of Science. 

Researchers aim to sharpen methods that will be increasingly needed to implement climate legislation effectively and to stabilize carbon-equivalent trading markets

A federal agency has awarded $1.2 million to two researchers at Scripps Institution of Oceanography at UC San Diego and their colleagues to develop methods to quantify regional greenhouse gas emissions from atmospheric measurements, a capability expected to become more important as legislation to reduce global warming becomes more widespread and greenhouse gas emissions trading markets emerge.

Scripps geochemistry professors Ray Weiss and Ralph Keeling will join colleagues at Lawrence Livermore National Laboratory in Livermore, Calif. in a three-year project that will be based on continuous measurements of atmospheric greenhouse gases at two California locations. Supercomputer models at Lawrence Livermore will analyze these measurements to trace emissions of the gases back to their sources.


Funded by the National Institute of Standards and Technology (NIST), the science team will focus initially on industrially produced greenhouse gases, used as refrigerants, solvents and in manufacturing, that are not produced biologically, have a limited number of source types and are relatively easy to trace. The method of using atmospheric measurements is often referred to as a "top-down" approach to emissions monitoring, as compared with "bottom-up" techniques that rely on estimates of emissions at their sources. Recent research by atmospheric scientists has shown that many gases are entering the atmosphere in quantities significantly different from the "bottom-up" estimates reported to regulatory agencies.

Weiss has likened the current lack of "top-down" verification to going on a diet without weighing oneself.

"The climate doesn't care what emissions we report. The climate only cares about what we actually emit," he said.

Verification of greenhouse gas emissions became a key issue between China and the United States during United Nations-led climate talks in Copenhagen in December.

Weiss said the newly funded research is a pilot program that will yield emissions "maps" focusing on California and western North America that could be expanded to other regions as measuring and modeling capabilities improve.

"Without a credible assessment of sources, you can't have a credible trading scheme," said Keeling. "Carbon markets today depend on a certain amount of hope and faith. Our aim is to make it possible to base these markets on solid numbers."


Carbon dioxide is the most significant greenhouse gas produced by human activities, contributing about twice the warming effect of the other major greenhouse gases combined. But for gases like CO2, methane and nitrous oxide that are heavily affected by natural and human-influenced biogenic processes, it is relatively difficult to disentangle natural from human-made emissions, so these gases will be studied in a later phase of the work, the researchers said.

Another reason to focus initially on industrial greenhouse gases is that these gases, many of which are thousands of times more potent per unit of emissions than CO2, play a disproportionately large role in carbon equivalent trading markets.
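
To get a feel for why that is, a rough carbon-equivalent calculation is sketched below; the 100-year global warming potential values are approximate, and the gases and emission amounts are generic examples rather than the project's specific targets.

```python
# Rough illustration of carbon-equivalent accounting. GWP values are approximate
# 100-year figures; the gas list and tonnages are illustrative only.
gwp_100yr = {"CO2": 1, "HFC-134a": 1_400, "SF6": 23_000}

# Hypothetical annual emissions, in metric tonnes.
emissions_tonnes = {"CO2": 1_000.0, "HFC-134a": 1.0, "SF6": 0.05}

for gas, tonnes in emissions_tonnes.items():
    co2e = tonnes * gwp_100yr[gas]
    print(f"{gas}: {tonnes} t emitted -> {co2e:,.0f} t CO2-equivalent")
```

Even a few tens of kilograms of a high-potency gas can outweigh many tonnes of CO2 in a carbon-equivalent ledger, which is why these gases loom large in trading markets.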

Atmospheric measurements are made at Scripps-operated stations located at Trinidad Head on the Northern California coast and on the Scripps campus in La Jolla, Calif. Both stations are part of NASA's Advanced Global Atmospheric Gases Experiment (AGAGE) network, of which Weiss is a principal investigator.

The work will require the capability to detect trace gases and an understanding of localized wind dynamics to create reliable computer representations of emissions activity, the researchers said. Joining Weiss and Keeling in the experiment are Philip Cameron-Smith and Donald Lucas at Livermore. The Livermore team will employ supercomputer models originally developed for emergency situations in which officials attempt to predict the spread of toxic emissions following industrial accidents. The team will run these models in reverse to locate and quantify greenhouse gas emissions.
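
Conceptually, running a transport model "in reverse" amounts to solving an inverse problem: observed concentration enhancements y are related to unknown regional emission rates x through a source-receptor ("footprint") matrix H, and the "top-down" emissions estimate comes from inverting y ≈ Hx. The sketch below is a toy least-squares version of that idea with made-up numbers, not the Livermore modeling system.

```python
# Toy "top-down" inversion sketch (not the Livermore dispersion model): estimate
# regional emission rates x from observed concentration enhancements y, given a
# source-receptor matrix H that says how strongly each candidate source region
# contributes to each measurement.
import numpy as np

rng = np.random.default_rng(1)

n_obs, n_regions = 200, 5

# Assumed footprint matrix from a transport model run "in reverse"
# (here just random positive sensitivities for illustration).
H = rng.uniform(0.0, 1.0, size=(n_obs, n_regions))

# True (unknown in practice) emission rates, arbitrary units.
x_true = np.array([2.0, 0.5, 1.5, 0.0, 3.0])

# Synthetic observations: transported signal plus measurement noise.
y = H @ x_true + rng.normal(0.0, 0.1, size=n_obs)

# Least-squares estimate of the regional emissions.
x_est, *_ = np.linalg.lstsq(H, y, rcond=None)
print("true emissions:     ", x_true)
print("estimated emissions:", np.round(x_est, 2))
```

In practice the footprint matrix comes from the atmospheric transport model, and real inversions typically add prior estimates and uncertainty weighting, but the linear-algebra core is the same.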

NIST funded the project through its Measurement Science and Engineering Research Grants Program, which is made possible through the American Recovery and Reinvestment Act. The project includes a research component for making standards for measurements of trace gases.

NIST also awarded $1.5 million to Benson Shing, a structural engineering professor at the Jacobs School of Engineering at UCSD. Shing will seek to develop methods and improved design requirements for the seismic design of shear walls in reinforced masonry buildings, with the aim of enhancing the cost-effectiveness and performance of these structures.
