U.S. Secretary of Energy Steven Chu today announced the selection of 42 university-led research and development projects for awards totaling $38 million. These projects, funded over three to four years through the Department's Nuclear Energy University Program, will help advance nuclear education and develop the next generation of nuclear technologies.

"We are taking action to restart the nuclear industry as part of a broad approach to cut carbon pollution and create new clean energy jobs," said Secretary Chu. "These projects will help us develop the nuclear technologies of the future and move our domestic nuclear industry forward."

Twenty-three U.S. universities will act as lead research institutions for projects in 17 states. Other universities, industries, and national laboratories will serve as collaborators and research partners. The projects focus on four nuclear energy research areas:

Fuel Cycle Research and Development (13 projects, $11,823,154)
The goal of this research area is to develop and demonstrate technologies that enable the safe and cost-effective management of the used fuel produced by current and future nuclear fuel cycles in a manner that reduces proliferation risk. The research conducted in the program is focused on developing novel technology options that improve used fuel storage, recycling, and disposal, with significantly better cost and environmental performance than current technology. Project awardees in this area are listed below; actual project funding will be established during the contract negotiation phase.
  • California State University, Long Beach - $1,390,252
  • Clemson University - $614,690
  • Drexel University - $1,149,327
  • Idaho State University - $650,000
  • Pennsylvania State University - $1,377,444
  • Rensselaer Polytechnic Institute - $810,141
  • University of Florida - $894,042
  • University of Michigan - $931,603
  • University of Michigan - $406,712
  • University of Missouri, Columbia - $541,286
  • University of Nevada, Las Vegas - $989,800
  • University of Wisconsin, Madison - $616,073
  • Washington State University - $1,451,784

Generation IV Reactor Research and Development (20 projects, $19,855,912)
The goal of this research area is to develop the next generation of nuclear reactors, which will produce more energy and create less waste. The focus is on new reactor technologies with better safety, economic, and sustainability performance. The program will involve research on cross-cutting technologies that accelerate the development of advanced reactor concepts, including fuels, materials, and reactor modeling. The program also investigates small and medium-sized reactor concepts. If commercially successful, small modular reactors would significantly expand the options for nuclear power and its applications, and may prove advantageous compared to Generation III+ nuclear plants in terms of economics, performance, and security. The research program is focused on the key technology challenges for these concepts and supports cross-cutting activities, including Modeling and Simulation, Structural Materials, Energy Conversion, Nuclear Instrumentation and Control, and Innovative Manufacturing Approaches. Project awardees in this area are listed below; actual project funding will be established during the contract negotiation phase.

  • Georgia Institute of Technology - $1,046,277
  • Idaho State University - $1,287,921
  • Johns Hopkins University - $1,183,239
  • The Ohio State University - $1,366,627
  • Pennsylvania State University - $1,000,000
  • Rensselaer Polytechnic Institute - $475,005
  • University of California, Berkeley - $1,320,667
  • University of California, Santa Barbara - $995,232
  • University of Cincinnati - $833,109
  • University of Michigan - $996,581
  • University of Michigan - $1,181,379
  • University of Minnesota - $1,366,163
  • University of Minnesota - $854,542
  • University of Missouri, Columbia - $703,064
  • University of Nevada, Las Vegas - $451,269
  • University of South Carolina - $1,366,626
  • University of Washington - $899,518
  • University of Wisconsin, Madison - $1,352,040
  • University of Wisconsin, Madison - $525,206
  • University of Wisconsin, Madison - $651,447

Light Water Reactor Sustainability (2 projects, $764,140)
The goal of this research area is to develop technologies and other solutions that improve the reliability and sustain the safety of current reactors, and to provide information that supports decisions on extending their operating lives. Research elements are focused on understanding fundamental aging and degradation behavior in reactor materials, creating improved inspection and monitoring technologies, fostering development of advanced fuels, and incorporating risk-informed, performance-based techniques into safety margin characterization and life-extension decision making. Project awardees in this area are listed below; actual project funding will be established during the contract negotiation phase.

  • Mississippi State University - $345,941
  • North Carolina State University - $418,199

Mission-Relevant Investigator-Initiated Research (7 projects, $5,556,816)
This research area focuses on creative, innovative, and "blue sky" research. It includes research in fields of nuclear science and engineering such as, but not limited to, Nuclear Engineering, Nuclear Physics, Health Physics, Nuclear Materials Science, Radiochemistry, and Nuclear Chemistry. Examples of topics of interest are new reactor designs and technologies; advanced fuel cycles, including advanced nuclear fuels; alternate aqueous and dry processes, including volatility and ionic liquids; instrumentation and control/human factors; radiochemistry; and fundamental nuclear science. Project awardees in this area are listed below; actual project funding will be established during the contract negotiation phase.

  • Idaho State University - $597,252
  • North Carolina State University - $1,129,304
  • Pennsylvania State University - $870,613 
  • University of California, Berkeley - $380,653
  • University of Cincinnati - $1,242,019
  • University of Michigan - $798,943
  • University of Wisconsin, Madison - $538,032
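
As a quick arithmetic check of the award figures above (a minimal Python sketch that uses only the project counts and funding subtotals announced for the four research areas), the categories account for all 42 projects and sum to $38,000,022, consistent with the roughly $38 million total:

    # Project counts and funding subtotals as announced for each research area.
    areas = {
        "Fuel Cycle Research and Development": (13, 11_823_154),
        "Generation IV Reactor Research and Development": (20, 19_855_912),
        "Light Water Reactor Sustainability": (2, 764_140),
        "Mission-Relevant Investigator-Initiated Research": (7, 5_556_816),
    }

    total_projects = sum(count for count, _ in areas.values())
    total_funding = sum(dollars for _, dollars in areas.values())
    print(total_projects, f"${total_funding:,}")   # prints: 42 $38,000,022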

View the list of selected projects (PDF, 90 KB).
Learn more about the Nuclear Energy University Program.

Narrower constraints from the newest analysis aren’t quite narrow enough

 The international Supernova Cosmology Project (SCP), based at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, has announced the Union2 compilation of hundreds of Type Ia supernovae, the largest collection ever of high-quality data from numerous surveys. Analysis of the new compilation significantly narrows the possible values that dark energy might take—but not enough to decide among fundamentally different theories of its nature.

“We’ve used the world’s best-yet dataset of Type Ia supernovae to determine the world’s best-yet constraints on dark energy,” says Saul Perlmutter, leader of the SCP. “We’ve tightened in on dark energy out to redshifts of one”—when the universe was only about six billion years old, less than half its present age—“but while at lower redshifts the values are perfectly consistent with a cosmological constant, the most important questions remain.”

That’s because possible values of dark energy from supernovae data become increasingly uncertain at redshifts greater than one-half, the range where dark energy’s effects on the expansion of the universe are most apparent as we look farther back in time. Says Perlmutter of the widening error bars at higher redshifts, “Right now, you could drive a truck through them.”

As its name implies, the cosmological constant fills space with constant pressure, counteracting the mutual gravitational attraction of all the matter in the universe; it is often identified with the energy of the vacuum. If indeed dark energy turns out to be the cosmological constant, however, even more questions will arise.

“There is a huge discrepancy between the theoretical prediction for vacuum energy and what we measure as dark energy,” says Rahman Amanullah, who led SCP’s Union2 analysis; Amanullah is presently with the Oskar Klein Center at Stockholm University and was a postdoctoral fellow in Berkeley Lab’s Physics Division from 2006 to 2008. “If it turns out in the future that dark energy is consistent with a cosmological constant also at early times of the universe, it will be an enormous challenge to explain this at a fundamental theoretical level.”

A major group of competing theories posits a dynamical form of dark energy that varies in time. Choosing among theories means comparing what they predict about the dark energy equation of state, a value written w. While the new analysis has detected no change in w, there is still much room for possibly significant differences in w with increasing redshift (written z).

“Most dark-energy theories are not far from the cosmological constant at z less than one,” Perlmutter says. “We’re looking for deviations in w at high z, but there the values are very poorly constrained.”
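
To make the role of w concrete, the sketch below is illustrative only: the article does not specify a parameterization, so this assumes a flat universe, the common w0, wa form w(z) = w0 + wa*z/(1+z), and placeholder values for the Hubble constant and matter density. It shows how a small change in w shifts the distance modulus that supernova surveys measure at a given redshift.

    import numpy as np
    from scipy.integrate import quad

    H0 = 70.0             # Hubble constant in km/s/Mpc (placeholder value)
    C_LIGHT = 299792.458  # speed of light in km/s
    OMEGA_M = 0.3         # matter density in a flat universe (placeholder value)

    def expansion_rate(z, w0, wa):
        """Dimensionless H(z)/H0 for matter plus dark energy with w(z) = w0 + wa*z/(1+z)."""
        rho_de = (1 + z) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * z / (1 + z))
        return np.sqrt(OMEGA_M * (1 + z) ** 3 + (1 - OMEGA_M) * rho_de)

    def distance_modulus(z, w0=-1.0, wa=0.0):
        """mu = 5 log10(d_L / 10 pc), with d_L the luminosity distance in a flat universe."""
        integral, _ = quad(lambda zp: 1.0 / expansion_rate(zp, w0, wa), 0.0, z)
        d_l_mpc = (1 + z) * (C_LIGHT / H0) * integral
        return 5.0 * np.log10(d_l_mpc) + 25.0

    for z in (0.5, 1.0):
        mu_lambda = distance_modulus(z)            # cosmological constant, w = -1
        mu_other = distance_modulus(z, w0=-0.9)    # slightly different equation of state
        print(f"z = {z}: mu(Lambda) = {mu_lambda:.3f}, mu(w0 = -0.9) = {mu_other:.3f}")

The resulting shifts are only a few hundredths of a magnitude in this example, which is why tight control of systematic errors and many well-measured supernovae beyond z = 1 are needed to distinguish a varying w from a cosmological constant.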

In their new analysis to be published in the Astrophysical Journal, “Spectra and HST light curves of six Type Ia supernovae at 0.511 < z < 1.12 and the Union2 compilation,” the Supernova Cosmology Project reports on the addition of several well-measured, very distant supernovae to the Union2 compilation. The paper is now available online at http://arxiv4.library.cornell.edu/abs/1004.1711.

Dark energy fills the universe, but what is it?

Dark energy was discovered in the late 1990s by the Supernova Cosmology Project and the competing High-Z Supernova Search Team, both using distant Type Ia supernovae as “standard candles” to measure the expansion history of the universe. To their surprise, both teams found that expansion is not slowing due to gravity but accelerating.

Other methods for measuring the history of cosmic expansion have been developed, including baryon acoustic oscillation and weak gravitational lensing, but supernovae remain the most advanced technique. Indeed, in the years since dark energy was discovered using only a few dozen Type Ia supernovae, many new searches have been mounted with ground-based telescopes and the Hubble Space Telescope; many hundreds of Type Ia’s have been discovered; techniques for measuring and comparing them have continually improved.

In 2008 the SCP, led by the work of team member Marek Kowalski of the Humboldt University of Berlin, created a way to cross-correlate and analyze datasets from different surveys made with different instruments, resulting in the SCP’s first Union compilation. In 2009 a number of new surveys were added.

The inclusion of six new high-redshift supernovae found by the SCP in 2001, including two with z greater than one, is the first in a series of very high-redshift additions to the Union2 compilation now being announced, and brings the current number of supernovae in the whole compilation to 557.

“Even with the world’s premier astronomical observatories, obtaining good quality, time-critical data of supernovae that are beyond a redshift of one is a difficult task,” says SCP member Chris Lidman of the Anglo-Australian Observatory near Sydney, a major contributor to the analysis. “It requires close collaboration between astronomers who are spread over several continents and several time zones. Good team work is essential.”

Union2 has not only added many new supernovae to the Union compilation but has refined the methods of analysis and in some cases improved the observations. The latest high-z supernovae in Union2 include the most distant supernovae for which ground-based near-infrared observations are available, a valuable opportunity to compare ground-based and Hubble Space Telescope observations of very distant supernovae.

Type Ia supernovae are the best standard candles ever found for measuring cosmic distances because the great majority are so bright and so similar in brightness. Light-curve fitting is the basic method for standardizing what variations in brightness remain: supernova light curves (their rising and falling brightness over time) are compared and uniformly adjusted to yield comparative intrinsic brightness. The light curves of all the hundreds of supernovae in the Union2 collection have been consistently reanalyzed.
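
The sketch below is a toy version of that standardization, not the SCP's actual light-curve fitter: it fits one supernova's observed magnitudes with a time-stretched, brightness-shifted template and then applies a stretch correction to the fitted peak magnitude. The template shape, the noise level, and the correction coefficient are all invented for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    def template(phase):
        """Toy light-curve shape: magnitudes fainter than peak versus rest-frame phase (days)."""
        return 1.5 * (phase / 20.0) ** 2

    def model(t, t_peak, m_peak, stretch):
        """Observed magnitude: the template stretched in time and shifted in brightness."""
        return m_peak + template((t - t_peak) / stretch)

    # Simulated observations of a single supernova (days, magnitudes).
    rng = np.random.default_rng(1)
    t_obs = np.array([-10.0, -5.0, 0.0, 5.0, 10.0, 20.0, 30.0])
    m_obs = model(t_obs, t_peak=1.0, m_peak=23.8, stretch=1.1) + rng.normal(0.0, 0.02, t_obs.size)

    (t_peak_fit, m_peak_fit, stretch_fit), _ = curve_fit(model, t_obs, m_obs, p0=[0.0, 24.0, 1.0])

    ALPHA = 1.5  # placeholder stretch-brightness coefficient; fitted globally in real analyses
    # Broader (higher-stretch) supernovae are intrinsically brighter, so the
    # correction nudges their peak magnitude toward the stretch = 1 fiducial.
    m_standardized = m_peak_fit + ALPHA * (stretch_fit - 1.0)
    print(f"peak mag {m_peak_fit:.2f}, stretch {stretch_fit:.2f}, standardized mag {m_standardized:.2f}")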

The upshot of these efforts is improved handling of systematic errors and improved constraints on the value of the dark energy equation of state with increasing redshift, although with greater uncertainty at very high redshifts. When combined with data from cosmic microwave background and baryon oscillation surveys, the “best fit cosmology” remains the so-called Lambda Cold Dark Matter model, or ΛCDM.

ΛCDM has become the standard model of our universe, which began with a big bang, underwent a brief period of inflation, and has continued to expand, although at first retarded by the mutual gravitational attraction of matter. As matter spread and grew less dense, dark energy overcame gravity, and expansion has been accelerating ever since.

To learn just what dark energy is, however, will first require scientists to capture many more supernovae at high redshifts and thoroughly study their light curves and spectra. This can’t be done with telescopes on the ground or even by heavily subscribed space telescopes. Learning the nature of what makes up three-quarters of the density of our universe will require a dedicated observatory in space.

One of the largest-ever computer models explores dark matter and dark energy, two cosmic constituents that remain a mystery

Understanding dark energy is the number one issue in explaining the universe, according to Salman Habib of Los Alamos National Laboratory's Nuclear and Particle Physics, Astrophysics and Cosmology group.

"Because the universe is expanding and at the same time accelerating, either there is a huge gap in our understanding of physics, or there is a strange new form of matter that dominates the universe – 'dark energy' – making up about 70 percent of it," said Habib. "In addition, there is five times more of an unknown 'dark matter' than there is ordinary matter in the universe, and we know it's there from many different observations, most spectacularly, we've seen it bend light in pictures from the Hubble Space Telescope, but its origin is also not understood."

Even though it's looking at only a small segment of the "accessible" universe, Habib's "Roadrunner Universe" model requires a petascale computer because, like the universe, it's mind-bendingly large. The model's basic unit is a particle with a mass of approximately one billion suns (in order to sample galaxies with masses of about a trillion suns), and it includes more than 64 billion of those particles.

The model is one of the largest simulations of the distribution of matter in the universe, and it aims to capture galaxy-scale mass concentrations in numbers above and beyond those seen in state-of-the-art sky surveys.

"We are trying to really understand how to more completely and more accurately describe the observable universe, so we can help in the design of future experiments and interpret observations from ongoing observations like the Sloan Digital Sky Survey-III. We are particularly interested in the Large Synoptic Survey Telescope (LSST) in Chile, in which LANL is an institutional member, and DOE and NASA's Joint Dark Energy Mission (JDEM)," said Habib. "To do the science in any sort of reasonable amount of time requires a petascale machine at the least."

The Roadrunner Universe model relies on a hierarchical grid/particle algorithm that best matches the physical aspects of the simulation to the hybrid architecture of Roadrunner. Habib and his team wrote an entirely new computer code that aggressively exploits Roadrunner's hybrid architecture and makes full use of the PowerXCell 8i computational accelerators. They also created a dedicated analysis and visualization software framework to handle the huge simulation database.
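
The code the team wrote is specific to Roadrunner's Cell-accelerated hardware, but the core grid/particle idea can be illustrated with a short, generic sketch (illustrative only, in Python and in one dimension; the grid size, particle count, and masses are arbitrary): particles deposit their mass onto a mesh, and the resulting density field is what a particle-mesh code uses to compute gravity.

    import numpy as np

    # Cloud-in-cell (CIC) mass deposition onto a periodic 1-D grid -- the kind of
    # grid/particle bookkeeping a hierarchical particle-mesh code does in 3-D at
    # enormously larger scale.
    n_cells = 64
    box_size = 1.0
    cell = box_size / n_cells

    rng = np.random.default_rng(0)
    positions = rng.random(10_000) * box_size     # particle positions in [0, box_size)
    particle_mass = 1.0                           # equal-mass particles (arbitrary units)

    x = positions / cell                          # positions in units of the cell width
    left = np.floor(x - 0.5).astype(int)          # index of the cell center to the left
    frac = (x - 0.5) - left                       # fraction of mass given to the right cell

    density = np.zeros(n_cells)
    np.add.at(density, left % n_cells, particle_mass * (1.0 - frac))
    np.add.at(density, (left + 1) % n_cells, particle_mass * frac)
    density /= cell                               # convert deposited mass to density

    print("total mass on the grid:", density.sum() * cell)   # equals 10000.0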

"Our effort is aimed at pushing the current state of the art by three orders of magnitude in terms of computational and scientific throughput," said Habib. I'm confident the final database created by Roadrunner will be an essential component of dark universe science for years to come."

Researchers hope to improve methods increasingly needed for effective implementation of climate legislation and stabilization of carbon-equivalent trading markets

A federal agency has awarded two researchers at Scripps Institution of Oceanography, UC San Diego and colleagues $1.2 million to develop methods to quantify regional greenhouse gas emissions from atmospheric measurements, a capability expected to become more important as legislation to reduce global warming becomes more widespread and greenhouse gas emissions trading markets emerge.

Scripps geochemistry professors Ray Weiss and Ralph Keeling will join colleagues at Lawrence Livermore National Laboratory in Livermore, Calif., in a three-year project that will be based on continuous measurements of atmospheric greenhouse gases at two California locations. Supercomputer models at Lawrence Livermore will analyze these measurements to trace emissions of the gases back to their sources.


Funded by the National Institute of Standards and Technology (NIST), the science team will focus initially on industrially produced greenhouse gases that are not produced biologically and are used as refrigerants, as solvents, and in manufacturing; these gases have a limited range of source types and are relatively easy to trace. The method of using atmospheric measurements is often referred to as a "top-down" approach to emissions monitoring, as compared to "bottom-up" techniques that rely on estimates of emissions at their sources. Recent research by atmospheric scientists has shown that many gases are emitted to the atmosphere in quantities significantly different from the "bottom-up" estimates that are reported to regulatory agencies.

Weiss has likened the current lack of "top-down" verification to going on a diet without weighing oneself.

"The climate doesn't care what emissions we report. The climate only cares about what we actually emit," he said.

Verification of greenhouse gas emissions became a key issue between China and the United States during United Nations-led climate talks in Copenhagen in December.

Weiss said the newly funded research is a pilot program that will yield emissions "maps" focusing on California and western North America that could be expanded to other regions as measuring and modeling capabilities improve.

"Without a credible assessment of sources, you can't have a credible trading scheme," said Keeling. "Carbon markets today depend on a certain amount of hope and faith. Our aim is to make it possible to base these markets on solid numbers."

Carbon dioxide is the most significant greenhouse gas produced by human activities, contributing about twice the warming effect of the other major greenhouse gases combined. But for gases like CO2, methane and nitrous oxide that are heavily affected by natural and human-influenced biogenic processes, it is relatively difficult to disentangle natural from human-made emissions, so these gases will be studied in a later phase of the work, the researchers said.

Another reason to focus initially on industrial greenhouse gases is that these gases, many of which are thousands of times more potent per unit of emissions than CO2, play a disproportionately large role in carbon equivalent trading markets.

Atmospheric measurements are made at Scripps-operated stations located at Trinidad Head on the Northern California coast and on the Scripps campus in La Jolla, Calif. Both stations are part of NASA's Advanced Global Atmospheric Gases Experiment (AGAGE) network, of which Weiss is a principal investigator.

The work will require the capability to detect trace gases and an understanding of localized wind dynamics to create reliable computer representations of emissions activity, the researchers said. Joining Weiss and Keeling in the experiment are Philip Cameron-Smith and Donald Lucas at Livermore. The Livermore team will employ supercomputer models originally developed for emergency situations in which officials attempt to predict the spread of toxic emissions following industrial accidents. The team will run these models in reverse to locate and quantify greenhouse gas emissions.
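
A minimal sketch of that inverse step follows (illustrative only: the transport matrix, region count, noise level, and regularization here are invented, whereas the real project derives observation-to-source sensitivities from Livermore's transport models). The measurements are treated as a linear response to regional emission rates, and a regularized least-squares problem is solved for those rates.

    import numpy as np

    rng = np.random.default_rng(42)

    n_obs, n_regions = 200, 12
    # H[i, j]: sensitivity of observation i to emissions from region j, which a real
    # system would obtain from transport-model runs ("footprints"); random here.
    H = rng.random((n_obs, n_regions))
    x_true = rng.uniform(0.5, 3.0, n_regions)       # hypothetical regional emission rates
    y = H @ x_true + rng.normal(0.0, 0.1, n_obs)    # simulated concentration measurements

    # Ridge-regularized least squares: minimize ||H x - y||^2 + lam * ||x||^2.
    lam = 1e-2
    x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n_regions), H.T @ y)

    print("largest error in recovered emission rates:", float(np.max(np.abs(x_hat - x_true))))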

NIST funded the project through its Measurement Science and Engineering Research Grants Program, which is made possible through the American Recovery and Reinvestment Act. The project includes a research component for making standards for measurements of trace gases.

NIST also awarded $1.5 million to Benson Shing, a structural engineering professor at the Jacobs School of Engineering at UCSD. Shing will seek to develop methods and improve design requirements for the seismic design of shear walls in reinforced masonry buildings, in an effort to enhance the cost-effectiveness and performance of these structures.

Mapping Darwinian evolutionary relationships results in an HIV family tree that may lead researchers to new vaccine focus areas

Supporting Los Alamos National Laboratory's role in the international Center for HIV/AIDS Vaccine Immunology (CHAVI) consortium, researchers are using the Roadrunner supercomputer to analyze vast quantities of genetic sequences from HIV infected people in the hope of zeroing in on possible vaccine target areas.

Physicist Tanmoy Bhattacharya and HIV researcher Bette Korber have used samples taken by CHAVI across the globe – from both chronic and acute HIV patients – and created an evolutionary genetic family tree, known as a phylogenetic tree, to look for similarities in the acute versus chronic sequences that may identify areas where vaccines would be most effective.

In this study the evolutionary history of more than 10,000 sequences from more than 400 HIV-infected individuals was compared.
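
A minimal sketch of the underlying tree-building step is shown below, using Biopython's distance-based neighbor-joining constructor (the input file name is hypothetical, and this is a far smaller and simpler workflow than the Roadrunner-scale analysis described here):

    from Bio import AlignIO, Phylo
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

    # Hypothetical input: a multiple sequence alignment of HIV sequences in FASTA format.
    alignment = AlignIO.read("hiv_sequences_aligned.fasta", "fasta")

    # Pairwise distances from fractional sequence identity, then a neighbor-joining tree.
    distances = DistanceCalculator("identity").get_distance(alignment)
    tree = DistanceTreeConstructor().nj(distances)

    Phylo.draw_ascii(tree)   # quick text rendering of the phylogeny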

The idea, according to Korber, is to identify common features of the transmitted virus and attempt to create a vaccine that enables recognition of the original transmitted virus before the body's immune response causes the virus to react and mutate.

"DNA Sequencing technology, however, is currently being revolutionized, and we are at the cusp of being able to obtain more than 100,000 viral sequences from a single person," said Korber. "For this new kind data to be useful, computational advances will have to keep pace with the experimental, and the current study begins to move us into this new era."

"The petascale supercomputer gives us the capacity to look for similarities across whole populations of acute patients," said Bhattacharya. "At this scale we can begin to figure out the relationships between chronic and acute infections using statistics to determine the interconnecting branches – and it is these interconnections where a specially-designed vaccine might be most effective.
