One of the largest-ever computer models explores dark matter and dark energy, two cosmic constituents that remain a mystery

Understanding dark energy is the number one issue in explaining the universe, according to Salman Habib of the Laboratory's Nuclear and Particle Physics, Astrophysics and Cosmology group.

"Because the universe is expanding and at the same time accelerating, either there is a huge gap in our understanding of physics, or there is a strange new form of matter that dominates the universe – 'dark energy' – making up about 70 percent of it," said Habib. "In addition, there is five times more of an unknown 'dark matter' than there is ordinary matter in the universe, and we know it's there from many different observations, most spectacularly, we've seen it bend light in pictures from the Hubble Space Telescope, but its origin is also not understood."

Even though it's looking at only a small segment of the "accessible" universe, Habib's "Roadrunner Universe" model requires a petascale computer because, like the universe, it's mind-bendingly large. The model's basic unit is a particle with a mass of approximately one billion suns (in order to sample galaxies with masses of about a trillion suns), and it includes more than 64 billion of those particles.
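
To get a feel for those numbers, here is a back-of-the-envelope sketch in Python. The particle count and masses come from the article; the six-floats-per-particle storage layout is our illustrative assumption, not a description of the actual code.

```python
# Rough scale of the Roadrunner Universe model.
# Particle count and masses are from the article; the storage
# layout (3 position + 3 velocity floats) is an assumption.

N_PARTICLES = 64e9      # simulation particles
PARTICLE_MASS = 1e9     # solar masses per particle
GALAXY_MASS = 1e12      # target galaxy mass, solar masses

particles_per_galaxy = GALAXY_MASS / PARTICLE_MASS
print(f"~{particles_per_galaxy:.0f} particles resolve one galaxy-scale halo")

# 6 single-precision floats of state per particle, 4 bytes each.
bytes_per_particle = 6 * 4
total_tb = N_PARTICLES * bytes_per_particle / 1e12
print(f"~{total_tb:.1f} TB just to hold particle state in memory")
```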

The model is one of the largest simulations of the distribution of matter in the universe, and it aims to capture galaxy-scale mass concentrations in numbers beyond those seen in state-of-the-art sky surveys.

"We are trying to really understand how to more completely and more accurately describe the observable universe, so we can help in the design of future experiments and interpret observations from ongoing observations like the Sloan Digital Sky Survey-III. We are particularly interested in the Large Synoptic Survey Telescope (LSST) in Chile, in which LANL is an institutional member, and DOE and NASA's Joint Dark Energy Mission (JDEM)," said Habib. "To do the science in any sort of reasonable amount of time requires a petascale machine at the least."

The Roadrunner Universe model relies on a hierarchical grid/particle algorithm that best matches the physical aspects of the simulation to the hybrid architecture of Roadrunner. Habib and his team wrote an entirely new computer code that aggressively exploits Roadrunner's hybrid architecture and makes full use of the PowerXCell 8i computational accelerators. They also created a dedicated analysis and visualization software framework to handle the huge simulation database.
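
The team's code itself is not published here, but the grid/particle split it describes is the classic particle-mesh idea: particle mass is deposited onto a grid, where long-range gravity can be solved cheaply, while short-range interactions are handled per particle. A minimal one-dimensional cloud-in-cell deposit, with all names and details our own, might look like this in Python:

```python
import numpy as np

def cic_deposit(positions, masses, n_cells, box_size):
    """Cloud-in-cell: spread each particle's mass over its two
    nearest grid cells, weighted by distance (1-D for clarity)."""
    density = np.zeros(n_cells)
    dx = box_size / n_cells
    x = positions / dx                       # position in cell units
    left = np.floor(x).astype(int) % n_cells
    right = (left + 1) % n_cells             # periodic box
    frac = x - np.floor(x)                   # offset within the cell
    np.add.at(density, left, masses * (1 - frac))
    np.add.at(density, right, masses * frac)
    return density / dx                      # mass per unit length

# Toy usage: 100,000 random particles in a unit periodic box.
rng = np.random.default_rng(0)
pos = rng.random(100_000)
rho = cic_deposit(pos, np.ones(100_000), n_cells=256, box_size=1.0)
```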

"Our effort is aimed at pushing the current state of the art by three orders of magnitude in terms of computational and scientific throughput," said Habib. I'm confident the final database created by Roadrunner will be an essential component of dark universe science for years to come."

Mapping Darwinian evolutionary relationships results in an HIV family tree that may lead researchers to new vaccine focus areas

Supporting Los Alamos National Laboratory's role in the international Center for HIV/AIDS Vaccine Immunology (CHAVI) consortium, researchers are using the Roadrunner supercomputer to analyze vast quantities of genetic sequences from HIV-infected people in the hope of zeroing in on possible vaccine target areas.

Physicist Tanmoy Bhattacharya and HIV researcher Bette Korber have used samples taken by CHAVI across the globe – from both chronic and acute HIV patients – and created an evolutionary genetic family tree, known as a phylogenetic tree, to look for similarities in the acute versus chronic sequences that may identify areas where vaccines would be most effective.

In this study, the evolutionary history of more than 10,000 sequences from more than 400 HIV-infected individuals was compared.
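
The study's actual pipeline is not detailed here; as a toy illustration of the distance-based reasoning behind phylogenetic trees, the sketch below clusters a few hypothetical aligned sequences by pairwise mismatch (Hamming) distance, with SciPy's generic hierarchical clustering standing in for the dedicated maximum-likelihood phylogenetics software such studies typically use:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Hypothetical aligned sequences (0-3 encode A, C, G, T); real studies
# compare thousands of aligned HIV sequences.
seqs = np.array([
    [0, 1, 2, 2, 3, 0],   # patient A, acute
    [0, 1, 2, 3, 3, 0],   # patient B, acute
    [1, 1, 0, 3, 3, 2],   # patient C, chronic
    [1, 1, 0, 3, 2, 2],   # patient D, chronic
])

# Hamming distance = fraction of mismatched sites between two sequences.
dists = pdist(seqs, metric="hamming")

# Average-linkage clustering yields a tree relating the sequences.
tree = linkage(dists, method="average")
print(dendrogram(tree, no_plot=True)["ivl"])   # leaf order in the tree
```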

The idea, according to Korber, is to identify common features of the transmitted virus and attempt to create a vaccine that enables recognition of the original transmitted virus before the body's immune response causes the virus to react and mutate.

"DNA Sequencing technology, however, is currently being revolutionized, and we are at the cusp of being able to obtain more than 100,000 viral sequences from a single person," said Korber. "For this new kind data to be useful, computational advances will have to keep pace with the experimental, and the current study begins to move us into this new era."

"The petascale supercomputer gives us the capacity to look for similarities across whole populations of acute patients," said Bhattacharya. "At this scale we can begin to figure out the relationships between chronic and acute infections using statistics to determine the interconnecting branches – and it is these interconnections where a specially-designed vaccine might be most effective.

Government-wide emphasis on community access to data supports substantive push toward more open data

During the May 5th meeting of the National Science Board, National Science Foundation (NSF) officials announced a change in the implementation of the existing policy on sharing research data. In particular, beginning on or around October 2010, NSF plans to require that all proposals include a data management plan in the form of a two-page supplementary document. The research community will be informed of the specifics of the anticipated changes and the agency's expectations for the data management plans.

The changes are designed to address trends and needs in the modern era of data-driven science.

"Science is becoming data-intensive and collaborative," noted Ed Seidel, acting assistant director for NSF's Mathematical and Physical Sciences directorate. "Researchers from numerous disciplines need to work together to attack complex problems; openly sharing data will pave the way for researchers to communicate and collaborate more effectively."

"This is the first step in what will be a more comprehensive approach to data policy," added Cora Marrett, NSF acting deputy director. "It will address the need for data from publicly-funded research to be made public."

Seidel acknowledged that each discipline has its own culture about data-sharing, and said that NSF wants to avoid a one-size-fits-all approach to the issue. But for all disciplines, the data management plans will be subject to peer review, and the new approach will allow flexibility at the directorate and division levels to tailor implementation as appropriate.

This is a change in the implementation of NSF's long-standing policy that requires grantees to share their data within a reasonable length of time, so long as the cost is modest.

"The change reflects a move to the Digital Age, where scientific breakthroughs will be powered by advanced computing techniques that help researchers explore and mine datasets," said Jeannette Wing, assistant director for NSF's Computer & Information Science & Engineering directorate. "Digital data are both the products of research and the foundation for new scientific insights and discoveries that drive innovation."

NSF has a variety of initiatives focused on advancing the vision of data-intensive science. The issue is central to NSF's Sustainable Digital Data Preservation and Access Network Partners (DataNet) program in the Office of Cyberinfrastructure.

"Twenty-first century scientific inquiry will depend in large part on data exploration," said José Muñoz, acting director of the Office of Cyberinfrastructure. "It is imperative that data be made not only as widely available as possible but also accessible to the broad scientific communities."

Seidel noted that requiring the data management plans was consistent with NSF's mission and with the growing interest from U.S. policymakers in making sure that any data obtained with federal funds be accessible to the general public. Along with other federal agencies, NSF is subject to the Open Government Directive, an effort of the Obama administration to make government more transparent and more participatory.

Scientists use the Roadrunner supercomputer to model a fundamental process in physics that could help explain how stars begin to explode into supernovae

Despite decades of research, understanding turbulence, the seemingly random motion of fluid flows, remains one of the major unsolved problems in physics.

“With the Roadrunner supercomputer, we can now look in detail at previously inaccessible flows,” said Daniel Livescu of the Laboratory’s Computational Physics and Methods group. Using a technique known as direct numerical simulation (DNS), researchers solve the exact equations of fluid flow to calculate pressures, densities, and velocities at a resolution, in both time and space, high enough to resolve the smallest eddies in the turbulent flow. This makes the DNS results as “real” as experimental data, but it requires immense computer power.
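
As a one-dimensional stand-in for the three-dimensional compressible equations the team actually solves, the following minimal Python script integrates the viscous Burgers equation directly, with no turbulence model, on a grid fine enough that viscosity resolves the smallest scales. All parameter choices here are our own illustrative assumptions:

```python
import numpy as np

# Minimal 1-D "DNS" sketch: march the exact viscous Burgers equation
# forward in time with explicit finite differences on a periodic grid.
n, L, nu = 512, 2 * np.pi, 0.05
dx = L / n
x = np.linspace(0, L, n, endpoint=False)
u = np.sin(x)                          # initial velocity field
dt = 0.2 * min(dx, dx * dx / nu)       # stable explicit time step

for _ in range(2000):
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)          # advection
    d2udx2 = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2   # diffusion
    u = u + dt * (-u * dudx + nu * d2udx2)

print(f"kinetic energy after decay: {0.5 * np.mean(u**2):.4f}")
```

Even this toy problem hints at the cost: halving the smallest resolved scale in three dimensions multiplies the grid by eight and shrinks the stable time step, which is why resolving every eddy in a real flow demands petascale machines.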

In many instances, these simulations are the only way turbulence properties such as those found in cosmic explosions like supernovae can be accurately probed.  In these cases, turbulence is accompanied by additional phenomena such as exothermic reactions, shock waves, and radiation, which drastically increase the computational requirements.

Livescu and colleague Jamaludin Mohd-Yusof of the Laboratory’s Computational Physics and Methods group are using Roadrunner and a high-performance computational fluid dynamics code to perform the largest turbulent reacting flow simulations to date. The simulations consider the conditions encountered in the early stages of what is known as a “type Ia” supernova, which results from the explosion of a white dwarf star.

Type Ia supernovae have become a standard candle in cosmology because of their role in measuring distances in the universe. Yet how the explosion occurs is not fully understood. For example, the debate around the models that describe burn rate and explosion mechanics is still not settled. In addition, the flame speed — that is, the rate of expansion of a flame front in a combustion reaction — is one of the biggest unknowns in current models.

“Solving the flow problem in a whole supernova is still very far in the future,” said Livescu, “but accurately solving the turbulent flow in a small domain around a single flame, characterizing the early stages of the supernova, has become possible. The very high resolution reacting turbulence simulations enabled by Roadrunner can probe parameter values close to the detonation regime, where the flame becomes supersonic, and explore for the first time the turbulence properties under such complex conditions.”

Scientists working on 10 research projects have been awarded precious computing time on JUGENE, one of the most powerful supercomputers in the world. The projects, which cover fields as diverse as astrophysics, earth sciences, engineering and physics, gained access to JUGENE thanks to the PRACE ('Partnership for advanced computing in Europe') project.

Scientists in varied disciplines require access to supercomputers to solve some of the most pressing issues facing society today. PRACE is meeting this challenge head-on by establishing a high performance computing (HPC) research infrastructure in Europe. Its work is supported by the Research Infrastructures budget lines of the EU's Sixth and Seventh Framework Programmes (FP6 and FP7), and it has been identified as a priority infrastructure for Europe by ESFRI, the European Strategy Forum on Research Infrastructures.

JUGENE, which is hosted by Forschungszentrum Jülich in Germany, is the first supercomputer in the network and has the distinction of being Europe's fastest computer available for public research. Competition for access to this world-class facility is fierce; PRACE received 68 applications requesting a total of 1,870 million hours of computing time from this first call for proposals. The 10 winning projects, which are led by scientists in Germany, Italy, the Netherlands, Portugal and the UK, will share over 320 million core computing hours.

The successful projects were selected on the basis of their scientific and technical excellence, their clear need for access to a top supercomputer, and their ability to achieve significant research results within their allotted time.

Jochen Blumberger of University College London (UCL) in the UK has been awarded 24.6 million core hours to investigate electron transport in organic solar cells. Organic solar cells are a promising alternative to silicon-based solar cells. In addition to being cheap and easy to produce, they are light and flexible, meaning they can easily be fitted to windows, walls and roofs. On the downside, they suffer from a low light-to-electricity conversion efficiency. One reason for their low efficiency involves the fate of the photogenerated electrons. Dr Blumberger's work on JUGENE will advance our understanding of the processes taking place in organic solar cells.

Another project in the energy field comes from Frank Jenko of the Max Planck Institute for Plasma Physics in Germany. His 50 million core hour project, which will shed new light on plasma turbulence, represents a contribution to ITER, the major international fusion energy project.

Another UCL researcher, Peter Coveney, will use his 17 million core hour time budget to study turbulent liquids. Predicting the properties of turbulent fluids is extremely challenging, and Professor Coveney's work could have implications for our understanding of weather forecasting, transport and the dispersion of pollutants, gas flows in engines and blood circulation.

Meanwhile Zoltán Fodor of the Bergische Universität Wuppertal in Germany has been awarded 63 million core hours to go back in time to the start of the universe, to a period when infinitesimally small particles, such as quarks and gluons, combined to form protons and neutrons which in turn came together to form atomic nuclei. The goal of Dr Fodor and his team is to analyse the properties of strongly interacting matter under 'extreme conditions'.

Atmospheric boundary layers are at the heart of the 35 million core hour project submitted by Harmen Jonker of Delft University in the Netherlands. Boundary layers change as a result of daytime heating and wind shear. Understanding them is crucial for the generation of accurate weather, climate and air quality models.

The other projects awarded access to JUGENE in this round of calls for proposals focus on molecular dynamics, magnetic reconnection, the deformation of metals, supernovae and quarks.