President Barack Obama today announced his intent to nominate physicist Patrick Gallagher to be the 14th director of the U.S. Commerce Department’s National Institute of Standards and Technology (NIST). Gallagher, 46, is currently the NIST deputy director.
 
“NIST is a unique agency with a strong culture of world-class scientific achievement,” U.S. Commerce Secretary Gary Locke said. “Pat Gallagher has come up through the ranks and his continued leadership will be critical to an agency that is central to the nation’s ability to innovate and compete in global markets.” 
 
If confirmed by the Senate, Gallagher will direct an agency with an annual budget of approximately $800 million and a staff of some 2,900 scientists, engineers, technicians, and support and administrative personnel at two primary locations: Gaithersburg, Md., and Boulder, Colo. Gallagher will succeed William Jeffrey, who left NIST in 2007.
 
Though perhaps most widely known as the civilian provider of the nation’s standard time service, NIST also conducts research in measurement science, standards, and related technologies spanning all physical sciences, engineering and information technology. 
 
The agency also is home to the Hollings Manufacturing Extension Partnership, a nationwide network of local centers offering technical and business assistance to smaller manufacturers; the Technology Innovation Program, which provides cost-shared awards to industry, universities and consortia for research on potentially revolutionary technologies that address critical national and societal needs; and the Baldrige National Quality Program, which promotes performance excellence among U.S. manufacturers, service companies, educational institutions, health care providers and nonprofit organizations.
 
Gallagher, who has a doctorate in physics from the University of Pittsburgh, came to the NIST Center for Neutron Research (NCNR) in 1993 to pursue research in neutron and X-ray instrumentation and studies of soft-condensed matter systems such as liquids, polymers and gels. 
 
In 2000, Gallagher served as a NIST agency representative at the National Science and Technology Council (NSTC) and became active in U.S. policy for scientific user facilities; in 2006, that work earned him a Department of Commerce Gold Medal, the department’s highest award. In 2004, he became director of the NCNR, a national user facility for neutron research that is considered one of the most productive and heavily used facilities of its type in the nation. In September 2008, he was appointed deputy director of NIST.
 
Gallagher is active in a variety of professional organizations and is a member of the American Association for the Advancement of Science.
 
Founded in 1901, NIST is a nonregulatory agency of the Commerce Department that promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life.

The Premier of Victoria, John Brumby, today announced a significant collaboration with IBM to build in Melbourne the world’s most powerful supercomputer dedicated to life sciences research.

Mr Brumby said the supercomputer, to be based at the University of Melbourne in Parkville, would further boost Victoria’s reputation as a global centre of excellence in life sciences research.

“The Victorian Life Sciences Computational Initiative (VLSCI) will provide Victoria’s researchers with the necessary tools to solve some of the biggest challenges facing our health system and impacting our quality of life,” Mr Brumby said.

“The Victorian Government is taking action to support our world-class researchers and to invest in innovative projects that secure the state’s economy.

“That is why we have contributed $50 million towards the $100 million VLSCI with the University of Melbourne and IBM.

“The University of Melbourne’s supercomputer partnership with IBM will enable researchers to process genes to identify risk of cancer and treatment, model brain functions to treat brain disorders and disease, and model and predict the threats of infectious disease.

“The project will also create 30 new high-value jobs in Parkville.”

The supercomputer will be established in stages, with the aim of building to a system of more than 800 teraflops by 2012 – one teraflop equals one trillion calculations per second.
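As a quick check of the arithmetic behind that definition, a one-line conversion (the 800-teraflop figure is the announced 2012 target; everything else below is just the unit definition):

```python
TERAFLOP = 10**12  # one teraflop = one trillion floating-point calculations per second

peak_teraflops = 800  # announced 2012 target for the VLSCI system
calcs_per_second = peak_teraflops * TERAFLOP

print(f"{peak_teraflops} teraflops = {calcs_per_second:.1e} calculations per second")
```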

Mr Brumby said that by today’s standards the supercomputer would rank among the top six supercomputers worldwide.

“It will be more powerful than the supercomputer currently used by NASA in California,” he said.

Vice President of IBM Research Tilak Agerwala said that, as the largest IBM collaboration in the life sciences, the VLSCI holds great potential for driving new breakthroughs in the understanding of human disease and translating that knowledge into improved medical care.

“It gives IBM Research the opportunity to expand the impact of our Computational Biology Center,” Mr Agerwala said.

University of Melbourne Vice-Chancellor Professor Glyn Davis said the University’s partnership with IBM would further raise Victoria’s and Australia’s international profile as a life sciences precinct equal to the best in the world.

“The outcome of this partnership will strengthen the research capabilities of Victoria’s life sciences researchers and expand their capacity to carry out world-class life sciences research right here in Melbourne,” Professor Davis said.

For information about the partnership between UoM and IBM, visit http://www.ibm.com/research

Narrower constraints from the newest analysis aren’t quite narrow enough

The international Supernova Cosmology Project (SCP), based at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, has announced the Union2 compilation of hundreds of Type Ia supernovae, the largest collection ever of high-quality data from numerous surveys. Analysis of the new compilation significantly narrows the possible values that dark energy might take—but not enough to decide among fundamentally different theories of its nature.

“We’ve used the world’s best-yet dataset of Type Ia supernovae to determine the world’s best-yet constraints on dark energy,” says Saul Perlmutter, leader of the SCP. “We’ve tightened in on dark energy out to redshifts of one”—when the universe was only about six billion years old, less than half its present age—“but while at lower redshifts the values are perfectly consistent with a cosmological constant, the most important questions remain.”

That’s because possible values of dark energy from supernovae data become increasingly uncertain at redshifts greater than one-half, the range where dark energy’s effects on the expansion of the universe are most apparent as we look farther back in time. Says Perlmutter of the widening error bars at higher redshifts, “Right now, you could drive a truck through them.”
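The correspondence quoted above between redshift one and a roughly six-billion-year-old universe can be reproduced with a short flat-ΛCDM age calculation. The parameter values below (H0 = 70 km/s/Mpc, Ωm = 0.3) are conventional illustrative choices, not the Union2 best-fit results:

```python
import math

# Flat LambdaCDM parameters -- conventional illustrative values,
# not the Union2 best-fit results.
H0 = 70.0        # Hubble constant, km/s/Mpc
Om = 0.3         # matter density fraction today
OL = 1.0 - Om    # cosmological-constant fraction (flat universe)

# Convert H0 to inverse gigayears: 1 Mpc = 3.0857e19 km, 1 Gyr = 3.156e16 s
H0_per_Gyr = H0 / 3.0857e19 * 3.156e16

def age_at_redshift(z, z_max=1000.0, steps=200000):
    """Age of the universe at redshift z (in Gyr), from the midpoint-rule
    integral of dt = dz' / ((1 + z') * H(z')) between z and z_max.
    Radiation is ignored, which is negligible at these redshifts."""
    dz = (z_max - z) / steps
    total = 0.0
    for i in range(steps):
        zp = z + (i + 0.5) * dz
        H = H0_per_Gyr * math.sqrt(Om * (1 + zp) ** 3 + OL)
        total += dz / ((1 + zp) * H)
    return total

print(f"age at z = 1: {age_at_redshift(1.0):.1f} Gyr")  # close to 6 Gyr
print(f"age today:    {age_at_redshift(0.0):.1f} Gyr")
```

With these parameters the universe is about 5.7–5.8 billion years old at z = 1, consistent with the “about six billion years … less than half its present age” figure in the quote.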

As its name implies, the cosmological constant fills space with constant pressure, counteracting the mutual gravitational attraction of all the matter in the universe; it is often identified with the energy of the vacuum. If indeed dark energy turns out to be the cosmological constant, however, even more questions will arise.

“There is a huge discrepancy between the theoretical prediction for vacuum energy and what we measure as dark energy,” says Rahman Amanullah, who led SCP’s Union2 analysis; Amanullah is presently with the Oskar Klein Center at Stockholm University and was a postdoctoral fellow in Berkeley Lab’s Physics Division from 2006 to 2008. “If it turns out in the future that dark energy is consistent with a cosmological constant also at early times of the universe, it will be an enormous challenge to explain this at a fundamental theoretical level.”

A major group of competing theories posits a dynamical form of dark energy that varies in time. Choosing among theories means comparing what they predict about the dark energy equation of state, a value written as w. While the new analysis has detected no change in w, there remains much room for possibly significant differences in w with increasing redshift (written z).
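For a constant equation of state, w controls how the dark energy density dilutes with redshift, ρ(z)/ρ(0) = (1 + z)^(3(1 + w)); a cosmological constant (w = −1) keeps the density fixed. A minimal sketch of this dependence — the w = −0.9 alternative is purely illustrative, not a value from the analysis:

```python
def rho_de_ratio(z, w):
    """Dark energy density at redshift z relative to today, assuming a
    constant equation of state w: rho(z) / rho(0) = (1 + z)**(3 * (1 + w))."""
    return (1.0 + z) ** (3.0 * (1.0 + w))

# Cosmological constant (w = -1): the density never changes.
print(rho_de_ratio(1.0, -1.0))   # 1.0
# An illustrative alternative with w = -0.9 is about 23% denser at z = 1 --
# the kind of deviation that better high-redshift data could detect.
print(rho_de_ratio(1.0, -0.9))
```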

“Most dark-energy theories are not far from the cosmological constant at z less than one,” Perlmutter says. “We’re looking for deviations in w at high z, but there the values are very poorly constrained.”

In their new analysis to be published in the Astrophysical Journal, “Spectra and HST light curves of six Type Ia supernovae at 0.511 < z < 1.12 and the Union2 compilation,” the Supernova Cosmology Project reports on the addition of several well-measured, very distant supernovae to the Union2 compilation. The paper is now available online at http://arxiv4.library.cornell.edu/abs/1004.1711.

Dark energy fills the universe, but what is it?

Dark energy was discovered in the late 1990s by the Supernova Cosmology Project and the competing High-Z Supernova Search Team, both using distant Type Ia supernovae as “standard candles” to measure the expansion history of the universe. To their surprise, both teams found that expansion is not slowing due to gravity but accelerating.

Other methods for measuring the history of cosmic expansion have been developed, including baryon acoustic oscillation and weak gravitational lensing, but supernovae remain the most advanced technique. Indeed, in the years since dark energy was discovered using only a few dozen Type Ia supernovae, many new searches have been mounted with ground-based telescopes and the Hubble Space Telescope; many hundreds of Type Ia’s have been discovered; techniques for measuring and comparing them have continually improved.

In 2008 the SCP, led by the work of team member Marek Kowalski of the Humboldt University of Berlin, created a way to cross-correlate and analyze datasets from different surveys made with different instruments, resulting in the SCP’s first Union compilation. In 2009 a number of new surveys were added.

The inclusion of six new high-redshift supernovae found by the SCP in 2001, including two with z greater than one, is the first in a series of very high-redshift additions to the Union2 compilation now being announced, and brings the current number of supernovae in the whole compilation to 557.

“Even with the world’s premier astronomical observatories, obtaining good quality, time-critical data of supernovae that are beyond a redshift of one is a difficult task,” says SCP member Chris Lidman of the Anglo-Australian Observatory near Sydney, a major contributor to the analysis. “It requires close collaboration between astronomers who are spread over several continents and several time zones. Good team work is essential.”

Union2 has not only added many new supernovae to the Union compilation but has refined the methods of analysis and in some cases improved the observations. The latest high-z supernovae in Union2 include the most distant supernovae for which ground-based near-infrared observations are available, a valuable opportunity to compare ground-based and Hubble Space Telescope observations of very distant supernovae.

Type Ia supernovae are the best standard candles ever found for measuring cosmic distances because the great majority are so bright and so similar in brightness. Light-curve fitting is the basic method for standardizing what variations in brightness remain: supernova light curves (their rising and falling brightness over time) are compared and uniformly adjusted to yield comparative intrinsic brightness. The light curves of all the hundreds of supernovae in the Union2 collection have been consistently reanalyzed.
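One common form of this standardization (a SALT-style correction, shown here as an illustration rather than the SCP’s exact fitting procedure) adjusts the observed peak magnitude for light-curve width and color; the coefficient values below are illustrative assumptions, not Union2 fit results:

```python
def distance_modulus(m_B, x1, c, M_B=-19.3, alpha=0.13, beta=2.5):
    """SALT-style standardization of a Type Ia supernova: correct the
    observed peak magnitude m_B for light-curve width (stretch x1,
    brighter-slower) and color (c, brighter-bluer).
    M_B, alpha, and beta are illustrative values, not Union2 results."""
    return m_B - M_B + alpha * x1 - beta * c

# A hypothetical supernova: peak magnitude 23.0, slightly broad light
# curve (x1 = 0.5), slightly blue color (c = -0.05).
print(f"distance modulus: {distance_modulus(23.0, 0.5, -0.05):.2f} mag")
```

The distance modulus in turn gives the luminosity distance, which is what the cosmological fits compare against model predictions at each redshift.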

The upshot of these efforts is improved handling of systematic errors and improved constraints on the value of the dark energy equation of state with increasing redshift, although with greater uncertainty at very high redshifts. When combined with data from cosmic microwave background and baryon oscillation surveys, the “best fit cosmology” remains the so-called Lambda Cold Dark Matter model, or ΛCDM.

ΛCDM has become the standard model of our universe, which began with a big bang, underwent a brief period of inflation, and has continued to expand, although at first retarded by the mutual gravitational attraction of matter. As matter spread and grew less dense, dark energy overcame gravity, and expansion has been accelerating ever since.

To learn just what dark energy is, however, will first require scientists to capture many more supernovae at high redshifts and thoroughly study their light curves and spectra. This can’t be done with telescopes on the ground or even by heavily subscribed space telescopes. Learning the nature of what makes up three-quarters of the density of our universe will require a dedicated observatory in space.
