After running a series of complex supercomputer simulations, researchers have found that flaws in the structure of magnetic nanoscale wires play an important role in determining the operating speed of novel devices that use such nanowires to store and process information. The finding, made by researchers from the National Institute of Standards and Technology (NIST), the University of Maryland, and the University of Paris XI, will help to deepen the physical understanding of these next-generation devices and guide the interpretation of future experiments on them.

Magnetic nanowires store information in discrete bands of magnetic spins. One can imagine the nanowire as a straw sucking up and holding the liquid of a meticulously layered chocolate and vanilla milkshake, with the chocolate segments representing 1s and the vanilla 0s. The boundaries between these layers are called domain walls. Researchers manipulate the information stored on the nanowire using an electrical current to push the domain walls, and the information they enclose, through the wire and past immobile read and write heads.
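In computing terms, this scheme behaves like a shift register: the stored bits move, while the read head stays put. A toy Python sketch (with made-up bit values; the `apply_current_pulse` function is purely illustrative, not part of any real device interface) shows the idea:

```python
from collections import deque

# Toy model of a magnetic nanowire as a shift register.
# 1 = "chocolate" segment, 0 = "vanilla" segment; the boundaries
# between unlike neighbors correspond to domain walls.
wire = deque([1, 0, 1, 1, 0])
READ_HEAD = 0  # index of the stationary read head

def apply_current_pulse(wire):
    """A current pulse shifts every domain one position past the head."""
    wire.rotate(-1)

# Read the stored pattern by pushing it past the fixed head.
bits = []
for _ in range(len(wire)):
    bits.append(wire[READ_HEAD])
    apply_current_pulse(wire)

print(bits)  # [1, 0, 1, 1, 0] -- the data moved, the head did not
```

The speed at which such a device operates is set by how fast the current can push the domain walls, which is exactly the quantity the simulations examined.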

Interpretations of experiments seeking to measure how domain walls move have largely ignored the effects of "disorder"—usually the result of defects or impurities in the structure of the nanowires. To see how disorder affects the motion of these microscopic magnetic domains, NIST researchers and their colleagues introduced disorder into their computer simulations.

Their simulations showed that disorder, which causes friction within the nanowires, can increase the rate at which a current can move domain walls.

According to NIST physicist Mark Stiles, friction can cause the domain walls to move faster because they need to lose energy in order to move down the wire.

For example, when a gyroscope spins, it resists the force of gravity. If a little friction is introduced into the gyroscope's bearing, the gyroscope will fall over more quickly. Similarly, in the absence of damping, a domain wall will only oscillate from one side of the nanowire to the other without advancing. Disorder within the nanowire enables the domain walls to lose energy, which gives them the freedom to "fall" down the length of the wire as they move back and forth.

"We can say that the domain wall is moving as if it were in a system that has considerably greater effective damping than the actual damping," says NIST physicist and lead researcher Hongki Min. "This increase in the effective damping is significant enough that it should affect the interpretation of most future domain wall experiments."

Asthma, Cancer, Weather Disaster-Related Illnesses Cited Among Concerns

The vulnerability of people to the health effects of climate change is the focus of a report released today by an NIH-led federal interagency group that includes NOAA. The report, “A Human Health Perspective on Climate Change,” calls for coordinating federal research to better understand climate’s impact on human health and identifying how these impacts can be most effectively addressed. The report was published by Environmental Health Perspectives and the National Institute of Environmental Health Sciences. 

The report indicates what is known and the significant knowledge gaps in our understanding of the consequences of climate change on 11 major illness categories, including cancer, cardiovascular disease and stroke, asthma and other respiratory disorders, food-borne diseases and nutrition, weather and heat-related fatalities, and water and vector-borne infectious diseases.  

“To mitigate and adapt to the health effects of climate change, we must first understand them. This report is a vital new roadmap for doing that,” said Jane Lubchenco, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator. “There is an urgent need to get started, and I am pleased that we can bring NOAA climate science and NOAA capabilities in linking ocean and human health and a range of other monitoring and prediction tools to the table.”

Health experts from the National Institute of Environmental Health Sciences, the U.S. Environmental Protection Agency, the Centers for Disease Control and Prevention, the U.S. Department of Agriculture and NOAA contributed to the new report. Research recommendations include examining how diseases in marine mammals might be linked to human health; investigating how climate change might contaminate seafood, beaches and drinking water; and understanding the impact of atmospheric changes on heat waves and air-borne diseases. There are questions about the effects of increased rainfall and extreme weather events on sewage discharges and run-off and what this will mean for human health. Integrating human, terrestrial and aquatic animal health surveillance with environmental monitoring is recommended to better understand emerging health risks like Lyme disease, West Nile virus, malaria, and toxins from marine algae.

To address disaster planning and management, the report encourages research aimed at strengthening healthcare and emergency services, especially when events such as floods, drought and wildfires can affect human health both during and after an event. The report also identifies the need for more effective early warning systems providing, for example, an alert to those with cardiovascular disease on extreme heat days or when air pollution is high. Other issues include susceptible and displaced populations; public health and health care infrastructure; essential capacities and skills, particularly for modeling and prediction; the integration of climate observation networks with health impact and surveillance tools; and communication and education.

NOAA understands and predicts changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and conserves and manages our coastal and marine resources. Visit us at http://www.noaa.gov

Premier John Brumby has announced funding for two high performance computers at the Australian Synchrotron at Monash University.

Mr Brumby said the Victorian Government had invested almost $4 billion in strengthening technology and innovation in Victoria, building modern, cutting-edge infrastructure that is attracting the best researchers and innovators to the state.

The $8.3 million High Performance Computing (HPC) facility will take Victoria’s use of computational imaging to the next level.

“I am pleased to announce further funding for the newest high-tech research infrastructure in Victoria - $800,000 towards the new MASSIVE High Performance Computing facility,” he said.

“The Multi-modal Australian Sciences Imaging and Visualisation Environment - or MASSIVE - is the first facility of its kind in Australia, bringing together two high performance computers at the Australian Synchrotron.

“MASSIVE will be a centre of excellence for computational imaging and visualisation and offer researchers from a range of fields — including biomedicine, astronomy, engineering, geoscience and climate studies — unparalleled capacities to construct and view visualisations of the objects of their investigations.

“The facility will enable scientists to create, view and analyse high-resolution scientific images and 3D-models previously too large to visualise.”

Mr Brumby said the Victorian Government was committed to world-class innovative infrastructure that drives Victoria’s economy.

“We are committed to investing in Victoria’s supercomputers, collaboration tools and advanced gigabit networks to enhance the accessibility of costly scientific instruments — such as synchrotrons, gene sequencers, telescopes and sensor networks,” Mr Brumby said.

“By investing in these facilities we are boosting the State’s capacity to turn new ideas and technologies into valued products, services and solutions.”

MASSIVE is a partnership between some of the country's leading technology providers and research institutions — including the Australian Synchrotron, CSIRO, Victorian Partnership for Advanced Computing and Monash University.

The Premier of Victoria John Brumby today announced a significant collaboration with IBM to build the world’s most powerful supercomputer dedicated to life sciences research in Melbourne.

Mr Brumby said the supercomputer, to be based at the University of Melbourne in Parkville, would further boost Victoria’s reputation as a global centre for excellence in life sciences research.

“The Victorian Life Sciences Computational Initiative (VLSCI) will provide Victoria’s researchers with the necessary tools to solve some of the biggest challenges facing our health system and impacting our quality of life,” Mr Brumby said.

“The Victorian Government is taking action to support our world-class researchers and to invest in innovative projects that secure the state’s economy.

“That is why we have contributed $50 million towards the $100 million VLSCI with the University of Melbourne and IBM.

“The University of Melbourne’s supercomputer partnership with IBM will enable researchers to analyse genes to identify cancer risk and guide treatment, model brain function to treat brain disorders and disease, and model and predict the threat of infectious diseases.

“The project will also create 30 new high-value jobs in Parkville.”

The supercomputer will be established in stages, with the aim of building to a system of over 800 teraflops by 2012 – one teraflop of capacity enables a computer to make one trillion calculations per second.

Mr Brumby said that by today’s standards the supercomputer would rank among the top six supercomputers worldwide.

“It will be more powerful than the supercomputer currently used by NASA in California,” he said.

Vice President of IBM Research Tilak Agerwala said that, as IBM's largest collaboration in the life sciences, the VLSCI holds great potential for driving new breakthroughs in the understanding of human disease and translating that knowledge into improved medical care.

“It gives IBM Research the opportunity to expand the impact of our Computational Biology Center,” Mr Agerwala said.

University of Melbourne Vice-Chancellor Professor Glyn Davis said the University’s partnership with IBM would further raise the international profile of Victoria and Australia as a life sciences precinct equal to the best in the world.

“The outcome of this partnership will strengthen the research capabilities of Victoria’s life sciences researchers and expand their capacity to carry out world-class life sciences research right here in Melbourne,” Professor Davis said.

For information about the partnership between UoM and IBM visit http://www.ibm.com/research

Nautilus, the powerful computer for visualizing and analyzing large datasets at the Remote Data Analysis and Visualization Center (RDAV), goes into full production on Sept. 20. Managed by UT Knoxville and funded by a grant from the National Science Foundation (NSF), RDAV and Nautilus provide scientists with cutting-edge capabilities for analyzing and visualizing massive quantities of data. As Nautilus goes into service, RDAV will serve researchers in a wide range of fields.

“The National Institute for Mathematical and Biological Synthesis (NIMBioS) and RDAV have already initiated new collaborations to enhance the use of visualization for large datasets arising from field observations and from mathematical model results,” said Louis Gross, professor of ecology and evolutionary biology and mathematics at UT Knoxville and director of NIMBioS. “Our objective is to increase the ability of biologists to interpret and analyze complex, multivariate data to address fundamental and applied questions in the life sciences.”

“For a scientist, visualization is more than just generating pretty pictures,” said astrophysicist Bronson Messer of the Oak Ridge National Lab (ORNL) and UT Knoxville. “As our simulations grow larger and larger, visualization and the associated data analysis are absolutely essential in producing scientific insight from computation. Nautilus, and the way it is being integrated into the computational ecosystem at the National Institute for Computational Sciences, looks like a very promising avenue for us to increase the amount of scientific insight we obtain from simulations on Kraken.”

In addition to addressing scientific problems in the life sciences and astrophysics, Nautilus will be used to process data spanning many other research areas. These include visualizing data results from computer simulations with many complex variables, such as weather or climate modeling; analyzing large amounts of data coming from experimental facilities like ORNL’s Spallation Neutron Source; and aggregating and interpreting input from a large number of sensors distributed over a wide geographic region. The computer will also have the capability to study large bodies of text and aggregations of documents.

With 1,024 cores and four terabytes of memory, Nautilus can process large volumes of data and analyze them in ways unlike those of previous systems. Manufactured by SGI as part of its UltraViolet product line, Nautilus features a shared-memory architecture with a single system image. This configuration allows researchers great flexibility in taking advantage of computing power to analyze larger amounts of data in ways that are impossible on many other high-performance computing systems. RDAV has installed a 1.1-petabyte filesystem on Nautilus in anticipation of these vast amounts of data.
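The practical appeal of a single system image is that one analysis process can, in principle, address the machine's entire memory pool rather than the slice attached to a single node of a distributed cluster. A rough capacity calculation from the figures quoted above (illustrative arithmetic only):

```python
# Rough capacity figures for a shared-memory machine like Nautilus,
# based on the specs quoted above (1,024 cores, 4 TB of memory).
cores = 1024
total_memory_gb = 4 * 1024  # 4 terabytes expressed in gigabytes

# On a distributed-memory cluster, each process typically sees only
# its node's local share; under a single system image, one process
# can address the whole pool.
per_core_gb = total_memory_gb / cores

print(per_core_gb)      # 4.0 GB per core if divided evenly
print(total_memory_gb)  # 4096 GB addressable by a single process
```

For data analysis and visualization workloads, which often need an entire dataset resident in memory at once, that single large address space is the key advantage over splitting the data across nodes.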

Founded in the second half of 2009, the RDAV center is now fully staffed and ready to support scientists and their data and visualization challenges. In addition to configuring and fine-tuning the Nautilus hardware, the team has been developing and refining new and existing software to address the latest scientific problems and to engage the scientific community.

“I’m very excited about standing up this machine for the NSF TeraGrid, as it’s going to provide much-needed capability for understanding complex datasets from Kraken and other sources,” said RDAV Director Sean Ahern. “And as datasets continue to grow, the shared memory nature of the SGI is a fertile ground for new analysis and visualization research for very large datasets.”

RDAV is a partnership between UT Knoxville, Lawrence Berkeley National Laboratory, the University of Wisconsin, the National Center for Supercomputing Applications at the University of Illinois and ORNL. RDAV is funded by NSF through the American Recovery and Reinvestment Act of 2009 and is part of the TeraGrid eXtreme Digital Resources for Science and Engineering.

For more information, visit http://rdav.nics.tennessee.edu/.
