Conventional memories used in today's supercomputers only differentiate between the bit values 0 and 1. In quantum physics, however, arbitrary superpositions of these two states are possible. Most of the ideas for new quantum technology devices rely on this "Superposition Principle." One of the main challenges in using such states is that they are usually short-lived: only for a short period of time can information be read out of quantum memories reliably; after that, it is irrecoverable.

A research team at TU Wien has now taken an important step forward in the development of new quantum storage concepts. In cooperation with the Japanese telecommunication giant NTT, the Viennese researchers led by Johannes Majer are working on quantum memories based on nitrogen atoms and microwaves. The nitrogen atoms have slightly different properties, which quickly leads to the loss of the quantum state. By specifically changing a small portion of the atoms, one can bring the remaining atoms into a new quantum state, with a lifetime enhancement of more than a factor of ten. These results have now been published in the journal Nature Photonics.

Nitrogen in diamond

"We use synthetic diamonds in which individual nitrogen atoms are implanted", explains project leader Johannes Majer from the Institute of Atomic and Subatomic Physics of TU Wien. "The quantum state of these nitrogen atoms is coupled with microwaves, resulting in a quantum system in which we store and read information."

However, the storage time in these systems is limited by the inhomogeneous broadening of the microwave transition of the nitrogen atoms in the diamond crystal. After about half a microsecond, the quantum state can no longer be reliably read out; the actual signal is lost. Johannes Majer and his team took a concept known as "spectral hole burning", which allows data to be stored in inhomogeneously broadened media in the optical domain, and adapted it for superconducting quantum circuits and spin quantum memories.

Dmitry Krimer, Benedikt Hartl and Stefan Rotter (Institute of Theoretical Physics, TU Wien) have shown in their theoretical work that such states, which are largely decoupled from the disturbing noise, also exist in these systems. "The trick is to manoeuver the quantum system into these durable states through specific manipulation, with the aim to store information there," explains Dmitry Krimer.

Excluding specific energies

"The transitions areas in the nitrogen atoms have slightly different energy levels because of the local properties of the not quite perfect diamond crystal", explains Stefan Putz, the first author of the study, who has since moved from TU Wien to Princeton University. "If you use microwaves to selectively change a few nitrogen atoms that have very specific energies, you can create a "Spectral Hole". The remaining nitrogen atoms can then be brought into a new quantum state, a so-called "dark state", in the center of these holes. This state is much more stable and opens up completely new possibilities."

"Our work is a 'proof of principle' - we present a new concept, show that it works, and we want to lay the foundations for further exploration of innovative operational protocols of quantum data," says Stefan Putz.

With this new method, the lifetime of quantum states of the coupled system of microwaves and nitrogen atoms increased by more than one order of magnitude, to about five microseconds. By everyday standards this is still not very long, but in this case it is sufficient for important quantum-technological applications. "The advantage of our system is that one can write and read quantum information within nanoseconds," explains Johannes Majer. "A large number of operations is therefore possible during the microseconds in which the system remains stable."
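
As a rough back-of-the-envelope illustration of why narrowing the effective frequency distribution extends the lifetime (this is a toy calculation, not the group's model; the Lorentzian line shape and the widths are assumptions chosen only to match the roughly 0.5 and 5 microsecond scales quoted above), one can simulate the decay of the collective coherence of an inhomogeneously broadened ensemble:

```python
# Toy illustration (not the experiment): decay of the collective coherence of
# an inhomogeneously broadened spin ensemble. Detunings drawn from a Lorentzian
# of half-width gamma give |<exp(i*delta*t)>| ~ exp(-gamma*t), so a 10x
# narrower effective distribution -- the "dark state" at the centre of the
# spectral hole -- survives roughly 10x longer. The widths below are assumed
# values chosen only to reproduce the ~0.5 us and ~5 us scales in the article.
import numpy as np

rng = np.random.default_rng(0)
n_spins = 20_000

def coherence(gamma, t):
    """|<exp(i*delta*t)>| for detunings from a Lorentzian of half-width gamma (1/us)."""
    delta = gamma * rng.standard_cauchy(n_spins)
    return np.abs(np.exp(1j * np.outer(t, delta)).mean(axis=1))

t = np.linspace(0.0, 6.0, 121)          # microseconds
broad = coherence(1.4, t)               # full, inhomogeneously broadened ensemble
narrow = coherence(0.14, t)             # 10x narrower: spins forming the dark state

for label, c in [("full ensemble", broad), ("dark state", narrow)]:
    t_half = t[np.argmax(c < 0.5)]
    print(f"{label:13s}: coherence drops below 1/2 after ~{t_half:.1f} us")
```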

CAPTION An artificial diamond under the optical microscope. The diamond fluoresces due to a number of nitrogen defects.

Supercomputer algorithms can automatically interpret echocardiographic images and distinguish between pathological hypertrophic cardiomyopathy (HCM) and physiological changes in athletes' hearts, according to research from the Icahn School of Medicine at Mount Sinai (ISMMS), published online yesterday in the Journal of the American College of Cardiology.

HCM is a disease in which a portion of the myocardium enlarges, creating functional impairment of the heart. It is the leading cause of sudden death in young athletes. Diagnosing HCM is challenging since athletes can present with physiological hypertrophy, in which their hearts appear large, but do not feature the pathological abnormality of HCM. The current standard of care requires precise phenotyping of the two similar conditions by a highly trained cardiologist.

"Our research has demonstrated for the first time that machine-learning algorithms can assist in the discrimination of physiological versus pathological hypertrophic remodeling, thus enabling easier and more accurate diagnoses of HCM," said senior study author Partho P. Sengupta, MD, Director of Cardiac Ultrasound Research and Professor of Medicine in Cardiology at the Icahn School of Medicine at Mount Sinai. "This is a major milestone for echocardiography, and represents a critical step toward the development of a real-time, machine-learning-based system for automated interpretation of echocardiographic images. This could help novice echo readers with limited experience, making the diagnosis rapid and more widely available." 

Using data from an existing cohort of 139 male subjects who underwent echocardiographic imaging at ISMMS (77 verified athlete cases and 62 verified HCM cases), the researchers analyzed the images with tissue tracking software and identified variable sets to incorporate in the machine-learning models. They then developed an ensemble machine-learning model combining three different algorithms to differentiate the two conditions. The model demonstrated diagnostic ability superior to that of the conventional 2D echocardiographic and Doppler-derived parameters used in clinical practice.
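
For readers who want a concrete picture of what such an ensemble looks like in code, the sketch below combines three classifiers by voting on a synthetic stand-in for the echo-derived feature set. The choice of learners (support vector machine, random forest, small neural network), the synthetic features, and all parameters are illustrative assumptions, not the study's actual pipeline:

```python
# Illustrative sketch only: an ensemble of three classifiers voting on
# echo-derived features to separate HCM from athlete's heart. The feature
# matrix is synthetic; in the study, variables came from tissue-tracking
# analysis of echocardiographic images.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for 139 subjects (77 athletes, 62 HCM) described by strain-like features.
X, y = make_classification(n_samples=139, n_features=20, n_informative=8,
                           weights=[77 / 139, 62 / 139], random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("mlp", make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))),
    ],
    voting="soft",   # average predicted probabilities across the three learners
)

auc = cross_val_score(ensemble, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```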

"Our approach shows a promising trend in using automated algorithms as precision medicine techniques to augment physician-guided diagnosis," said study author Joel Dudley, PhD, Director of the Institute for Next Generation Healthcare and Director of the Center for Biomedical Informatics at ISMMS. "This demonstrates how machine-learning models and other smart interpretation systems could help to efficiently analyze and process large volumes of cardiac ultrasound data, and with the growth of telemedicine, it could enable cardiac diagnoses even in the most resource-burdened areas."

Powerful new model indicates that current pollution standards may be inadequate to ward off worsening algae blooms

New research suggests that Lake Champlain may be more susceptible to damage from climate change than was previously understood--and that, therefore, the rules created by the EPA to protect the lake may be inadequate to prevent algae blooms and water quality problems as the region gets hotter and wetter.

"This paper provides very clear evidence that the lake could be far more sensitive to climate change than is captured by the current approach of the EPA," said University of Vermont professor Asim Zia, the lead author of the new study. "We may need more interventions--and this may have national significance for how the agency creates regulations."

The research was published November 17 in the journal Environmental Research Letters.

MORE THAN MODEST

The study, led by a team of ten scientists from UVM and one from Dartmouth College, used a powerful set of supercomputer models that link the behavior of social and ecological systems. Their results show that accelerating climate change could easily outpace the EPA's land-use management policies aimed at reducing the inflow of pollution from agricultural runoff, parking lots, deforestation, cow manure, lawn fertilizer, pet waste, streambank erosion--and other sources of excess phosphorus that cause toxic algae and lake health problems.

The EPA's modeling to prepare its rules under what's called the TMDL, for "total maximum daily load," concluded that "any increases in the phosphorus loads to the lake due to the climate change are likely to be modest (i.e. 15%)," the agency writes. But the eleven scientists within the Vermont EPSCoR program at UVM who led the new modeling were concerned that this approach might underestimate the range of likely outcomes in a warmer future.

UVM professor Chris Koliba, a co-author and social scientist on the new study, observed that "there have been extensive efforts by federal regulators, the State of Vermont, and many other stakeholders to try to remediate and improve water quality in our watersheds. These should be honored. The message of our research is not to demean that work, but to say that in the long run protecting the lake is going to take a lot more than what's being proposed right now."

LIMITED OPTIONS

The new lake model, built with support from the National Science Foundation, integrates a much larger assembly of possible global climate models and greenhouse gas pathways than the EPA used in its current TMDL modeling. And the Vermont scientists delved deeply into the indirect and interactive effects of land use changes, "legacy phosphorus" that's been piling up for decades in the sediment at the bottom of the lake, and other factors. From this, they created a set of forecasts for what might happen to Lake Champlain over the coming decades, out to 2040--including changes in water quality, temperature, and the severity of algae blooms. Their result: a much more dramatic range of possible outcomes--and greater uncertainty--than those assumed in the EPA's approach.

In several of the plausible hotter and wetter scenarios that the model considers, a cascading set of problems could lead to phosphorus pollution levels in segments of Lake Champlain that "drastically limit land management options to maintain water quality," the team wrote--especially in shallow bays like Missisquoi Bay, which was the focus of the new study. In the long run, the risk of underestimating the impacts of climate change could lead to what the scientists call "intractable eutrophic conditions"--a permanent change in the lake that leads to self-perpetuating algae blooms, lost fisheries, and poor water quality.

NEW TOOL

The new integrated assessment model created by the NSF-funded team under the science leadership of Asim Zia provides a powerful tool that goes far beyond understanding Lake Champlain.

By connecting sub-models--of human behavior and land use, watershed dynamics, global climate models "downscaled" to the local region, and the hydrology of the lake itself--the overall model links together "the behavior of the watershed, lake, people and climate," said Judith Van Houten, UVM professor of biology, director of Vermont EPSCoR, and co-author on the new study. This provides "a way forward to pull back the veil that often surrounds effects of climate change," she says.
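
Schematically, this kind of integration amounts to passing each sub-model's output downstream, year by year, for every combination of climate model and emissions pathway. The sketch below is purely illustrative; the sub-model functions, variable names, and numbers are placeholders rather than the published model:

```python
# Schematic sketch of an integrated assessment loop: for each climate scenario,
# a downscaled climate forcing drives a land-use/watershed sub-model, whose
# phosphorus export drives a lake sub-model. All functions and numbers are
# placeholders standing in for the real coupled models.
import numpy as np

def downscaled_climate(scenario, year, rng):
    """Placeholder: annual temperature and precipitation anomalies for one scenario."""
    warm, wet = scenario
    return {"temp_anom": warm * (year - 2015) / 25 + rng.normal(0, 0.3),
            "precip_anom": wet * (year - 2015) / 25 + rng.normal(0, 0.05)}

def watershed_p_export(climate, land_use_intensity):
    """Placeholder: phosphorus load (t/yr) rises with runoff and land-use intensity."""
    return 100 * land_use_intensity * (1 + 2.0 * climate["precip_anom"])

def lake_response(p_load, legacy_p, climate):
    """Placeholder: in-lake P index, with legacy sediment P released faster when warm."""
    internal = 0.02 * legacy_p * (1 + 0.5 * climate["temp_anom"])
    return p_load * 0.3 + internal

rng = np.random.default_rng(1)
scenarios = {"modest warming": (1.0, 0.05), "hot and wet": (3.0, 0.20)}

for name, scenario in scenarios.items():
    legacy_p = 500.0                      # placeholder stock of legacy phosphorus in sediments
    for year in range(2016, 2041):
        climate = downscaled_climate(scenario, year, rng)
        load = watershed_p_export(climate, land_use_intensity=1.0)
        conc = lake_response(load, legacy_p, climate)
        legacy_p += 0.2 * load            # a fraction of each year's load accumulates
    print(f"{name:15s}: indicative in-lake P index in 2040 = {conc:.0f}")
```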

"Integrating these models is an enormous achievement that will be exportable across the US and be of practical use to many states and countries as they try to develop policies in the face of climate change," she said. It can allow lake and land managers to test scenarios that draw in a huge range of time scales and types of interactions, ranging from water chemistry to air temperature to land use policies.

Only by solving this kind of model-of-many-models problem, "as we have done," Van Houten said, could a tool be created that has predictive power for decades ahead, "allowing stakeholders to test their ideas," she says, and even "describing the health of the lake out to the turn of the century."

UVM hydrologist Arne Bomblies, a co-author on the study, noted that, "We show through this modeling work the importance of a more comprehensive consideration of climate change impact mechanisms to achieve water quality goals, and the need to adequately address climate change uncertainty."

"Lake Champlain's future is sensitive to climate change," Bomblies said, "and similar challenges are faced by other impaired waters throughout the United States." CAPTION What future for Lake Champlain? A powerful new model from a team of scientists at the University of Vermont suggests that climate change may pose greater risks to the health of the lake than previously realized. The results may have implications for how the EPA and others manage and regulate not just this international lake but other freshwater lakes across the nation. CREDIT Joshua Brown/UVM

CAPTION Air and water interactions are a key component of Great Lakes weather and climate. A new supercomputer model better connects these processes to create more accurate forecasts. CREDIT Michigan Tech, Sarah Bird

Up until now, atmospheric models and hydrodynamic models have remained separate to a large extent in the Great Lakes region, with only a few attempts to loosely couple them. In a new study, published online this week in the Journal of Climate, an integrated model brings together climate and water models. 

The collaborative work is the product of researchers from Michigan Technological University, Loyola Marymount University, LimnoTech and the National Oceanic and Atmospheric Administration's Great Lakes Environmental Research Laboratory. Pengfei Xue, an assistant professor of civil and environmental engineering at Michigan Tech, led the study through his work at the Great Lakes Research Center on campus. 

"One of the important concepts in climate change, in addition to knowing the warming trend, is understanding that extreme events become more severe," Xue says. "That is both a challenge and an important focus in regional climate modeling."

To make those connections, the model specifically uses two-way coupling and 3-dimensional modeling to connect atmospheric and lake body interactions. Two-way coupling is like a two-way street and enables feedback between variables; other models use preset inputs that act more like one-way streets. Current models also rely on 1-D lake models that cannot account for the dynamic nature of hydrologic processes in bodies of water as large as the Great Lakes. 

For comparison, most widely used global climate models use only tens of grid points (roughly 0.5 degree resolution) to cover all of the Great Lakes, if they account for the lakes at all. To create a more nuanced view, like what has been accomplished already in ocean coastline modeling, the new model simulates the hydrodynamics of the Great Lakes region with a 3-D hydrodynamic model constructed of 40 vertical layers at 2-kilometer horizontal grid resolution. That's roughly 50,000 grid cells per layer, which enables feedback between air and water data.
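
The essence of two-way coupling is an exchange loop: at each coupling interval the atmosphere component receives the lake surface temperature and returns surface fluxes, which the 3-D lake component then uses to update its state. The sketch below illustrates only that exchange pattern; the toy grid, time step, and relaxation "physics" are assumptions, not the published model:

```python
# Schematic two-way coupling: each coupling interval, the atmosphere component
# receives the lake surface temperature (LST) and returns a surface heat flux,
# which the 3-D lake component uses to update its state. The "physics" here is
# a toy relaxation, not the actual coupled model.
import numpy as np

nx, ny, nz = 50, 50, 40                    # toy grid; the real lake model uses ~2 km cells, 40 layers
lake_temp = np.full((nz, ny, nx), 6.0)     # deg C throughout the water column
lake_temp[0] += 2.0                        # slightly warmer surface layer
air_temp = np.full((ny, nx), 12.0)         # deg C

def atmosphere_step(air_temp, lake_surface_temp):
    """Toy atmosphere: returns updated air temp and surface heat flux (W/m^2) into the lake."""
    heat_flux = 15.0 * (air_temp - lake_surface_temp)   # bulk-formula-style exchange
    air_temp = air_temp - 0.001 * heat_flux             # air cools where it heats the lake
    return air_temp, heat_flux

def lake_step(lake_temp, heat_flux):
    """Toy 3-D lake: surface layer absorbs the flux, then heat mixes weakly downward."""
    lake_temp = lake_temp.copy()
    lake_temp[0] += heat_flux / (4.2e6 * 5.0) * 3600.0  # 5 m surface layer, 1-hour step
    lake_temp[1:] += 0.01 * (lake_temp[:-1] - lake_temp[1:])
    return lake_temp

for hour in range(24):                                         # one day of hourly exchanges
    air_temp, flux = atmosphere_step(air_temp, lake_temp[0])   # lake -> atmosphere
    lake_temp = lake_step(lake_temp, flux)                     # atmosphere -> lake

print(f"mean LST after one day: {lake_temp[0].mean():.2f} C, "
      f"mean air temp: {air_temp.mean():.2f} C")
```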

The datasets involved are so large that the model can only run on a supercomputer. Xue uses the Superior supercomputer at the Great Lakes Research Center. Xue and his team vetted the model's accuracy by comparing its simulations to historical records and satellite data.

"This kind of approach has been recognized as a critical step in the Great Lakes region that has been building over the past decade," Xue says. 

The next stage of the research will expand the model to include surface water runoff. Refining the model is a community effort, and the team plans to work with current collaborators to apply and test the limits of the model. 

In its current version, the new model provides better footing to further Great Lakes research. By doing so, scientists will glean more information about everything from regional climate change and shipping to oil spill mitigation and invasive species.

Researchers from David Karl's laboratory at the University of Hawai'i at Mānoa (UHM) and from Professor Jens Nielsen's laboratory at Chalmers University of Technology in Göteborg, Sweden, developed a supercomputer model which takes into account hundreds of genes, chemical reactions, and compounds required for the survival of Prochlorococcus, the most abundant photosynthetic microbe on the planet. They found that Prochlorococcus has made extensive alterations to its metabolism as a way to reduce its dependence on phosphorus, an element that is essential and often growth-limiting in the ocean.

Revolutionary developments in gene sequencing technology have allowed scientists to catalog and investigate the genetic diversity and metabolic capability of life on Earth--from E. coli bacteria to humans, and much in between. Ocean monitoring and advances in oceanographic sensors have enabled a more detailed look than ever before at the environmental conditions that are both the consequence of microbial activity and act as stressors on the growth of microbes in the ocean. 

This new metabolic model opens a window into what was previously a black box: the inner workings that enable microbes to dominate Earth's chemical and biological cycles, thrive in the harshest conditions, and make the planet habitable.

Microbes are known to employ three basic strategies to compete for limiting elemental resources: cell quotas may be adjusted, stressed cells may synthesize molecules to make more efficient use of available resources, and cells may access alternative or more costly sources of the nutrient.

In the case of phosphorus, a limiting resource in vast oceanic regions, the cosmopolitan Prochlorococcus thrives by adopting all three strategies and a fourth, previously unknown strategy. 

"By generating the first detailed model of metabolism for an ecologically important marine microbe, we found that Prochlorococcus has evolved a way to reduce its dependence on phosphate by minimizing the number of enzymes involved in phosphate transformations, thus relieving intracellular demands" said John Casey, an oceanography doctoral candidate in the UHM School of Ocean and Earth Science and Technology and lead author of the recently published study. 

Prochlorococcus has an extremely minimal genome. If it were to lose the function of any one metabolic gene, its survival would be nearly a coin toss. To their surprise, Casey and co-authors discovered that the world's most abundant microbe has performed, through a process called "genome streamlining"--the concerted loss of superfluous genes over evolutionary time--a comprehensive redesign of the core metabolic pathways in response to the persistent limitation of phosphorus.
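
Genome-scale models of this kind are typically analyzed with flux balance analysis, in which simulated knockouts reveal how little redundancy a streamlined network leaves. The sketch below uses the COBRApy toolkit on a deliberately tiny toy network, not the published Prochlorococcus reconstruction; every metabolite, reaction, and bound is a placeholder:

```python
# Toy flux-balance sketch (COBRApy), not the published Prochlorococcus model:
# a minimal network in which biomass needs both carbon and phosphate, so
# knocking out the single phosphate-uptake reaction abolishes growth,
# illustrating the fragility of a streamlined metabolic network.
import cobra

model = cobra.Model("toy_streamlined_cell")
co2 = cobra.Metabolite("co2_c", name="CO2", compartment="c")
pi = cobra.Metabolite("pi_c", name="phosphate", compartment="c")
biomass = cobra.Metabolite("biomass_c", name="biomass", compartment="c")

def reaction(rxn_id, stoich, lb=0.0, ub=1000.0):
    """Build a reaction from a {metabolite: coefficient} dictionary."""
    rxn = cobra.Reaction(rxn_id)
    rxn.lower_bound, rxn.upper_bound = lb, ub
    rxn.add_metabolites(stoich)
    return rxn

model.add_reactions([
    reaction("CO2_uptake", {co2: 1.0}, ub=10.0),                    # carbon source
    reaction("PI_uptake", {pi: 1.0}, ub=1.0),                       # sole phosphate source
    reaction("BIOMASS", {co2: -10.0, pi: -0.5, biomass: 1.0}),      # growth reaction
    reaction("BIOMASS_sink", {biomass: -1.0}),                      # drain biomass
])
model.objective = "BIOMASS"

print(f"growth with full network: {model.optimize().objective_value:.2f}")
with model:                                   # temporary knockout, reverted on exit
    model.reactions.PI_uptake.knock_out()
    print(f"growth without phosphate uptake: {model.optimize().objective_value:.2f}")
```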

"The dramatic and widespread change in the metabolic network is really a shock," said Casey. "However, we're seeing that these changes provide a substantial growth advantage for this ubiquitous microbe in phosphorus-limited regions of the ocean, so it seems that where there's a will there's a way."

The computer model is built from an enormous library of genetic data compiled from researchers around the world, and the results are validated with data from numerous laboratory culture experiments and field studies. 

"We're interested in the underlying principles guiding metabolism and physiology in marine microbes, and that is going to require a deep understanding of not only the 1-dimensional genetic code, but also the 4-dimensional product it codes for," said Casey. "So we're looking to a systems-level approach to incorporate a great variety of physiological and 'omics studies all in one computational structure, with the hope that we can start to learn from the design and interactions of these complex systems."

In the future, the researchers plan to expand the model to include more representatives of the marine microbial community and to look deeper into micro-diversity within the Prochlorococcus genus.

"This will allow us to simulate marine microbial community metabolism at an unprecedented level of detail; embedding these fine-scale simulations within global ocean circulation models promises to deliver insights into how microbial assemblages interact with their environment and amongst each other," said Casey.

 CAPTION Prochlorococcus, the most abundant photosynthetic microbe on the planet, is found in the Pacific Ocean (shown) and around the globe. CREDIT Tara Clemente, University of Hawaii SOEST
