Voltaire has announced that it continues to grow its adVantage Partner Program, which now includes more than fifty partner organizations consisting of resellers, VARs and distributors. The program provides comprehensive sales, marketing, technical training and support services to help partners worldwide sell Voltaire's industry-leading portfolio of 10 Gigabit Ethernet (10GbE) and InfiniBand datacenter solutions.

New members of the program include Dasher Technologies and Torrey Point, both of which recommend Voltaire products to their customers because of Voltaire's differentiated, solutions-oriented approach to networking. In addition to 10GbE and InfiniBand switching platforms, the Voltaire portfolio includes unique software offerings that address key networking requirements such as managing networks in the age of cloud computing and virtualized datacenters, accelerating access to storage, and reducing latency for applications such as high-frequency trading.

"Our strong technical expertise and vendor independence allow us to integrate best-of-breed software, hardware and services into a custom solution that directly impacts the business, said Al Chien, Vice President of Sales and Marketing, Dasher Technologies. "Voltaire's approach to 10GbE networks includes a software management layer in addition to the switches that is unique in the industry and serves as the foundation for cloud networking. We are pleased to be part of the adVantage program and to offer Voltaire solutions to our customers."

"Torrey Point designs networks with a focus on maximizing the return on technology investment for our worldwide clientele," said Steve Fazio, chief executive officer, Torrey Point. "Voltaire's 10GbE solutions meet the new infrastructure requirements for datacenter and cloud computing environments of all sizes. We are pleased to add Voltaire products to our offering."

"Growing the adVantage Partner Program to more than fifty channel partners worldwide represents a significant milestone for Voltaire's continued expansion into the 10GbE market," said Patrick Guay, executive vice president of global sales and general manager at Voltaire. "We are continually expanding our product set while providing our partners with more technical and sales content via our partner portal to enhance the program. The adVantage program provides our partners with the fundamental tools to help differentiate their business and to deliver an outstanding customer experience. The formula is simple: provide leading-edge products that enable unique solutions -- at an extremely competitive price point, with up-to-date technical training, and outstanding support."

After running a series of complex supercomputer simulations, researchers have found that flaws in the structure of magnetic nanoscale wires play an important role in determining the operating speed of novel devices that use such nanowires to store and process information. The finding, made by researchers from the National Institute of Standards and Technology (NIST), the University of Maryland, and the University of Paris XI, will help to deepen physical understanding and guide the interpretation of future experiments on these next-generation devices.

Magnetic nanowires store information in discrete bands of magnetic spins. One can imagine the nanowire as a straw sucking up and holding the liquid of a meticulously layered chocolate and vanilla milkshake, with the chocolate segments representing 1s and the vanilla 0s. The boundaries between these layers are called domain walls. Researchers manipulate the information stored on the nanowire using an electrical current to push the domain walls, and the information they enclose, through the wire and past immobile read and write heads.

Interpretations of experiments seeking to measure how domain walls move have largely ignored the effects of "disorder"—usually the result of defects or impurities in the structure of the nanowires. To see how disorder affects the motion of these microscopic magnetic domains, NIST researchers and their colleagues introduced disorder into their computer simulations.

Their simulations showed that disorder, which causes friction within the nanowires, can actually increase the rate at which a current moves domain walls.

According to NIST physicist Mark Stiles, friction can cause the domain walls to move faster because they need to lose energy in order to move down the wire.

For example, when a gyroscope spins, it resists the force of gravity. If a little friction is introduced into the gyroscope's bearing, the gyroscope will fall over more quickly. Similarly, in the absence of damping, a domain wall will only oscillate from one side of the nanowire to the other. Disorder within the nanowire enables the domain walls to lose energy, which gives them the freedom to "fall" down the length of the wire as they move back and forth.
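
The NIST result comes from large micromagnetic simulations, which are not reproduced here. The qualitative effect can nevertheless be seen in the textbook one-dimensional (Schryer-Walker) collective-coordinate model of a field-driven domain wall, in which the wall position q and internal tilt angle phi obey two coupled equations: above the breakdown field the wall precesses, and its time-averaged drift speed grows with the damping constant. The Python sketch below integrates that toy model with made-up dimensionless parameters; it illustrates the general mechanism, not the team's code.

```python
import math

# Toy 1D (Schryer-Walker) collective-coordinate model of a field-driven
# domain wall: q is the wall position, phi its internal tilt angle.
# Dimensionless units: gyromagnetic ratio, wall width, and anisotropy
# field H_K are all set to 1. Illustrative only.

def rhs(q, phi, alpha, H, HK=1.0):
    pre = 1.0 / (1.0 + alpha**2)
    s = math.sin(2.0 * phi)
    dq   = pre * (alpha * H + 0.5 * HK * s)   # drift of the wall
    dphi = pre * (H - 0.5 * alpha * HK * s)   # precession of the tilt
    return dq, dphi

def average_velocity(alpha, H=1.0, dt=0.01, t_end=2000.0, t_skip=200.0):
    q, phi, t = 0.0, 0.0, 0.0
    q_ref = t_ref = None
    while t < t_end:
        # classic fixed-step 4th-order Runge-Kutta
        k1 = rhs(q, phi, alpha, H)
        k2 = rhs(q + 0.5*dt*k1[0], phi + 0.5*dt*k1[1], alpha, H)
        k3 = rhs(q + 0.5*dt*k2[0], phi + 0.5*dt*k2[1], alpha, H)
        k4 = rhs(q + dt*k3[0], phi + dt*k3[1], alpha, H)
        q   += dt * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0]) / 6.0
        phi += dt * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1]) / 6.0
        t += dt
        if q_ref is None and t >= t_skip:
            q_ref, t_ref = q, t        # discard the initial transient
    return (q - q_ref) / (t - t_ref)

# Well above the Walker field (alpha/2 in these units), the time-averaged
# drift velocity increases with damping: the toy analogue of the finding.
for alpha in (0.01, 0.02, 0.05):
    print(f"alpha = {alpha:.2f}:  <v> = {average_velocity(alpha):.4f}")
```

With these parameters the average drift comes out roughly proportional to the damping constant, the toy-model analogue of the paper's conclusion that disorder raises the effective damping and thereby speeds the walls along the wire.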

"We can say that the domain wall is moving as if it were in a system that has considerably greater effective damping than the actual damping," says NIST physicist and lead researcher Hongki Min. "This increase in the effective damping is significant enough that it should affect the interpretation of most future domain wall experiments."

Astronomy & Astrophysics is publishing a special feature of 31 articles describing the data gathered by Planck over 15 months of observations and released by ESA and the Planck Collaboration in March 2013. This series of papers presents the initial scientific results extracted from this first Planck dataset.

The Planck satellite was launched in May 2009. With the highest accuracy to date, it measures the remnants of the radiation that filled the Universe immediately after the Big Bang. This is the oldest light in the Universe, emitted when the Universe was 380,000 years old, and it is observed today as the cosmic microwave background (CMB). Its intensity peaks at about 150 GHz (a wavelength of 2 mm), and its temperature is about 3 K. The study of the CMB is currently a very active field of research in cosmology because it provides strong constraints on cosmological models. In particular, observations of the CMB confirm the key predictions of the Big Bang model and, more precisely, of what cosmologists call the concordance model of cosmology.
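
The quoted peak can be sanity-checked with the blackbody law: in frequency units, the Planck spectral radiance is maximal where x = h*nu/(k*T) satisfies x = 3(1 - exp(-x)), i.e. x ~ 2.82. A minimal Python check using only standard physical constants (nothing here comes from the Planck papers themselves) reproduces the figure quoted above:

```python
import math

# Peak frequency of a T = 2.725 K blackbody (the CMB temperature).
# The spectral radiance B_nu peaks where x = h*nu/(k*T) solves
# x = 3 * (1 - exp(-x)).
h = 6.62607015e-34   # Planck constant, J*s
k = 1.380649e-23     # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s
T = 2.725            # CMB temperature, K

x = 3.0
for _ in range(50):               # fixed-point iteration, converges quickly
    x = 3.0 * (1.0 - math.exp(-x))

nu_peak = x * k * T / h
print(f"x ~ {x:.4f}")
print(f"peak frequency ~ {nu_peak / 1e9:.0f} GHz")       # about 160 GHz
print(f"peak wavelength ~ {c / nu_peak * 1e3:.1f} mm")   # about 1.9 mm
```

This gives roughly 160 GHz (a wavelength of about 1.9 mm), consistent with the "about 150 GHz (2 mm)" stated above.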

Planck was designed to measure the emission from the entire sky at nine distinct wavelengths, ranging from the radio (1 cm) to the far-infrared (300 microns). Several distinct sources of emission, both Galactic and extragalactic in origin, contribute to the features observed in each of the nine frequency bands. Radio emission from the Milky Way is most prominent at the longest wavelengths, and thermal dust emission at the shortest. Other galaxies contribute to the mix, mostly as unresolved sources. In the middle of Planck's wavelength range, the CMB dominates the sky at intermediate and high Galactic latitudes. The spectral and spatial signatures of all these sources are used to extract an all-sky image of the tiny temperature anisotropies of the CMB with unprecedented accuracy. The properties of these fluctuations are used to derive the parameters characterizing our Universe at early times.

Papers II to X in the series describe the huge dataset obtained from the Planck satellite and released in March 2013. Using this dataset, the Planck team established a new "cosmic recipe": the relative proportions of the Universe's constituent ingredients. Normal matter, which makes up stars and galaxies, contributes just 4.9% of the energy content of the Universe. Dark matter, to date detected only indirectly through its gravitational influence on galaxies and galaxy clusters, is found to make up 26.8%, more than previous estimates. Conversely, dark energy, the mysterious agent thought to be responsible for accelerating the expansion of the Universe, accounts for 68.3%, less than previously thought. The Planck team also published a new value for the age of the Universe: 13.8 billion years (see Paper XVI).

The Planck team also studied the statistical properties of the CMB in great detail. Papers XXIII, XXIV, and XXVI explore the statistical distribution of its temperature anisotropies. There is no evidence of any deviation from isotropy on small angular scales. While the observations on small and intermediate angular scales agree extremely well with the model predictions, Planck has now provided the first indisputable evidence that the distribution of primordial fluctuations was not the same on all scales, with less structure than expected at the largest scales. One anomalous signal is a substantial asymmetry between the two opposite hemispheres of the sky: one hemisphere shows a significantly stronger average signal than the other. Among the other major results, Paper XXII of the series explores how the Planck data constrain theories of cosmic inflation; it currently puts the tightest constraints on inflation.

The CMB is not only a picture of the Universe as it was 13.8 billion years ago; it was also distorted during its journey to us, because the CMB photons interacted with the large-scale structures they traveled through (such as galaxies and galaxy clusters). In Paper XVII of the series, the team extracts from the Planck data an all-sky map of the gravitational lensing effect imprinted on the CMB. The map published in this paper provides a new way to probe the evolution of structures in the Universe over its lifetime.

A byproduct of the Planck all-sky maps is a set of catalogs of compact sources. Paper XXIX describes the production of the largest catalog of galaxy clusters based on the Sunyaev-Zeldovich effect, a distortion of the CMB spectrum caused by very energetic electrons in a galaxy cluster, which kick CMB photons to higher energies. This catalog was used to estimate cosmological constraints, as described in Paper XX.

With the 2013 release of the intensity signal measured during the first 15 months of observation, Planck data are providing major new advances in several domains of cosmology and astrophysics. In the near future, the Planck Collaboration will release a new dataset that includes all of its observations in both intensity and polarization. This dataset will be a lasting legacy for the community.

Nate Silver and Richard Feynman walk into a bar and bump into a biologist . . .

While this may sound like the setup to some late-night nerd sketch, researchers have taken this premise and applied it to an increasingly cumbersome problem in modern biology, namely, finding meaning in the rising oceans of genomic data.

In this specific instance, the data comprises reams of cancer mutations that genome-wide studies are publishing at a dizzying rate. The challenge is finding new and efficient ways to parse the signal from the noise (and there is no shortage of noise).

As a new way to tackle this, a group of scientists has fused the power of statistical physics and artificial intelligence into a mathematical toolkit that can turn cancer-mutation data into multidimensional models of how specific mutations alter the social networks of proteins in cells. From these models they can deduce which of the myriad mutations present in cancer cells might actually play a role in driving disease.

At the core of this new approach is an algorithm based on statistical mechanics, a branch of theoretical physics that predicts the macroscopic properties of a system from the statistical behavior of its microscopic components.

"Here we have found that a fundamental concept in statistical mechanics, which many of us learned as undergraduates in theoretical physics courses and then largely forgot because it didn't apply to our everyday lives as biologists, can be relevant to one of the most difficult problems in cancer genetics," said Peter Sorger, the HMS Otto Krayer Professor of Systems Pharmacology and senior author on the paper.

These findings, which are among the first to be produced from the new Laboratory of Systems Pharmacology (LSP), were published online November 2 in Nature Genetics.

Dark Matter Matters

Many of the most widely studied cancer genes, such as p53 and Ras, were discovered after decades of work by many groups. But today, in the era of high-throughput genomics, we have thousands of times more data, drawn from thousands of samples. As a result, the sheer volume of catalogued cancer mutations is vast. But not all mutations actually influence tumor behavior. Many appear to be along for the ride, so to speak, and as a result are called "passenger mutations." To separate the drivers from the passengers, researchers typically use a kind of "polling" strategy in which they identify the most common mutations, reasoning that those are the significant ones. Only the most promising candidates are then subjected to the detailed and painstaking analysis that has been applied to p53 and Ras.

Mohammed AlQuraishi, an independent HMS Systems Biology fellow associated with the LSP and the Sorger lab and lead author of the paper, reasoned that biologists were in dire need of more biophysically rigorous tools for scouring these data. With a background in genetics, statistics and physics, AlQuraishi realized that biologists could exploit the statistical power of large datasets and marry it to theoretical physics. "It's the way that Silver and Feynman together would do it," he joked.

Statistical mechanics is a precise physical description of how collections of individual molecules give rise to the macroscopic properties we perceive, such as temperature and pressure. AlQuraishi used its core principles as the basis for a platform that would analyze information housed in the Cancer Genome Atlas. As a result he was able to generate detailed schematics of how certain mutations altered the vast, complex cellular world of protein social networks—networks that largely determine a cell's health, or lack thereof. In doing so, he stumbled upon a few unexpected findings.

Again, many cancer mutations are common, and many more are rare—some so rare that they occur in only a handful of patients. AlQuraishi found that common and rare mutations are equally likely to affect the protein network.

"Both kinds of mutations are equally strong," he said. "In both cases, about one percent of the common and one percent of the rare mutations alter the tumor networks we studied. But rare mutations are being largely ignored. We need to start paying attention to them."

For every common mutation, there are approximately four rare ones, so, based on numbers, rare mutations might be much more significant than previously suspected. "That's where much of the action is, in the rare mutations. We've long considered this large universe of rare mutations to be dark matter, but here we have just found that all this dark matter actually matters."
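
The arithmetic behind that claim is worth making explicit. A minimal sketch, using made-up round numbers rather than counts from the paper:

```python
# Hypothetical bookkeeping: if ~1% of mutations in each class are
# functional, but rare mutations outnumber common ones ~4 to 1, then
# most functional ("driver") mutations are rare. Counts are invented.
common_mutations = 10_000
rare_mutations   = 4 * common_mutations   # ~4 rare per common mutation
functional_rate  = 0.01                   # ~1% of each class matters

print(functional_rate * common_mutations)  # 100.0 functional common hits
print(functional_rate * rare_mutations)    # 400.0 functional rare hits
```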

Reproducing Results

The researchers also found that mutations are not the blunt instruments they had expected. Rather than knocking out an entire branch of a network (the equivalent of a neighborhood power outage) or inserting an entirely new character (a protein), mutations subtly, almost surgically, alter individual communication pathways.

"From the perspective of the mutation, it is hard to be so precise," said AlQuraishi. "But cancer can't be too disruptive, or else it might die. It needs to fly under the radar. This subtle altering of networks achieves that objective. Drug companies can exploit this and possibly develop more targeted therapies."

A final area these findings address is the problem of reproducing published results in the scientific literature. The researchers were able to use fundamental physical principles to process datasets from different laboratories (including their own) in a way that removes false positives and enriches for true positives. The resulting model is therefore more accurate and reproducible than any single dataset.

"We can clean up the experiments by only using data that both the model and experiments agree on," said AlQuraishi.

"In general, much of the problem with irreproducibility in science is a problem of poor statistics," said Sorger. "We addressed that directly here."

Improved theoretical model of photoabsorption of nitrous oxide sheds light on catalytic destruction of stratospheric ozone

New theoretical physics models could help us better grasp the atmospheric chemistry of ozone depletion. Understanding the photoabsorption of nitrous oxide (N2O) — a process in which the energy of a photon is transferred to the molecule — matters because a small fraction of N2O reacts with oxygen atoms in the stratosphere to produce, among other products, nitric oxide (NO). The latter participates in the catalytic destruction of ozone (O3). New theoretical work now unveils the actual dynamics of the photoabsorption of nitrous oxide molecules. These findings by Mohammad Noh Daud from the University of Malaya in Kuala Lumpur, Malaysia, have just been published in EPJ D. The work has led to new calculations of the probability that absorption takes place, a quantity referred to as the absorption cross section. These calculations confirm experimental results.

In this study, the author introduces improvements to an established calculation approach referred to as the ab initio time-dependent method, which is used to calculate the absorption cross section, or spectrum, of nitrous oxide. The advantage of this approach is that it yields the full energy dependence of the cross section from a single calculation. By taking into account key factors such as the correct angular momentum coupling of the molecule and the components of the transition dipole moment vector, the calculated spectrum improves on previously obtained results and more closely matches experimental observations.
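
The appeal of the time-dependent formalism is exactly the property described above: a single wavepacket propagation yields the whole spectrum, because the absorption cross section is, up to prefactors, the Fourier transform of the wavepacket autocorrelation function C(t) = <psi(0)|psi(t)>. The Python sketch below illustrates only that generic principle, substituting a made-up Gaussian-decay autocorrelation for a real propagation; it is not Daud's N2O calculation.

```python
import numpy as np

# Generic time-dependent picture of photoabsorption: propagate once,
# record C(t) = <psi(0)|psi(t)>, and Fourier-transform to get the band.
# The Gaussian-decay C(t) below is a stand-in with invented parameters
# (hbar = 1, energies in eV); it is not the N2O result from the paper.

E0, tau = 6.5, 4.0                  # hypothetical band center and decay scale
t = np.linspace(0.0, 60.0, 4096)
C = np.exp(-1j * E0 * t) * np.exp(-(t / tau) ** 2)   # model autocorrelation

E = np.linspace(4.0, 9.0, 500)      # photon energies to scan, eV
dt = t[1] - t[0]
# sigma(E) ~ E * Re INT_0^inf C(t) exp(iEt) dt  (half-Fourier transform)
sigma = E * np.real(np.exp(1j * np.outer(E, t)) @ C) * dt

sigma /= sigma.max()                # normalize the band
print(f"band peaks near E = {E[np.argmax(sigma)]:.2f} eV")  # recovers ~E0
```

One propagation, one transform: the full energy dependence of the band appears at once, which is the advantage the article attributes to the time-dependent method.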

Daud's calculation thus provides an improved theoretical prediction of how nitrous oxide evolves and breaks down over time after absorbing a photon. As this absorption occurs in a small gap between the absorption band of oxygen and that of ozone, the predicted major dissociation pathway helps us understand, at the molecular level, the involvement of nitrous oxide in the formation of ozone.
