Laser-like photons signal major step towards quantum 'Internet'
http://www.supercomputingonline.com/breaking-news/laser-like-photons-signal-major-step-towards-quantum-internet

An artist's impression of distributed qubits (the bright spots) linked to each other via photons (the light beams). The colors of the beams indicate that the optical frequency of the photons in each link can be tailored to the needs of the network.

The realization of quantum networks is one of the major challenges of modern physics. Now, new research shows how high-quality photons can be generated from 'solid-state' chips, bringing us closer to the quantum 'internet'.

 

The number of transistors on a microprocessor continues to double every two years, amazingly holding firm to a prediction by Intel co-founder Gordon Moore almost 50 years ago.

 

If this is to continue, conceptual and technical advances harnessing the power of quantum mechanics in microchips will need to be investigated within the next decade. Developing a distributed quantum network is one promising direction pursued by many researchers today.

 

A variety of solid-state systems are currently being investigated as candidates for quantum bits of information, or qubits, as well as a number of approaches to quantum computing protocols, and the race is on to identify the best combination. One such qubit, a quantum dot, is made of semiconductor nanocrystals embedded in a chip and can be controlled electro-optically.

 

Single photons will form an integral part of distributed quantum networks as flying qubits. First, they are the natural choice for quantum communication, as they carry information quickly and reliably across long distances. Second, they can take part in quantum logic operations, provided all the photons taking part are identical.

 

Unfortunately, the quality of photons generated from solid-state qubits, including quantum dots, can be low due to decoherence mechanisms within the materials. Because each emitted photon is then distinct from the others, the development of a quantum photonic network faces a major roadblock.

 

Now, researchers from the Cavendish Laboratory at Cambridge University have implemented a novel technique to generate, from solid-state devices, single photons with tailored properties that are identical in quality to laser light. Their research is published today in the journal Nature Communications.

 

As their photon source, the researchers built a semiconductor Schottky diode device containing individually addressable quantum dots. The transitions of quantum dots were used to generate single photons via resonance fluorescence – a technique demonstrated previously by the same team.

 

Under weak excitation, also known as the Heitler regime, the main contribution to photon generation is through elastic scattering. By operating in this way, photon decoherence can be avoided altogether. The researchers were able to quantify how similar these photons are to lasers in terms of coherence and waveform – it turned out they were identical.
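The balance between elastic (coherent) and inelastic scattering in resonance fluorescence is often summarized by a textbook two-level-emitter relation: the coherently scattered fraction of the emission is roughly 1/(1 + s), where s is the saturation parameter set by the driving strength. The short sketch below uses that standard formula, not a result taken from the paper, to show why weak excitation (the Heitler regime, s << 1) yields almost entirely coherent, laser-like photons; the s values are illustrative.

# Coherent (elastically scattered) fraction of resonance fluorescence
# from a driven two-level emitter: F_coh = 1 / (1 + s), a standard
# textbook result; s is the saturation parameter (illustrative values).

def coherent_fraction(s: float) -> float:
    """Fraction of photons scattered elastically at saturation parameter s."""
    return 1.0 / (1.0 + s)

for s in (0.01, 0.1, 1.0, 10.0):
    print(f"s = {s:>5}: coherent fraction = {coherent_fraction(s):.3f}")

# In the Heitler regime (s << 1) nearly every photon inherits the
# coherence of the driving laser, which is the regime exploited here.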

 

"Our research has added the concepts of coherent photon shaping and generation to the toolbox of solid-state quantum photonics," said Dr Mete Atature from the Department of Physics, who led the research.

 

"We are now achieving a high-rate of single photons which are identical in quality to lasers with the further advantage of coherently programmable waveform - a significant paradigm shift to the conventional single photon generation via spontaneous decay."

 

There are already protocols proposed for quantum computing and communication which rely on this photon generation scheme, and this work can be extended to other single photon sources as well, such as single molecules, colour centres in diamond and nanowires.

 

"We are at the dawn of quantum-enabled technologies, and quantum computing is one of many thrilling possibilities," added Atature.

 

"Our results in particular suggest that multiple distant qubits in a distributed quantum network can share a highly coherent and programmable photonic interconnect that is liberated from the detrimental properties of the chips. Consequently, the ability to generate quantum entanglement and perform quantum teleportation between distant quantum-dot spin qubits with very high fidelity is now only a matter of time."

Tyler O'Neal (tyler@supercomputingonline.com) | Tue, 19 Mar 2013 16:01:39 -0400
Record simulations conducted on Lawrence Livermore supercomputer
http://www.supercomputingonline.com/breaking-news/record-simulations-conducted-on-lawrence-livermore-supercomputer

Researchers at Lawrence Livermore National Laboratory have performed record simulations using all 1,572,864 cores of Sequoia, the largest supercomputer in the world. Sequoia, based on IBM BlueGene/Q architecture, is the first machine to exceed one million computational cores. It also is No. 2 on the list of the world's fastest supercomputers, operating at 16.3 petaflops (16.3 quadrillion floating point operations per second).

 

The simulations are the largest particle-in-cell (PIC) code simulations by number of cores ever performed. PIC simulations are used extensively in plasma physics to model the motion of the charged particles, and the electromagnetic interactions between them, that make up ionized matter. High performance computers such as Sequoia enable these codes to follow the simultaneous evolution of tens of billions to trillions of individual particles in highly complex systems.
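At the heart of any PIC code is the particle "push": each charged particle's velocity and position are advanced in the electric and magnetic fields gathered from the grid. The sketch below shows the standard non-relativistic Boris scheme commonly used for this step; it is a generic illustration of the technique, not code from OSIRIS, and the fields, charge, mass and time step are placeholder values.

import numpy as np

def boris_push(x, v, E, B, q, m, dt):
    """Advance one charged particle by dt using the standard Boris scheme.

    x, v : position and velocity (3-vectors)
    E, B : electric and magnetic field at the particle (3-vectors)
    """
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                  # first half electric kick
    t = qmdt2 * B                            # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)  # magnetic rotation
    v_new = v_plus + qmdt2 * E               # second half electric kick
    x_new = x + v_new * dt                   # leapfrog position update
    return x_new, v_new

# Toy example: a single particle in crossed fields (arbitrary units).
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
E, B = np.array([0.0, 0.1, 0.0]), np.array([0.0, 0.0, 1.0])
for _ in range(5):
    x, v = boris_push(x, v, E, B, q=-1.0, m=1.0, dt=0.1)
print(x, v)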

 

Frederico Fiuza, a physicist and Lawrence Fellow at LLNL, performed the simulations in order to study the interaction of ultra-powerful lasers with dense plasmas in a proposed method to produce fusion energy, the energy source that powers the sun, in a laboratory setting. The method, known as fast ignition, uses lasers capable of delivering more than a petawatt of power (a million billion watts) in a fraction of a billionth of a second to heat compressed deuterium and tritium (DT) fuel to temperatures exceeding the 50 million degrees Celsius needed to initiate fusion reactions and release net energy. The project is part of the U.S. Department of Energy's Office of Fusion Energy Science Program.

 

This method differs from the approach being taken by LLNL's National Ignition Facility to achieve thermonuclear ignition and burn. NIF's approach is called the "central hot spot" scenario, which relies on simultaneous compression and ignition of a spherical fuel capsule in an implosion, much like in a diesel engine. Fast ignition uses the same hardware as the hot spot approach but adds a high-intensity, ultrashort-pulse laser as the "spark" that achieves ignition.

 

The code used in these simulations was OSIRIS, a PIC code that has been developed over more than 10 years in collaboration between the University of California, Los Angeles and Portugal's Instituto Superior Técnico. Using this code, Fiuza demonstrated excellent scaling in parallel performance of OSIRIS to the full 1.6 million cores of Sequoia. By increasing the number of cores for a relatively small problem of fixed size, what computer scientists call "strong scaling," OSIRIS obtained 75 percent efficiency on the full machine. But when the total problem size was increased, what is called "weak scaling," a 97 percent efficiency was achieved.

 

"This means that a simulation that would take an entire year to perform on a medium-size cluster of 4,000 cores can be performed in a single day. Alternatively, problems 400 times greater in size can be simulated in the same amount of time," Fiuza said. "The combination of this unique supercomputer and this highly efficient and scalable code is allowing for transformative research."

 

OSIRIS is routinely used for fundamental science during the test phase of Sequoia in simulations with up to 256,000 cores. These simulations are allowing researchers, for the first time, to model the interaction of realistic fast-ignition-scale lasers with dense plasmas in three dimensions with sufficient speed to explore a large parameter space and optimize the design for ignition. Each simulation evolves the dynamics of more than 100 billion particles for more than 100,000 computational time steps. This is approximately an order of magnitude larger than the previous largest simulations of fast ignition.

 

Sequoia is a National Nuclear Security Administration (NNSA) machine, developed and fielded as part of NNSA's Advanced Simulation and Computing (ASC) program. Sequoia is being prepared to transition to classified computing in support of stockpile stewardship.

 

"This historic calculation is an impressive demonstration of the power of high-performance computing to advance our scientific understanding of complex systems," said Bill Goldstein, LLNL's deputy director for Science and Technology. "With simulations like this, we can help transform the outlook for laboratory fusion as a tool for science, energy and stewardship of the nuclear stockpile."

Tyler O'Neal (tyler@supercomputingonline.com) | Tue, 19 Mar 2013 15:56:16 -0400
Fantastic flash memory combines graphene, molybdenite
http://www.supercomputingonline.com/breaking-news/fantastic-flash-memory-combines-graphene-molybdenite

EPFL scientists have combined two materials with advantageous electronic properties -- graphene and molybdenite -- into a flash memory prototype that is very promising in terms of performance, size, flexibility and energy consumption.

 

After the molybdenite chip, we now have molybdenite flash memory, a significant step forward in the use of this new material in electronics applications. The news is even more impressive because scientists from EPFL's Laboratory of Nanometer Electronics and Structures (LANES) came up with a truly original idea: they combined the advantages of this semiconducting material with those of another amazing material – graphene. The results of their research have recently been published in the journal ACS Nano.

 

Two years ago, the LANES team revealed the promising electronic properties of molybdenite (MoS2), a mineral that is very abundant in nature. Several months later, they demonstrated the possibility of building an efficient molybdenite chip. Today, they've gone further still by using it to develop a flash memory prototype – that is, a cell that can not only store data but also retain it when the power is switched off. This is the kind of memory used in digital devices such as cameras, phones, laptop computers, printers and USB flash drives.

 

An ideal "energy band"

 

"For our memory model, we combined the unique electronic properties of MoS2 with graphene's amazing conductivity," explains Andras Kis, author of the study and director of LANES.

 

Molybdenite and graphene have many things in common. Both are expected to surpass the physical limitations of our current silicon chips and electronic transistors. Their two-dimensional chemical structure – the fact that they're made up of a layer only a single atom thick – gives them huge potential for miniaturization and mechanical flexibility.

 

Although graphene is a better conductor, molybdenite has advantageous semiconducting properties. MoS2 has an ideal "energy band" (a band gap) in its electronic structure that graphene lacks. This allows it to switch very easily from an "on" to an "off" state, and thus to use less electricity. Used together, the two materials can thus combine their unique advantages.

 

Like a sandwich

 

The transistor prototype developed by LANES was designed using "field effect" geometry, a bit like a sandwich. In the middle, instead of silicon, a thin layer of MoS2 channels electrons. Underneath, the electrodes transmitting electricity to the MoS2 layer are made out of graphene. And on top, the scientists also included an element made up of several layers of graphene; this captures electric charge and thus stores memory.

 

"Combining these two materials enabled us to make great progress in miniaturization, and also using these transistors we can make flexible nanoelectronic devices," explains Kis. The prototype stores a bit of memory, just a like a traditional cell. But according to the scientist, because molybdenite is thinner than silicon and thus more sensitive to charge, it offers great potential for more efficient data storage.

Tyler O'Neal (tyler@supercomputingonline.com) | Tue, 19 Mar 2013 15:50:44 -0400
Autodesk sponsors international electric vehicle competition
http://www.supercomputingonline.com/breaking-news/autodesk-sponsors-international-electric-vehicle-competition

Autodesk will be the primary sponsor of Purdue University's third International Collegiate evGrandPrix to be held on May 12, opening day weekend at the Indianapolis Motor Speedway, home of the Indianapolis 500.


The event is an electric go-kart race and engineering design competition involving colleges and universities from around the nation and Europe.


"In sponsoring the evGrandPrix, Autodesk is taking a leadership role in educating the next generation of engineers and technical specialists," said James Caruthers, Reilly Professor of Chemical Engineering and director of the Indiana Advanced Electric Vehicle Training and Education Consortium (I-AEVtec). "The evGrandPrix is not just a go-kart race, but it is really an engineering design competition where the students get points from race placement plus their engineering design, energy efficiency and also community outreach."


Autodesk, a lead sponsor of the evGrandPrix, is providing licenses of Autodesk design and simulation products to all student participants - design and simulation tools that they can use in the competition as well as throughout their tenure at Purdue and the other colleges and universities that are participating in the evGrandPrix.


"Thanks to Autodesk, our students are now using those high-end design tools as part of their education," Caruthers said. "This is a very generous contribution, and one that will go a long way toward preparing our students with the advanced technical skills they'll need to compete for the best engineering jobs when they graduate."


Autodesk also is helping to promote the event across the nation and the world to millions of students who are members of the Autodesk Education Community.


Team members will be able to simulate their go-kart designs with Autodesk's cloud-based services.


"Students can create, visualize, analyze, simulate, and iterate their designs faster and more efficiently by performing computationally intensive simulation tasks in the cloud," said Thom Tremblay, industry manager at Autodesk. "With Autodesk Simulation 360 students can test multiple 'what if' design scenarios in parallel."


Specialized tools from Autodesk help engineers simulate fatigue, stress and cracking, which can help identify areas of potential instability or damage. Participating students are able to access the engineering and design software from the Autodesk Education Community. The site also hosts a variety of learning tools to help foster a stronger fundamental understanding of engineering and sustainable design principles.


"One dimension of the evGrandPrix is a focus on sustainable design," Tremblay said. "Our Autodesk Sustainability Workshop offers online resources that teach the principles and practice of sustainability in engineering and design."


This will be the third year that the Collegiate evGrandPrix has been held at the Indianapolis Motor Speedway.


"The Indianapolis Motor Speedway was founded over 100 years ago to give automotive designers the ultimate facility to test new innovations and help advance vehicle technologies," said Jarrod Krisiloff, senior director of marketing at the Indianapolis Motor Speedway. "In many ways, the evGrandPrix is a continuation of the original mission of the track and we are pleased to host this competition at the Indianapolis Motor Speedway. The evGrandPrix not only showcases the engineering and design experience that these students offer future employers, but it also exposes the students to real-life examples of how they can apply their learning in the automotive and associated technology industries, including IndyCar racing." 


More information on the Collegiate evGrandPrix can be found at http://www.evgrandprix.org

Tyler O'Neal (tyler@supercomputingonline.com) | Tue, 19 Mar 2013 15:17:10 -0400
Kotura Rolls Out 100G Silicon Photonics Chips With WDM in Dense QSFP Package
http://www.supercomputingonline.com/breaking-news/kotura-rolls-out-100g-silicon-photonics-chips-with-wdm-in-dense-qsfp-package

First and Only Company to Offer Both 100G WDM and 100G Parallel Solutions in Industry Standard Form Factor


Today at OFC/NFOEC, Kotura announced a silicon photonics industry first. The company is demonstrating its Optical Engine in a Quad Small Form-factor Pluggable (QSFP) package. Kotura's Optical Engine uses Wavelength Division Multiplexing (WDM), in which signals on different optical wavelengths share the same fiber. Kotura is the only silicon photonics provider to offer WDM, and now chalks up another industry first as the only silicon photonics provider to demonstrate WDM in a 100 gigabits per second (Gb/s) 4x25 QSFP package consuming just 3.5 watts of power.


Kotura's Optical Engine provides an inexpensive, small form factor that reduces power consumption and offers a high level of integration. Consuming only 3.5 watts, the device addresses the need for the green 100G pipes desired by data centers and supercomputers.


The QSFP package has become the industry standard footprint for 4x10G and 40G Ethernet in data centers as well as 40G and 56G Infiniband. Kotura predicts that the same package will become the industry's volume standard for 100G networks in both data centers and supercomputing applications.


"The QSFP package enables our customers to fit 40 transceivers across the front panel of a switch, providing 10 times more bandwidth than CFP solutions," said Jean-Louis Malinge, Kotura president and CEO. "Because we monolithically integrate WDM and use standard Single Mode Fiber duplex cabling, our solution eliminates the need for expensive parallel fibers. No other silicon photonics provider can offer WDM in a 3.5 watt QSFP package."

A long-time innovator in WDM, Kotura has integrated all of the 100G optical and opto-electrical functions into two small chips. According to Malinge, the beauty of Kotura's WDM is that it can scale from four channels to many more, on the same chip. At 100G and higher, Kotura's customers need WDM to avoid the use of expensive ribbon fiber, parallel connectors and patch panels. For large data centers, reaches of 30 meters to 2 kilometers are common, and expensive ribbon fiber dominates the interconnect fabric costs. For Active Optical Cables and very short-reach links, Kotura also offers a parallel version of its 100G Optical Engine.


"The market for 40G transceivers in QSFP packages has grown much faster than expected," said Vladimir Kozlov, founder and CEO of LightCounting Market Research. "Squeezing 100G in the same QSFP package and reducing power consumption is critical for applications of 100 Gb/s optics in data centers."


This week at OFC, Kotura will be located at booth #3618 and will demonstrate its Optical Engine by appointment. For more about Kotura, please visit www.kotura.com.

Saqib Kazmi (saqib.kazmi@supercomputingonline.com) | Mon, 18 Mar 2013 18:53:13 -0400
CSTARS awarded $16.5 million over 3 years by Office of Naval Research
http://www.supercomputingonline.com/breaking-news/cstars-awarded-165-million-over-3-years-by-office-of-naval-research

The University of Miami's (UM) Center for Southeastern Tropical Advanced Remote Sensing (CSTARS) announced today that it has been awarded a contract by the Office of Naval Research to continue collecting, processing and disseminating data from global Synthetic Aperture Radar (SAR) satellite systems. The goal of the project is to provide SAR imagery collected in near-real time to aid in U.S. Navy operations around the world.


The first phase of the grant will allow CSTARS scientists to procure processing terminals that will assist in the development of hardware and software for the next generation of commercial imagery. CSTARS will continue to develop its image-analysis algorithms, using new imaging modes and insights derived from research and testing of data as the new satellite sensors become available.


"We are very pleased to be working with the ONR on this project, which will allow us to continue to provide the U.S. Navy Fleet with valuable images and research products from commercial satellites," said Dr. Hans Graber, UM professor and executive director of CSTARS. "Through this collaboration we will be able to fuse radar and optical data to derive advanced products that will allow us to understand better oceanographic, sea ice and terrestrial processes. With this solidifying support of our infrastructure from the Navy, CSTARS can continue its track record for excellence in research and the education of students using satellite remote sensing data."


Subsequent phases will focus on the implementation of specific research applications – from the determination of oceanographic features such as winds and waves in typhoons and hurricanes, to disaster response. Other applications will include studies of Arctic sea ice and environmental monitoring, as well as mapping and change detection.


CSTARS researchers and students are working on several ONR-funded projects: studying dynamic processes at river mouths, such as changes caused by strong surface currents and bathymetric features; the impact of melting and freezing cycles on ice distribution in the Marginal Ice Zone and how sea state breaks up the ice; and understanding intensity changes in typhoons for improved storm forecasting.

Saqib Kazmi (saqib.kazmi@supercomputingonline.com) | Mon, 18 Mar 2013 18:46:47 -0400
New database to speed genetic discoveries
http://www.supercomputingonline.com/breaking-news/new-database-to-speed-genetic-discoveries

Tool lets any clinician contribute information about patients for analysis


A new online database combining symptoms, family history and genetic sequencing information is speeding the search for diseases caused by a single rogue gene. As described in an article in the May issue of Human Mutation, the database, known as PhenoDB, enables any clinician to document cases of unusual genetic diseases for analysis by researchers at the Johns Hopkins University School of Medicine or the Baylor College of Medicine in Houston. If a review committee agrees that the patient may indeed have a previously unknown genetic disease, the patient and some of his or her family members may be offered free comprehensive genetic testing in an effort to identify the disease culprit.


"PhenoDB is much more useful than I even thought it would be," says Ada Hamosh, M.D., M.P.H., a professor in the McKusick-Nathans Institute of Genetic Medicine at the Johns Hopkins University School of Medicine. "Bringing all of this information together is crucial to figuring out what our genetic variations mean." The database is designed to capture a bevy of standardized information about phenotype, which Hamosh defines as "any characteristic of a person" — symptoms, personal and family health history, appearance, etc.


Hamosh and others developed PhenoDB for the Baylor-Hopkins Center for Mendelian Genomics (BHCMG), a four-year initiative that, together with its counterparts at Yale University and the University of Washington, is charged with uncovering the genetic roots of every disorder caused by a single faulty gene. An estimated 3,000 inherited disorders have been described phenotypically in scientific papers but have not yet had their genetic causes pinpointed, Hamosh says; since many single-gene disorders are extremely rare, she suspects that many more have not yet made it into the literature.


The Centers for Mendelian Genomics have a powerful tool at their disposal, known as whole-exome sequencing. Just a few years ago, Hamosh explains, a geneticist trying to diagnose the cause of an inherited disease would have made an educated guess based on the patient's signs and symptoms about which gene might be at fault, and ordered a test of that gene. If the test came back negative for a mutation, she would order a test of a different gene, and so on. But whole-exome sequencing, in which about 90% of a person's genes are sequenced at one time, has been growing steadily cheaper, and it is this tool that the Centers will use to capture genetic sequencing information (whole-genome sequencing is the next step, but it remains too expensive for many uses, Hamosh notes, as it includes all of a person's DNA, most of which contains no genes).


However, making sense of the deluge of data yielded by whole-exome sequencing presents its own challenges. "The average person has tens of thousands of variations from the standard genetic sequence," Hamosh explains, "and we don't know what most of those variations mean." To parse these variations, she says, "one of the things that needs to change is that the lab doing the testing needs to have the whole phenotype, from head to toe." Researchers will then be better equipped to figure out which variations may or may not be relevant to a patient's illness.

Another advantage of the database is that it enables colleagues at distant locations — such as Baylor and Johns Hopkins — to securely access the information and collaborate. Hamosh notes that the database affords different users different levels of access — for example, a health provider will only be able to see the information he or she has entered — and that information is deidentified to protect patient privacy. In addition, providers must obtain patients' consent before including them in PhenoDB.


PhenoDB would be useful for any research project that seeks to match genomic information with its phenotypic effects, Hamosh says, and with that in mind, the Baylor-Hopkins Center for Mendelian Genomics has made the PhenoDB software available for free download at http://phenodb.net. She predicts that similar tools will soon be incorporated into electronic health records as well, so that "doctors will have patients' genomic information at their fingertips and can combine that with information about health history, disease symptoms and social situation to practice truly individualized medicine."

Saqib Kazmi (saqib.kazmi@supercomputingonline.com) | Mon, 18 Mar 2013 18:39:14 -0400
Las Cumbres Observatory: First light at SAAO for third 1-meter node of global telescope
http://www.supercomputingonline.com/breaking-news/las-cumbres-observatory-first-light-at-saao-for-third-1-meter-node-of-global-telescope

The first truly global telescope came a significant step closer to completion this month with the installation and first light of three new 1-meter telescopes at the South African Astronomical Observatory (SAAO) near Sutherland, South Africa. A team of five Las Cumbres engineers, technicians and a postdoc convened at Sutherland for three weeks during late February and early March to achieve this feat.


"The South African Astronomical Observatory is pleased to collaborate with the Las Cumbres Observatory Global Telescope project, and we are excited by the prospects for both scientific observations and public outreach activities," Ted Williams, Director of SAAO said.


Las Cumbres Observatory Global Telescope (LCOGT) has installed four other identical 1-meter telescopes to date: an operational prototype at McDonald Observatory near Fort Davis, Texas (April 2012), and a full science node of three telescopes at Cerro Tololo Inter-American Observatory (CTIO) (October 2012).


Annie Hjelstrom, the project engineer responsible for the successful installation, pointed out that, "We had a great installation team, and SAAO and SALT staff were very helpful, but this is also the culmination of eight years of design and development. Each telescope is built, configured, tested, and then dismantled at the Goleta, California headquarters before we put them back together on site."


Usually first-light images are fairly dry, and several such images were taken at each of the three telescopes. But for the SAAO node, LCOGT founder and lead engineer Wayne Rosing asked former company intern and accomplished astrophotographer B. J. Fulton to acquire images of three different celestial targets using the stripped-down capabilities available during the early engineering phases. Fulton, who is now an astronomy graduate student at the University of Hawaii, produced the images over the first few days of telescope operations.


Edward Gomez, Education and Outreach Director of LCOGT, wrote that Fulton "kept the integrity of a first-light image by not touching up the images in any way." The images were created using command-line tools that work directly on the raw pixel data (e.g., Python, Stiff and ImageMagick). The images do not have flat-fields, darks or biases subtracted. They each combine multiple wavelengths and colors captured through different filters on the telescope instrument. For example, the images of M104 and M83 each required over 3 hours of exposure time distributed across two Bessel filters with additional exposures using two Sloan filters. For Trumpler 14, Fulton took over 2 hours of images spread across three Sloan filters. He was able to conduct his observations from Hawaii, while the engineering team in Santa Barbara conducted their tests and the installation team in South Africa completed system tuning and optical collimation steps.
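A color composite of this kind can be assembled with a few lines of Python once the per-filter frames are aligned: stretch each filter image, assign each to a color channel, and write the stack out. The sketch below shows that general workflow only; the file names are placeholders, and it is not the actual pipeline Fulton used.

import numpy as np
from astropy.io import fits            # pip install astropy
import matplotlib.pyplot as plt

def stretch(img, lo=5, hi=99.5):
    """Percentile-based linear stretch to the range [0, 1]."""
    vmin, vmax = np.percentile(img, [lo, hi])
    return np.clip((img - vmin) / (vmax - vmin), 0.0, 1.0)

# Hypothetical file names: one co-added frame per filter, already aligned.
r = stretch(fits.getdata("m83_sloan_i.fits"))
g = stretch(fits.getdata("m83_sloan_r.fits"))
b = stretch(fits.getdata("m83_sloan_g.fits"))

rgb = np.dstack([r, g, b])             # (ny, nx, 3) color composite
plt.imsave("m83_rgb.png", rgb, origin="lower")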


The installation team consisted of Hjelstrom; technicians Mark Crellin from the Birkenhead, England office, and Kurt Vander Horst and David Petry from the Goleta office; and astronomy postdoctoral scholar Abiy Tekola, based at SAAO in Cape Town. The telescopes arrived on site on February 18th and were craned into the domes the next day. The telescope in Dome A was fully assembled, with electrical systems, mirrors, optical tube assembly and instrument in place, by the end of February 20th, and it went on-sky that night to begin TPOINT runs to set its polar alignment. The second and third telescopes followed over the next two days.


The trio of telescopes brings the company's total of operational 1-meter telescopes to seven. Two more are planned for mid-May at the Siding Spring Observatory to complete the southern ring, and a second telescope will be installed at McDonald Observatory at roughly year-end to create the first northern node.


A Global Telescope


LCOGT is a private, nonprofit science institute engaged in time domain astrophysics. The LCOGT Science team, led by Science Director Tim Brown, has published extensively on exoplanets, supernovae, and minor planet research, among other research areas. The organization operates the two 2-meter Faulkes Telescopes for which initial capital and operational funding was provided by The Dill Faulkes Educational Trust. LCOGT is now in the process of deploying a global network of 1-meter telescopes.


According to Brown, "The 1-meter telescope network adds a critical astronomical resource. Because the network will span both hemispheres, and because one or more LCOGT nodes will always be in the dark, astronomers can observe from anywhere on earth at nearly any time. Also, these telescopes - robotic, responsive, and numerous - will allow massive but carefully-directed observing campaigns that could never be done before."


About a third of the network science time in the southern hemisphere is dedicated to the astronomy program of the Scottish Universities Physics Alliance, of which St. Andrews University is a member. St. Andrews has worked with LCOGT over the last seven years on an exoplanet identification and characterization program using the Faulkes Telescopes and is expanding that program onto the larger LCOGT network.


LCOGT also has a science partnership with SAAO. SAAO astronomers will be using the telescopes for their science programs within the next couple of months. Tekola points out that SAAO also plans to use part of their share of network time for education and outreach in South Africa and other African countries.


As the LCOGT network expands, the organization will make significant amounts of observing time available for educational projects, in addition to the substantial amount of time that will be available for professional scientists. LCOGT works with groups of education and scientific professionals to develop wide-reaching partnerships for scientific research, public engagement and citizen science. Successful LCOGT education programs exist in the United States, the United Kingdom and Australia.


Additional science and education partnerships are available.


"We're very much looking forward to getting the 1-meter network commissioned for science," LCOGT staff astronomer Rachel Street said. "These telescopes are ideal for the exoplanet characterization, supernovae follow-up and solar system studies our teams specialize in."


LCOGT uses ANSYS software.

Saqib Kazmi (saqib.kazmi@supercomputingonline.com) | Mon, 18 Mar 2013 18:37:48 -0400
Queen Elizabeth Prize for the inventors of the Internet
http://www.supercomputingonline.com/breaking-news/queen-elizabeth-prize-for-the-inventors-of-the-internet

Outstanding achievements of global significance in engineering will be recognized for the first time today, 18 March 2013. With prize money of one million pounds, the Royal Academy of Engineering this year honors the inventors of the Internet for their revolutionary accomplishment. This makes the Queen Elizabeth Prize the most richly endowed award in the field of engineering worldwide.


In the early 1990s, at the European research centre CERN, the British computer scientist Timothy Berners-Lee developed the HTML language, the Hypertext Transfer Protocol (HTTP), the first browser and the first web server, the origin of the World Wide Web. Rather than patenting his ideas and technical solutions, he made them freely available. Back in the early 1970s, the Americans Robert Elliot Kahn and Vinton Cerf, together with the Frenchman Louis Pouzin, developed the Transmission Control Protocol (TCP) and the Internet Protocol (IP), which handle data transmission and addressing on the modern Internet; for this they are regarded as pioneers of the Internet. Marc Andreessen developed the early Mosaic browser, from which the widely used Netscape browser later evolved.


A high-ranking international jury selected the winners from the submitted nominations. Professor Reinhard Hüttl, President of the National Academy of Science and Engineering acatech and Scientific Executive Director of the GFZ German Research Centre for Geosciences, is one of the jury members. "This prize can be seen as the Nobel Prize of engineering. Every two years, the Royal Academy of Engineering, under the auspices of Queen Elizabeth II, honors breakthroughs in engineering science that change the world. Such a breakthrough is, of course, the Internet, which influences not only modern technology but also society worldwide."


Indeed, the Earth sciences today would be unimaginable without the World Wide Web. "Modern early warning systems such as the Tsunami Early Warning System GITEWS depend on extremely fast data processing and, in particular, data transfer," Prof. Hüttl explains. "The rapid development of the modern geosciences also rests, among other things, on the fact that huge amounts of data are now available to the worldwide science community almost immediately."


The prize will be presented personally by the Queen on 25 June at a ceremony at Buckingham Palace in London. Prof. Hüttl will be attending the event: "I am looking forward to it, as with the Queen Elizabeth Prize the engineering sciences will finally receive the recognition that matches their value to society."

Saqib Kazmi (saqib.kazmi@supercomputingonline.com) | Mon, 18 Mar 2013 18:34:34 -0400
Models show how deep carbon could return to Earth's surface
http://www.supercomputingonline.com/breaking-news/models-show-how-deep-carbon-could-return-to-earths-surface

This rendering shows a carbonate ion (red/grey) dissolved in water (pink/white) against a backdrop of a cross section of the Earth. New supercomputer simulations show that under pressure deep in the Earth, carbonate could dissolve in water, providing a route for carbon to return to the Earth's surface.

Supercomputer simulations of water under extreme pressure are helping geochemists understand how carbon might be recycled from hundreds of miles below the Earth's surface. The work, by researchers at the University of California, Davis, and Johns Hopkins University, is published March 18 in the journal Proceedings of the National Academy of Sciences.


Carbon compounds are the basis of life, provide most of our fuels and contribute to climate change. The cycling of carbon through the oceans, atmosphere and shallow crust of the Earth has been intensively studied, but little is known about what happens to carbon deep in the Earth.


"We are trying to understand more about whether carbon can be transported in the deep Earth through water-rich fluids," said coauthor Dimitri Sverjensky, professor of earth and planetary sciences at Johns Hopkins University.


There is plenty of water in the mantle, the layer of the planet extending hundreds of miles below the Earth's crust, but little is known about how water behaves under the extreme conditions there -- pressures run to hundreds of tons per square inch and temperatures are over 2,500 F.


Experiments reproducing these conditions are very hard to do, said Giulia Galli, professor of chemistry and physics at UC Davis and co-author on the paper. Geochemists have models to understand the deep Earth, but they have lacked a crucial parameter for water under these conditions: the dielectric constant, which determines how easily minerals will dissolve in water.


"When people use models to understand the Earth, they need to put in the dielectric constant of water -- but there are no data at these depths," Galli said.


Galli and Sverjensky are collaborators in the Deep Carbon Observatory, supported by the Alfred P. Sloan Foundation, which seeks to understand the role of carbon in chemistry and biology deep in the Earth.


Researchers have speculated that carbon, trapped as carbonate in the shells of tiny marine creatures, sinks to the ocean floor, gets carried into the mantle on sinking crustal plates, and is then recycled and escapes through volcanoes, Sverjensky said. But there has been no known mechanism to explain how this might happen.


Ding Pan, a postdoctoral researcher at UC Davis, used supercomputer simulations of water to predict how it behaves under extreme pressure and temperature. The simulations show that the dielectric constant changes significantly. By bringing that new factor into the existing models of water in the mantle, the researchers predict that magnesium carbonate, which is insoluble at the Earth's surface, would at least partially dissolve in water at that depth.


"It has been thought that this remains solid, but we show that at least part of it can dissolve and could return to the surface, possibly through volcanoes," Sverjensky said. "Over geologic timescales, a lot of material can move this way."


Sverjensky said the new modeling work was a "first step" to understanding how carbon deep in the Earth can return to the surface.


Other authors on the paper are Leonard Spanu, a postdoctoral researcher at UC Davis now at the Shell Technology Center in Bangalore, India; and Brandon Harris, research assistant at Johns Hopkins.


Launched in 2009, the Deep Carbon Observatory aims to achieve a better understanding of the "deep carbon cycle," and a more complete understanding of the role of carbon on our planet. The 10-year initiative, supported by the Sloan Foundation and headquartered at the Carnegie Institution of Washington, is organized into four directorates (http://deepcarbon.net/); Galli is co-chair of the Extreme Physics and Chemistry directorate (http://deepcarbon.net/content/extreme-physics-and-chemistry) and Sverjensky is a member of the scientific steering committee.


The work was supported by the U.S. Department of Energy, the Sloan Foundation and by computational resources through the National Science Foundation.
Saqib Kazmi (saqib.kazmi@supercomputingonline.com) | Mon, 18 Mar 2013 18:25:22 -0400