
Objects scattered to the inner region of the Solar System by Jupiter’s growth brought most of the water now found on Earth

Equipped with Newton's law of universal gravitation (published in Principia 330 years ago) and powerful computational resources (used to apply the law to more than 10,000 interacting bodies), a young Brazilian researcher and his former postdoctoral supervisor have just proposed a new physical model to explain the origin of water on Earth and the other Earth-like objects in the Solar System.

André Izidoro, from the School of Engineering of Sao Paulo State University in Guaratinguetá, Brazil, explains that the novelty does not lie in the idea that Earth's water came predominantly from asteroids. "What we did was associate the asteroid contribution with the formation of Jupiter. Based on the resulting model, we 'delivered to Earth' amounts of water consistent with currently estimated values," said Izidoro, who is supported by the Sao Paulo Research Foundation (FAPESP) through its Young Investigators Grants program.

His explanation is laid out in the article "Origin of water in the inner Solar System: Planetesimals scattered inward during Jupiter and Saturn's rapid gas accretion", co-authored with the American astrophysicist Sean Raymond, who is currently with the Bordeaux Astrophysics Laboratory in France. The article was published in the planetary science journal Icarus.

Estimates of the amount of water on Earth vary a great deal. If the unit of measurement is terrestrial oceans, some scientists speak of three to four of them, while others estimate dozens. The variation derives from the fact that the amounts of water in the planet's hot mantle and its rocky crust are unknown. In any event, the model proposed covers the full range of estimates.

"First of all, it's important to leave aside the idea that Earth received all its water via the impacts of comets from very distant regions. These 'deliveries' also occurred, but their contributions came later and were far less significant in percentage terms," Izidoro said. "Most of our water came to the region currently occupied by Earth's orbit before the planet was formed."

Pre-history of the Solar System: water-rich protoplanets

To understand how this happened, it is worth restating the scenario defined in the conventional model of the Solar System's formation and then adding the new model for the advent of water. The initial condition is a gigantic cloud of gas and cosmic dust. Owing to some kind of gravitational disturbance or local turbulence, the cloud collapses, with its matter drawn toward a particular region that becomes the center.

As matter accumulated, about 4.5 billion years ago the center became so massive and hot that it began nuclear fusion, which transformed it into a star. Meanwhile, the remaining cloud continued to orbit the center, and its matter agglutinated to form a disk, which later fragmented to define protoplanetary niches.

"The water-rich region of this disk is estimated to have been located several astronomical units from our Sun. In the inner region, closer to the star, the temperature was too high for water to accumulate except, perhaps, in very small amounts in the form of vapor," Izidoro said.

An astronomical unit (AU) is the average distance from the Earth to the Sun. The region between 1.8 AU and 3.2 AU is currently occupied by the Asteroid Belt, with hundreds of thousands of objects. The asteroids located between 1.8 AU and 2.5 AU are mostly water-poor, whereas those located beyond 2.5 AU are water-rich. The process whereby Jupiter was formed can explain the origin of this division, according to Izidoro.

"The time elapsed between the Sun's formation and the complete dissipation of the gas disk was quite short on the cosmogonic scale: from only 5 million to, at most, 10 million years," he said. "The formations of gas giants as massive as Jupiter and Saturn can only have occurred during this youthful phase of the Solar System, so it was during this phase that Jupiter's rapid growth gravitationally disturbed thousands of water-rich planetesimals, dislodging them from their original orbits."

The traumatic birth of gas giants

Jupiter is believed to have a solid core with a mass equivalent to several times that of Earth. This core is surrounded by a thick and massive layer of gas. Jupiter could only have acquired this wrapping during the solar nebular phase, when the system was forming and a huge amount of gaseous material was available.

The acquisition of this gas by gravitational attraction was very fast because of the great mass of Jupiter's embryo. In the vicinity of the formation of the giant planet, located beyond the "snow line", thousands of planetesimals (rocky bodies similar to asteroids) orbited the center of the disk and, simultaneously, attracted each other.

The rapid increase of Jupiter's mass undermined the fragile gravitational equilibrium of this system with many bodies. Several planetesimals were engulfed by proto-Jupiter. Others were propelled to the outskirts of the Solar System. In addition, a smaller number were hurled into the disk's inner region, delivering water to the material that later formed the terrestrial planets and the Asteroid Belt.

"The period during which the Earth was formed is dated to between 30 million and 150 million years after the Sun's formation," Izidoro said. "When this happened, the region of the disk in which our planet was formed already contained large amounts of water, delivered by the planetesimals scattered by Jupiter and also by Saturn. A small proportion of Earth's water may have arrived later via collisions with comets and asteroids. An even smaller proportion may have been formed locally through endogenous physicochemical processes. But most of it came with the planetesimals."

Model simulates gravitational interference suffered by icy objects

His argument is supported by the model he built with his former supervisor. "We used supercomputers to simulate the gravitational interactions among multiple bodies by means of numerical integrators in Fortran," he explained. "We introduced a modification to include the effects of the gas present in the medium during the era of planet formation because, in addition to all the gravitational interactions that were going on, the planetesimals were also impacted by the action of what's known as 'gas drag', which is basically a 'wind' blowing in the opposite direction of their movement. The effect is similar to the force perceived by a cyclist in motion as the molecules of air collide with his body."

Owing to gas drag, the initially very elongated orbits of the planetesimals scattered by Jupiter were gradually "circularized". It was this effect that implanted these objects in what is now the Asteroid Belt.
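
To give a feel for how gas drag damps an elongated orbit, here is a minimal Python sketch of a single planetesimal orbiting the Sun under gravity plus a crude drag force toward the local circular gas flow. It is only an illustration of the mechanism described above, not the authors' Fortran integrator; the drag coefficient and the initial orbit are arbitrary assumptions chosen to make the circularization visible in a short run.

```python
# Toy sketch (not the authors' Fortran code): one planetesimal orbits the Sun
# under gravity plus a crude gas-drag force that pushes its velocity toward the
# local circular ("Keplerian") gas flow. Units: AU, years, solar masses, so
# G*M_sun = 4*pi^2. The drag strength K_DRAG is an arbitrary illustration.
import math

GM = 4.0 * math.pi ** 2        # AU^3 / yr^2
K_DRAG = 0.05                  # 1/yr, illustrative drag coefficient

def eccentricity(x, y, vx, vy):
    """Orbital eccentricity from 2D position and velocity."""
    r = math.hypot(x, y)
    energy = 0.5 * (vx * vx + vy * vy) - GM / r
    a = -GM / (2.0 * energy)               # semi-major axis
    h = x * vy - y * vx                    # specific angular momentum
    return math.sqrt(max(0.0, 1.0 - h * h / (GM * a)))

# Start at perihelion, 3 AU from the Sun, on an elongated orbit (e = 0.5).
x, y = 3.0, 0.0
vx, vy = 0.0, math.sqrt(GM * 1.5 / 3.0)

dt = 0.001                                  # years per step
for step in range(200_001):                 # ~200 years of evolution
    if step % 50_000 == 0:
        print(f"t = {step * dt:6.1f} yr   e = {eccentricity(x, y, vx, vy):.3f}")
    r = math.hypot(x, y)
    vkep = math.sqrt(GM / r)                # circular gas speed at radius r
    gas_vx, gas_vy = -vkep * y / r, vkep * x / r
    ax = -GM * x / r ** 3 - K_DRAG * (vx - gas_vx)
    ay = -GM * y / r ** 3 - K_DRAG * (vy - gas_vy)
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt
# The printed eccentricity falls toward zero: the orbit is gradually "circularized".
```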

A key parameter in this type of simulation is the total mass of the solar nebula at the start of the process. To arrive at this number, Izidoro and Raymond used a model proposed in the early 1970s that was based on the estimated masses of all the objects currently observed in the Solar System.

To compensate for losses due to matter ejection during the formation of the system, the model corrects the current masses of the different objects such that the proportions of heavy elements (oxygen, carbon, etc.) and light elements (hydrogen, helium, etc.) are equal to those of the Sun. The rationale for this is the hypothesis that the compositions of the gas disk and the Sun were the same. Following these alterations, the estimated mass of the primitive cloud is obtained.
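
As a rough illustration of that bookkeeping, the sketch below "restores" a few present-day bodies to solar composition by dividing their estimated heavy-element content by the Sun's heavy-element mass fraction. Every number in it is an approximate, commonly quoted value used purely as an assumption for the example, not a figure taken from the paper.

```python
# Illustrative "restore to solar composition" bookkeeping of the kind described
# above, not the model actually used in the paper. Each body's rough heavy-
# element (rock + ice) content, in Earth masses, is divided by the Sun's
# heavy-element mass fraction, so the reconstructed parcel of nebula has solar
# proportions of heavy and light elements. All values are rough assumptions.
Z_SUN = 0.014                       # heavy-element mass fraction of the Sun (approx.)
EARTH_MASSES_PER_SOLAR_MASS = 333_000

heavy_element_mass = {              # Earth masses of rock + ice (rough estimates)
    "Earth":   1.0,
    "Jupiter": 20.0,
    "Saturn":  25.0,
}

total = 0.0
for body, m_heavy in heavy_element_mass.items():
    parcel = m_heavy / Z_SUN        # gas + solids needed at solar ratios
    total += parcel
    print(f"{body:8s}: ~{m_heavy:4.0f} M_E heavy elements -> ~{parcel:6.0f} M_E of nebula")

print(f"These three bodies alone imply ~{total:.0f} Earth masses "
      f"(~{total / EARTH_MASSES_PER_SOLAR_MASS:.3f} solar masses) of primitive cloud.")
```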

The researchers created a simulation from these parameters, available in the link. A graph is shown in the video; the horizontal axis shows the distance to the Sun in AU. The objects' orbital eccentricities are plotted along the vertical axis. As the animation progresses, it illustrates how the system evolved during the formative stage. The two black dots, located at just under 5.5 AU and a little past 7.0 AU, correspond to Jupiter and Saturn respectively. During the animation, these bodies grow as they accrete gas from the protoplanetary cloud, and their growth destabilizes planetesimals, scattering them in various directions. The different colors assigned to the planetesimals serve merely to show where they were to begin with and how they were scattered. The gray area marks the current position of the Asteroid Belt. Time passes in thousands of years, as shown at the top of the chart.

A second animation adds a key ingredient: the migrations of Jupiter and Saturn to positions nearer the Sun during their growth.

All calculations of the gravitational interactions among the bodies were based on Newton's law. Numerical integrators enabled the researchers to calculate the positions of each body at different times, which would be impossible to do for some 10,000 bodies without a supercomputer.

The ZTF took this “first light” image on Nov. 1, 2017, after being installed at the 48-inch Samuel Oschin Telescope at Palomar Observatory. The Horsehead nebula is near center and the Orion nebula is at lower right. The full-resolution version is more than 24,000 pixels by 24,000 pixels. Each ZTF image covers a sky area equal to 247 full moons. Credit: Caltech Optical Observatories

The first astronomers had a limited toolkit: their eyes. They could only observe those stars, planets and celestial events bright enough to see unassisted. But today's astronomers use increasingly sensitive and sophisticated instruments to view and track a bevy of cosmic wonders, including objects and events that were too dim or distant for their sky-gazing forebears.

On Nov. 14, scientists with the California Institute of Technology, the University of Washington and eight additional partner institutions announced that the Zwicky Transient Facility, the latest sensitive tool for astrophysical observations in the Northern Hemisphere, has seen "first light," taking its first detailed image of the night sky.

When fully operational in 2018, the ZTF will scan almost the entire northern sky every night. Based at the Palomar Observatory in southern California and operated by Caltech, the ZTF's goal is to use these nightly images to identify "transient" objects that vary between observations -- identifying events ranging from supernovae millions of light years away to near-Earth asteroids.

In 2016, the UW Department of Astronomy formally joined the ZTF team and will help develop new methods to identify the most "interesting" of the millions of changes in the sky -- including new objects -- that the ZTF will detect each night, and to alert scientists to them. That way, these high-priority transient objects can be followed up in detail by larger telescopes, including the UW's share of the Apache Point Observatory 3.5-meter telescope.

"UW is a world leader in survey astronomy, and joining the ZTF will deepen our ability to perform cutting-edge science on the ZTF's massive, real-time data stream," said Eric Bellm, a UW assistant professor of astronomy and the ZTF's survey scientist. "One of the strengths of the ZTF is its global collaboration, consisting of experts in the field of time-domain astronomy from institutions around the world."

Identifying, cataloguing and classifying these celestial objects will impact studies of stars, our solar system and the evolution of our universe. The ZTF could also help detect electromagnetic counterparts to gravitational wave sources discovered by Advanced LIGO and Virgo, as other observatories did in August when these detectors picked up gravitational waves from the merger of two neutron stars.

But to unlock this promise, the ZTF requires massive data collection and real-time analysis -- and UW astronomers have a history of meeting such "big data" challenges.

The roots of big data astronomy at the UW stretch back to the Sloan Digital Sky Survey, which used a telescope at the Apache Point Observatory in New Mexico to gather precise data on the "redshift" -- or increasing wavelength -- of galaxies as they move away from each other in the expanding universe. Once properly analyzed, the data helped astronomers create a more accurate 3-D "map" of the observable universe. The UW's survey astronomy group is gathered within the Data Intensive Research in Astrophysics and Cosmology (DIRAC) Institute, which includes scientists in the Department of Astronomy as well as the eScience Institute and the Paul G. Allen School of Computer Science & Engineering.

"It was natural for the UW astronomy department to join the ZTF team, because we have assembled a dedicated team and expertise for 'big data' astronomy, and we have much to learn from ZTF's partnerships and potential discoveries," said UW associate professor of astronomy Mario Juric.

From Earth, the sky is essentially a giant sphere surrounding our planet. That whole sphere has an area of more than 40,000 square degrees. The ZTF utilizes a new high-resolution camera mounted on the Palomar Observatory's existing Samuel Oschin 48-inch Schmidt Telescope. Together these instruments make up the duet that saw first light recently, and after months of fine-tuning they will be able to capture images of 3,750 square degrees each hour.
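
A quick back-of-the-envelope check of those figures shows why a nightly scan is feasible. The split of the celestial sphere into a roughly 20,000-square-degree northern half and the eight hours of darkness are illustrative assumptions, not ZTF specifications.

```python
# Back-of-the-envelope check of the survey rate quoted above. The 50/50 split
# of the celestial sphere and the eight hours of darkness are assumptions.
WHOLE_SKY_SQ_DEG = 41_253                    # "more than 40,000 square degrees"
NORTHERN_SKY_SQ_DEG = WHOLE_SKY_SQ_DEG / 2   # rough share visible to a northern survey
ZTF_RATE_SQ_DEG_PER_HOUR = 3_750

hours_per_pass = NORTHERN_SKY_SQ_DEG / ZTF_RATE_SQ_DEG_PER_HOUR
print(f"One pass over the northern sky takes about {hours_per_pass:.1f} hours,")
print("so a single ~8-hour night comfortably covers it at least once.")
```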

These images will be an order of magnitude more numerous than those produced by the ZTF's predecessor survey at Palomar. But since these transient objects might fade or change position in the sky, analysis tools must run in near real time as images come in.

"We'll be looking for anything subtle that changes over time," said Bellm. "And given how much of the sky ZTF will image each night, that could be tens of thousands of objects of potential interest identified every few days."

From a data analysis standpoint, these are no easy tasks. But, they're precisely the sorts of tasks that UW astronomers have been working on in preparation for the Large Synoptic Survey Telescope, which is expected to see first light in the next decade. The LSST, located in northern Chile, is another big data project in astrophysics, and is expected to capture images of almost the entire night sky every few days.

"Data from the ZTF surveys will impact nearly all fields of astrophysics, as well as prepare us for the LSST down the line," said Juric.

Carl Kingsford

Method should help scientists understand regulation of gene expression

Computational biologists at Carnegie Mellon University have developed a more accurate supercomputational method for reconstructing the full-length nucleotide sequences of the RNA products in cells, called transcripts, that transform information from a gene into proteins or other gene products.

Their software, called Scallop, will help scientists build a more complete library of RNA transcripts and thus help scientists better understand the regulation of gene expression.

A report on Scallop by Carl Kingsford, associate professor of computational biology, and Mingfu Shao, Lane Fellow in the School of Computer Science's Computational Biology Department, was published online yesterday by the journal Nature Biotechnology.

Scallop is a so-called transcript assembler, taking fragments of RNA sequences, called reads, that are produced by high-throughput RNA sequencing technologies (RNA-seq), and putting them back together, like pieces of a puzzle, to reconstruct complete RNA transcripts.

"There are many existing assemblers," Shao said, "but these existing methods are still not accurate enough."

When compared to two leading assemblers, StringTie and TransComb, Scallop is 34.5 percent and 36.3 percent more accurate for transcripts consisting of multiple exons - subunits of a gene that encode part of the gene product.

Like other reference-based assemblers, Scallop begins by constructing a graph to organize reads that are mapped to the corresponding locations on the gene's DNA. Many alternative paths exist for connecting the reads together, however, so errors are easily made. Scallop improves its odds by using a novel algorithm to take full advantage of the information from reads that span several exons to guide it to the correct assembly paths.
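
The constraint can be pictured with a toy sketch: treat each candidate transcript as a path of exons, and require every read that spans several exons to appear as a consecutive run inside at least one assembled transcript. This is only an illustration of the idea, not Scallop's algorithm, and the exon labels and reads below are invented for the example.

```python
# Toy sketch of the constraint described above, not Scallop's algorithm:
# each candidate transcript is a path of exons, and every read that spans
# several exons must occur as a consecutive run inside at least one of the
# assembled transcripts. Exon names and reads are invented for the example.

def contains_subpath(path, subpath):
    """True if `subpath` occurs as consecutive exons inside `path`."""
    n, m = len(path), len(subpath)
    return any(path[i:i + m] == subpath for i in range(n - m + 1))

def unexplained_reads(transcripts, spanning_reads):
    """Spanning reads not contained in any assembled transcript."""
    return [read for read in spanning_reads
            if not any(contains_subpath(t, read) for t in transcripts)]

reads = [("e1", "e2", "e3"), ("e3", "e4")]          # multi-exon reads

assembly_a = [("e1", "e2", "e3", "e4"), ("e1", "e3", "e4")]
assembly_b = [("e1", "e2", "e4"), ("e1", "e3", "e4")]

print(unexplained_reads(assembly_a, reads))   # [] -> consistent with all reads
print(unexplained_reads(assembly_b, reads))   # [('e1', 'e2', 'e3')] -> violated
```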

Scallop proves particularly adept when assembling less abundant RNA transcripts, improving upon the accuracy of StringTie and TransComb by 67.5 percent and 52.3 percent.

The researchers already have released Scallop as open-source software on GitHub.

"We've had more than 100 downloads already and, based on the feedback we've received, people are really using it," Shao said. "We expect more users now that our paper is out."

CAPTION This is the prototype of an artificial neural network based on a hybrid analog-digital electronic circuit and a memristive chip. CREDIT Elena Emel'janova

Living cell culture learning process to be implemented for the first time 

Lobachevsky University scientists under the supervision of Alexey Mikhailov, Head of the UNN PTRI Laboratory of Thin Film Physics and Technology, are working to develop an adaptive neural interface that combines, on the one hand, a living culture, and on the other, a neural network based on memristors. This project is one of the first attempts to combine a living biological culture with a bio-like neural network based on memristors. Memristor neural networks will be linked to a multi-electrode system for recording and stimulating the bioelectrical activity of a neuron culture that performs the function of analyzing and classifying the network dynamics of living cells.

Compared with some international competitors pursuing the task of "connecting the living world and artificial architectures" (for example, the RAMP project), the advantage of the UNN project is that highly skilled experts in various fields (including the physics and technology of memristive nanostructures, neural network modeling, electronic circuit design, neurodynamics and neurobiology) are concentrated within the same university, both physically and organizationally.

According to Alexey Mikhailov, UNN scientists are now working to create a neural network prototype based on memristors, which is similar to a biological nervous system with regard to its internal structure and functionality.

"Due to the locality of the memristive effect (such phenomena occur at the nanoscale) and the use of modern standard microelectronic technologies, it will be possible to obtain a large number of neurons and synapses on a single chip. These are our long-time prospects for the future. It means, in fact, that one can "grow" the human brain on a chip. At present, we are doing something on a simpler scale: we are trying to create hybrid electronic circuits where some functions are implemented on the basis of traditional electronics (transistors), and some new functions that are difficult to implement in hardware are realized on the basis of memristors", said Alexey Mikhailov.

Currently, researchers are exploring the possibility of constructing a feedback loop whereby the output signal from the memristor network will be used to stimulate the biological network. Actually, it means that for the first time the process of learning will be realized for a living cell culture. The living culture used by the scientists is an artificially grown neuronal culture of brain cells. In principle, however, one can also use a slice of living tissue.

The aim of the project is to create compact electronic devices based on memristors that reproduce the property of synaptic plasticity and function as part of bio-like neural networks in conjunction with living biological cultures.

The use of hybrid neural networks based on memristors opens up amazing prospects. First, with the help of memristors it will be possible to implement the computing power of modern supercomputers on a single chip. Secondly, it will be possible to create robots that manage an artificially grown neuronal culture. Thirdly, such "brain-like" electronic systems can be used to replace parts of the living nervous system in the event of their damage or disease.

The project's tasks of creating electronic models of artificial neural networks (ANN), as well as the integration of memristive architectures into the systems for recording and processing the activity of living biological neural network structures are fully in line with the current world trends and priorities in the development of neuromorphic systems.

The balance in the combination of different approaches is the key to the successful development and sustainability of the project. The first (and main) approach is to demonstrate the potential of a "traditional" ANN in the form of a two-layer perceptron based on programmable memristive elements. The key advantages of the artificial neural network being developed are, first, its multilayer structure, and hence the ability to solve nonlinear classification problems (based on the shape of the input signal), which is very important when dealing with complex bioelectric activity; and second, the hardware implementation of all artificial network elements on one board, including the memristive synaptic chip, control electronics and neuron circuits. In the future, this arrangement will allow the researchers to implement the adaptive neural interface "living neural network - memristive ANN" as a compact autonomous device.
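
For readers who want a concrete picture, here is a minimal software stand-in (plain NumPy) for a two-layer perceptron that classifies input signals by their shape. It is not the UNN hardware design: in the hardware version the weight matrices W1 and W2 would be stored as memristor conductances on the synaptic chip, which performs the matrix-vector multiplications, and the toy signal shapes below are invented for the example.

```python
# Minimal software stand-in for a two-layer perceptron classifying signal
# shapes; an illustration only, not the UNN memristor hardware. In hardware,
# W1 and W2 would be memristor conductance matrices. Signals are synthetic.
import numpy as np

rng = np.random.default_rng(0)

def make_signal(kind, n=32):
    """Two toy input-signal shapes to classify: a smooth bump vs. a spiky burst."""
    t = np.linspace(0, 1, n)
    if kind == 0:
        sig = np.exp(-((t - 0.5) ** 2) / 0.02)        # smooth bump
    else:
        sig = (np.sin(40 * t) > 0).astype(float)      # spiky burst
    return sig + 0.1 * rng.standard_normal(n)

X = np.stack([make_signal(k % 2) for k in range(200)])   # 200 labeled signals
y = np.array([k % 2 for k in range(200)], dtype=float)

# Two-layer perceptron: 32 inputs -> 16 hidden (tanh) -> 1 output (sigmoid).
W1 = 0.1 * rng.standard_normal((32, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.standard_normal((16, 1));  b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return h, p.ravel()

lr = 1.0
for epoch in range(500):                 # plain full-batch gradient descent
    h, p = forward(X)
    err = p - y                          # gradient of cross-entropy w.r.t. logit
    gW2 = h.T @ err[:, None] / len(X)
    gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, p = forward(X)
print("training accuracy:", ((p > 0.5) == y).mean())
```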

The second approach that the researchers are pursuing in parallel is to find alternative solutions for creating non-traditional neural network architectures in which the stochastic nature and the "live" dynamics of memristive devices play a key role. These features of memristors make it possible to use them for direct processing and analysis of nerve cell activity, as well as for developing plausible physical models of spiking neural networks with self-organization of memristive connections between neurons. These results make an important contribution to the achievement of the project goal and lay the groundwork for the transition to a qualitatively new level in the field of bio-like memristive systems.

 DSI Professors Tian Zheng and Shaw-Hwa Lo with DOT officials.

A team of statisticians from the Data Science Institute (DSI) received a National Science Foundation grant ($900,000) to develop a statistical method that will help researchers who work with big data make better predictions.

The team's method establishes statistical foundations for measuring "predictivity," the ability of a researcher to make predictions based on big data. The novel approach allows researchers to compare their predictions to a theoretical baseline, which will give their predictions greater accuracy. The method will also help statisticians and policy experts contend with complex social problems, for which big data sets are often difficult to assess.

The DSI team, led by DSI professors Shaw-Hwa Lo and Tian Zheng, is collaborating with the New York City Department of Transportation (DOT) on Vision Zero, an initiative to end traffic deaths in the city. DOT collects big data from collisions to analyze the multiple factors that relate to traffic crashes. The potential interactions between the variables and datasets are extremely complex, which led to DOT's interest in working with the DSI team and using its statistical approach.

Lo, a professor of statistics and an affiliate of DSI, said, "we are developing a statistical way to evaluate performance of prediction methods that will be of immense help to DOT. Our method will help DOT identify key combinations of factors and intervention measures to predict where and when crashes are likely to occur."

Statistics can be difficult for the common reader to understand, but in general terms the new method can identify the variable with the highest "predictivity" in large data sets, explained Lo. Current statistical models consider a large number of X variables for predicting a Y variable, and the goal is to select the small number of X variables most helpful for predicting Y. That goal is difficult to reach, however, if the X variables interact in complicated ways. The new method identifies groups of X variables that, once combined, have a stronger ability to predict. Statisticians thus no longer need to apply techniques such as cross validation with the Y variable to evaluate the predictive ability of X variables.
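
The flavor of the approach can be pictured with a toy stand-in, which is not the DSI team's actual statistic: partition the observations by the joint values of a candidate group of X variables and score how strongly the outcome rate varies across the cells of that partition. The miniature "collision" records below are fabricated purely for the example.

```python
# Illustrative stand-in for the idea described above, not the DSI team's
# statistic: score a *group* of discrete X variables by partitioning the
# observations according to the group's joint values and measuring how much
# the outcome rate varies across the cells. The toy records are fabricated.
from collections import defaultdict

records = [
    # (road_type, lighting, signal_present) -> crash severity indicator Y
    (("arterial", "dark", "no"),  1),
    (("arterial", "dark", "no"),  1),
    (("arterial", "day",  "yes"), 0),
    (("local",    "day",  "yes"), 0),
    (("local",    "dark", "no"),  1),
    (("local",    "day",  "no"),  0),
    (("arterial", "day",  "no"),  1),
    (("local",    "dark", "yes"), 0),
]

def group_score(records, idx):
    """Between-cell variation of mean Y when partitioning by X variables `idx`."""
    cells = defaultdict(list)
    for x, y in records:
        cells[tuple(x[i] for i in idx)].append(y)
    overall = sum(y for _, y in records) / len(records)
    # Weight each cell by its size so tiny cells do not dominate the score.
    return sum(len(v) * (sum(v) / len(v) - overall) ** 2 for v in cells.values())

for idx in [(0,), (1,), (2,), (1, 2)]:
    print(idx, round(group_score(records, idx), 3))
# Here the pair (1, 2) scores higher jointly than either variable does alone,
# which is the kind of interaction the new method is designed to surface.
```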

The DSI team will use its new method to help DOT identify risk factors for dangerous roads. It is often difficult to identify the potential risk factors and interactions that lead to the specific crash characteristics of high-crash roadways. The new statistical method, however, will allow DOT to account for all traffic variables, leading to better traffic assessments and enhanced public safety.

"We are excited to collaborate with Professors Lo and Zheng and the Data Science Institute to explore new, innovative research in statistical learning through the analysis of large and diverse transportation and safety datasets," said Seth Hostetter, Director, Safety Analytics and Mapping for DOT. "This is an excellent opportunity to explore the complex interactions between the various risk factors associated with traffic safety that may provide insights that will help us accelerate our progress in achieving the traffic safety goals of Vision Zero."

Zheng, a professor of statistics at Columbia and associate director of education at DSI, said the statistics team is happy to support the work of DOT.

"We are thrilled to be collaborating with DOT on this important project," said Zheng. "Vision Zero aims to end traffic fatalities and we are delighted that DOT is using our new statistical method to further that noble goal."

CAPTION Untangling quadrilateral meshes using locally injective mappings. CREDIT Krishnan Suresh

The supercomputer simulations used to design, optimize, test or control a vast range of objects and products in our daily lives are underpinned by finite element methods.

Finite element simulations use a mesh of geometric shapes -- triangles, tetrahedra, quadrilaterals or hexahedra, for instance. These shapes can be combined to form a mesh that approximates the geometry of a model. For example, meshes can be used to model the human knee in biomechanics simulations, create computer-animated movies or help developers bring products, like airplanes and cars, from concept to production more quickly via better prototypes, testing and development.

"When you were a kid you played with LEGOs and thought about building different projects -- like a house," said Suzanne Shontz, associate professor of electrical engineering & computer science at the University of Kansas. "You were basically stacking blocks and building an object. Meshes are a lot like that -- but they're more flexible than cubes. We're building with things like tetrahedra and hexahedra that you can combine to make different kinds of larger shapes. If you're doing an airplane simulation, you'll know the geometry of the airplane, and that determines with which shapes to build it."

But a problem arises with finite element meshes, especially when they're put into motion during a simulation: The shapes can tangle and overlap.

"The most common context for tangled meshes is a simulation involving motion," Shontz said. "Suppose you have a two-dimensional mesh made of triangles. Now focus on one triangle and its three vertices. If one vertex is moved too far to the left with respect to the vertex to its left, this causes the orientation of the triangle to be flipped and triangles to overlap. A tangled mesh is one that contains elements with a mixture of orientations."

The KU researcher said the use of a tangled mesh in a finite element simulation can lead to inaccurate results -- with potentially disastrous consequences in biomechanical design, product development or large-deformation analysis.

"If you try to run such a simulation, you'll get a physically invalid solution," Shontz said. "That will cause a host of problems. Engineers need accurate solutions in making design decisions. With an airplane, the pilot will make decisions about how to fly the plane in turbulent weather; it's crucial that these decisions are based on correct simulation results regarding the weather and the plane's response. When making important medical decisions, a doctor needs to be able to trust that the simulation results for the disease progression or treatment are correct."

For years, researchers have pursued a solution to the tangled mesh problem, proposing solutions like re-meshing, meshfree methods and the finite cell method. But no definitive answer has yet been developed.

With a new $250,000 award from the National Science Foundation, Shontz and her KU colleagues are working with a team at the University of Wisconsin-Madison headed by Krishnan Suresh, a professor of mechanical engineering, to explore new methods for addressing the tangled mesh problem. Suresh's team received a similar $250,000 award from the National Science Foundation for their research.

Shontz already has developed several promising untangling algorithms, but she said it has proven difficult to "completely" untangle a mesh. Working with Suresh, she said she hopes the collaboration might yield a breakthrough.

Under the new grant, Shontz's group will create new constrained optimization methods for mesh untangling to convert "severely tangled meshes into mildly tangled meshes." In the meantime, Suresh's group will hone the finite-cell method to ensure accurate finite-element solutions over these mildly tangled meshes.

"Our part at KU is to develop a method to untangle meshes so they can be used with standard finite element methods," Shontz said. "At the University of Wisconsin-Madison, they're coming up with a finite-element solver that can work on tangled meshes. We're also looking at a hybrid solution that uses some of their research and some of ours."

Among many biomechanics applications, the researchers hope their work could lead to improved untangling of finite element meshes used to model the brains of patients with hydrocephalus. In these patients, large ventricular displacements of the brain can be modeled with finite-element simulations -- but the models often result in tangled meshes.

"With hydrocephalus, the brain has excess fluid buildup from cerebrospinal fluid," Shontz said. "The brain changes shape due to the excess pressure that usually results within the skull. The idea is to be able to run simulations which will help doctors predict which surgery to perform. However, due to the nonlinear deformations of the brain ventricles, the meshes will often become tangled."

As part of the work under the award, the investigators at KU and UW-Madison will exchange teaching modules in the form of prerecorded lectures to be used in graduate-level classes. Shontz will deliver lectures on mesh generation, smoothing, tangling and untangling to students at UW-Madison, while Suresh will provide lectures to KU students on geometric modeling and computational mechanics.

The researchers will also develop a Design, Analyze and Print Workshop to be offered to middle and high school students on the campuses of the two universities.

Graduate students also will receive support and training via the NSF grant.

"There's funding for graduate-student salaries, faculty summer salary support, and conference travel," Shontz said. "We'll also do student exchanges. I'll send a KU graduate student to Wisconsin for a few weeks, and they'll send a student here, too. They'll get exposed to new ideas, so it's a great opportunity for students at both institutions."

A computer-generated image of the building which will house the Quantum Technologies Innovation Centre

The University of Bristol has announced plans to establish the world’s first open access Quantum Technologies Innovation Centre, focusing on taking quantum research from the lab and into the commercial world.

Experts predict that harnessing the quantum world - the behaviour of matter and energy on the atomic and subatomic level - will revolutionize technology by making it faster, smaller, more secure and, ultimately, more useful for a wide variety of applications.

The Government anticipates that quantum technology will be an industry worth £1 billion to the UK economy in the next 10 years, boosting British business and making a real difference to our everyday lives.

The £43 million Quantum Technologies Innovation Centre (QTIC) has been funded in partnership by £15 million from the West of England Local Enterprise Partnership (LEP), £21 million from industrial partners and £7 million from the University of Bristol. It will be based in the University’s new enterprise campus, to be built in the heart of the city. 

More than 200 researchers at the University will work in partnership with companies to develop the prototypes of tomorrow and play a major role in establishing new quantum businesses.  Importantly, the centre will provide affordable specialist incubation facilities for businesses harnessing the quantum advantage to create new products and services.

Airbus is one of QTIC’s leading industrial partners and it seeks to develop applications in the area of satellite communications secured with quantum physics, to use ultra-powerful quantum supercomputing and to adopt sensing beyond the precision of today’s technology.

In its first 10 years, it’s anticipated the centre will lead to 9,000 new jobs and generate almost £300 million for the economy. It will enable the design, development and prototyping of quantum devices for secure communications, new sensors, simulators and ultra-powerful supercomputers.

These new technologies will impact upon society and all major market sectors, including defence, finance, aerospace, energy and information and communications technology (ICT) in ways that cannot yet be predicted.

The University’s Quantum Information Institute is already working on a new generation of machines that exploit quantum physics to radically transform our lives, society and economy, including:

  • Quantum secure communication systems for individuals, corporations and government.
  • Precision at the quantum limit for sensors used in environmental monitoring, biomedical applications and security.
  • Quantum simulators to design new materials, pharmaceuticals and clean energy devices.
  • Ultra-powerful quantum supercomputers to tackle challenges in big data and machine learning.

The full-scale facility will open in 2021. Once complete, the centre will include a mixture of specialist labs, incubation facilities, office space, meeting rooms and conference facilities to co-locate industrial engineers and entrepreneurs with University researchers.

It will also boast a talent academy to support the training of apprentice technicians through to PhD-qualified quantum engineers and entrepreneurs; an enterprise hub allowing for start-up and early incubation of new businesses; access to a global network of quantum-inspired engineers, scientists, venture capital, industrialists and entrepreneurs; and affordable access to outsourced semiconductor chip fabrication.

The facility will form a key part of the new £300 million enterprise campus next to Bristol Temple Meads train station, sitting alongside research and teaching in the fields of data analytics, cybersecurity, communications and networks, digital health, smart cities, transport, robotics and autonomous systems, and creative digital technologies.

Mustafa Rampuri, Programme Manager for QTIC, said: “QTIC is the world’s first dedicated open access innovation centre facility for developing a broad spectrum of quantum technologies. It provides pay-as-you-go incubator labs and office space, access to state of the art equipment, supported by experts in a range of business, technology and manufacturing areas.

“It’s an ideal time to take these technologies out of the lab and engineer them into commercial products and services, ensuring that the UK and Bristol region is the epicentre of a global quantum revolution.

“The opportunities are vast and very exciting. Our aim is for the facility to be an internationally recognised centre for the engineering and commercialisation of practical integrated quantum technologies, enabling companies from any sector to co-create new products and exploit the quantum advantage.”

Paolo Bianco, R&T Co-operation Manager at Airbus, said: “We are looking forward to working with QTIC and the University of Bristol on quantum topics to support and establish a supply chain for these technologies, essential for our future ability to bring quantum enhanced platforms to market.

“QTIC’s work with the SME community aligns perfectly with Airbus’ aspiration to work with a variety of partners to develop such technologies. Our aim is to eventually adopt these in the Airbus portfolio to generate new products and supply customers with leading-edge capabilities which grow and future-proof our business.”

Professor Hugh Brady, Vice-Chancellor and President at the University of Bristol, said: “The new Quantum Technologies Innovation Centre embodies our vision for the new campus – a place where we will be working with partners, large and small, to co-create new technologies and bring exciting new ideas to fruition, while building a talent pipeline of graduates who embrace social responsibility as well as opportunity.

“The opportunities presented by quantum technology are endless, with the potential to bring far-reaching benefits to society.  With Bristol recently being named as the UK’s smartest city, I cannot think of a better city to lead the way in this exciting field of research and discovery.”

The Government has shown its commitment to making the UK a ‘go-to’ place for the development and commercialisation of quantum technologies, investing £270 million over five years into a National Quantum Technologies Programme to accelerate the translation of quantum technologies into the marketplace.

The University of Bristol is a major partner in two hubs and predicts it will help to establish over 40 new quantum businesses as a result, which in turn will benefit from the facilities on offer at QTIC.

University of Utah physics and astronomy Distinguished Professor Valy Vardeny, left, and University of Utah electrical and computer engineering professor Ajay Nahata have discovered that a special kind of perovskite, a combination of an organic and inorganic compound that has the same structure as the original mineral, can be layered on a silicon wafer to create a vital component for the communications system of the future. That system would use the terahertz spectrum, the next generation of communications bandwidth that uses light instead of electricity to shuttle data, allowing cellphone and internet users to transfer information a thousand times faster than today. CREDIT Dan Hixon/College of Engineering

A mineral discovered in Russia in the 1830s known as a perovskite holds a key to the next step in ultra-high-speed communications and supercomputing.

Researchers from the University of Utah's departments of electrical and computer engineering and physics and astronomy have discovered that a special kind of perovskite, a combination of an organic and inorganic compound that has the same structure as the original mineral, can be layered on a silicon wafer to create a vital component for the communications system of the future. That system would use the terahertz spectrum, the next generation of communications bandwidth that uses light instead of electricity to shuttle data, allowing cellphone and internet users to transfer information a thousand times faster than today. 

The new research, led by University of Utah electrical and computer engineering professor Ajay Nahata and physics and astronomy Distinguished Professor Valy Vardeny, was published Monday, Nov. 6 in the latest edition of Nature Communications.

The terahertz range is a band between infrared light and radio waves and utilizes frequencies that cover the range from 100 gigahertz to 10,000 gigahertz (a typical cellphone operates at just 2.4 gigahertz). Scientists are studying how to use these light frequencies to transmit data because of their tremendous potential for boosting the speeds of devices such as internet modems or cell phones.

Nahata and Vardeny uncovered an important piece of that puzzle: By depositing a special form of multilayer perovskite onto a silicon wafer, they can modulate terahertz waves passing through it using a simple halogen lamp. Modulating the amplitude of terahertz radiation is important because it is how data in such a communications system would be transmitted.

Previous attempts to do this have usually required the use of an expensive, high-power laser. What makes this demonstration different is that it is not only the lamp power that allows for this modulation but also the specific color of the light. Consequently, they can put different perovskites on the same silicon substrate, where each region could be controlled by different colors from the lamp. This is not easily possible when using conventional semiconductors like silicon.

"Think of it as the difference between something that is binary versus something that has 10 steps," Nahata explains about what this new structure can do. "Silicon responds only to the power in the optical beam but not to the color. It gives you more capabilities to actually do something, say for information processing or whatever the case may be."

Not only does this open the door to turning terahertz technologies into a reality -- resulting in next-generation communications systems and computing that is a thousand times faster -- but the process of layering perovskites on silicon is also simple and inexpensive, using a method called "spin casting," in which the material is deposited on the silicon wafer by spinning the wafer and allowing centrifugal force to spread the perovskite evenly.

Vardeny says what's unique about the type of perovskite they are using is that it is both an inorganic material like rock but also organic like a plastic, making it easy to deposit on silicon while also having the optical properties necessary to make this process possible.

"It's a mismatch," he said. "What we call a 'hybrid.'"

Nahata says it's probably at least another 10 years before terahertz technology for communications and computing is used in commercial products, but this new research is a significant milestone to getting there.

"This basic capability is an important step towards getting a full-fledged communications system," Nahata says. "If you want to go from what you're doing today using a modem and standard wireless communications, and then go to a thousand times faster, you're going to have to change the technology dramatically."

Embedding a decision support tool in the hospital electronic health record increases detection of acute kidney injury, reducing its severity and improving survival, according to new research from the University of Pittsburgh and UPMC.

The results, published today in the Journal of the American Society of Nephrology, address one of the most costly and deadly conditions affecting hospitalized patients, providing evidence that supercomputers analyzing changes in renal function can alert doctors of acute kidney injury before the condition is obvious clinically.

"Acute kidney injury strikes one in eight hospitalized patients and, if unchecked, it can lead to serious complications, including the need for dialysis and even death," said senior author John Kellum, M.D., professor of critical care medicine and director of the Center for Critical Care Nephrology at Pitt's School of Medicine. "Our analysis shows that implementation of a clinical decision support system was associated with lower mortality, less need for dialysis and reduced length of hospital stay for patients diagnosed with acute kidney injury, among other benefits."

Acute kidney injury is common in hospitalized patients, particularly those in intensive care units and older adults, and refers to a sudden episode of kidney failure or damage that happens within a few hours or days. It causes a build-up of waste products in the blood that can affect other organs, including the brain, heart and lungs.

While kidney function is monitored using simple blood tests, subtle changes can elude or delay detection of a problem. Failure to recognize and manage acute kidney injury in the early stages can lead to devastating outcomes for patients and increased costs to the health care system. Benefits of earlier detection of acute kidney injury include earlier intervention to mitigate loss of kidney function, and reduced hospital and long-term health care costs as a result of avoiding progression to severe and permanent kidney damage.

In 2013, Kellum's team released a supercomputer program within the electronic health record system across 14 UPMC hospitals. The program monitored levels of blood creatinine, a standard measure of kidney function, over time and analyzed changes in those levels. If the levels rose too high or fast, the program fired an alert in the patient's electronic health record informing doctors that acute kidney injury could be present. It also helped determine the stage of injury based on changes from the patient's baseline kidney function.
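
As an illustration of the kind of creatinine-trend rule such a system applies, the sketch below flags a patient when creatinine rises by at least 0.3 mg/dL within 48 hours or reaches 1.5 times the baseline, loosely following the widely used KDIGO creatinine criteria. It is not the UPMC implementation, and the staging cutoffs, example values and timestamps are assumptions made for the example.

```python
# Illustrative sketch of a creatinine-trend alert rule; NOT the UPMC program.
# Thresholds loosely follow the widely used KDIGO creatinine criteria
# (>= 0.3 mg/dL rise within 48 h, or >= 1.5x the patient's baseline).
from datetime import datetime, timedelta

def aki_alert(baseline, readings, window_48h=timedelta(hours=48)):
    """Return (stage, message) if the creatinine series suggests AKI, else None.

    `readings` is a list of (timestamp, creatinine in mg/dL), oldest first.
    """
    for i, (t_now, cr_now) in enumerate(readings):
        ratio = cr_now / baseline
        # Ratio-based staging relative to the patient's baseline.
        if ratio >= 3.0 or cr_now >= 4.0:
            return 3, f"{t_now:%m-%d %H:%M}: creatinine {cr_now} mg/dL (stage 3)"
        if ratio >= 2.0:
            return 2, f"{t_now:%m-%d %H:%M}: creatinine {cr_now} mg/dL (stage 2)"
        if ratio >= 1.5:
            return 1, f"{t_now:%m-%d %H:%M}: creatinine {cr_now} mg/dL (stage 1)"
        # A rise of >= 0.3 mg/dL versus any reading in the last 48 h also triggers.
        for t_prev, cr_prev in readings[:i]:
            if t_now - t_prev <= window_48h and cr_now - cr_prev >= 0.3:
                return 1, f"{t_now:%m-%d %H:%M}: rose {cr_now - cr_prev:.1f} mg/dL in <48 h"
    return None

readings = [
    (datetime(2017, 11, 20, 8), 0.9),
    (datetime(2017, 11, 21, 8), 1.1),
    (datetime(2017, 11, 21, 20), 1.3),   # +0.4 mg/dL within 48 h -> alert fires
]
print(aki_alert(baseline=0.9, readings=readings))
```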

To determine what effect, if any, the supercomputer program was having on physician behavior and patient outcomes, Kellum and his colleagues analyzed records from more than half a million patients admitted to UPMC. They started a year before the alert system was deployed, and continued for two years after. Patients with acute kidney injury had a small yet sustained decrease in hospital mortality of 0.8 percent, 0.3-day shorter length of stay and a decrease of 2.7 percent in dialysis rates. Even after adjusting for age and severity of illness, these changes remained highly significant.

In absolute terms, the changes are small, but given the annual frequency of acute kidney injury in hospitalized U.S. patients of about 12 percent - or 2.2 million people - these results would translate into more than 17,000 lives and $1.2 billion saved per year.

"Ultimately, we see this as confirmation that a fairly simple clinical decision support system can make a difference," said co-author Richard Ambrosino, M.D., Ph.D., medical director of clinical decision support and reporting at UPMC's eRecord. "More sophisticated systems are possible and should have an even greater impact."

Kellum, who also is associate director for acute illness at Pitt's Institute for Precision Medicine, plans to make improvements to the clinical decision support application in the future.

"Working with pharmacists to adjust patient medications and machine-learning experts to better predict which patients will be at greatest risk for adverse events, my team and I hope to make an even greater impact on patient outcomes," said Kellum. "Incorporating protein biomarkers and even genomics into the system could one day revolutionize patient care, not just for acute kidney injury, but for other illnesses."

  1. American University prof builds models to help solve few-body problems in physics
  2. UW prof helps supercompute activity around quasars, black holes
  3. Nottingham's early warning health, welfare system could save UK cattle farmers millions of pounds, reduce antibiotic use
  4. Osaka university researchers roll the dice on perovskite interfaces
  5. UM biochemist Prabhakar produces discovery that lights path for Alzheimer's research
  6. Tafti lab creates an elusive material to produce a quantum spin liquid
  7. Purdue develops intrachip micro-cooling system for supercomputers
  8. Northeastern University, China's Xu develops machine learning system to identify shapes of red blood cells
  9. SDSU prof Vaidya produces models for HIV drug pharmacodynamics
  10. Los Alamos supercomputers help interpret the latest LIGO findings
  11. Emerson acquires Paradigm
  12. Chinese scientists discover more than 600 new periodic orbits of the famous three body problem
  13. KU Leuven computational biologists develop supercomputer program that detects differences between human cells
  14. Seeing the next dimension of computer chips
  15. NOAA scientists produce new insights into how global warming is drying up the North American monsoon
  16. Paradigm launches cloud-based production management solution
  17. SEAS researchers add zero-index waveguide to photonics toolbox
  18. NICT demos world record 53.3 Tb/s switching capacity for data center networks
  19. AI set to revolutionize retail banking, says GlobalData
  20. China builds world's first space-ground integrated quantum communication network
