CAPTION This is the prototype of an artificial neural network based on a hybrid analog-digital electronic circuit and a memristive chip. CREDIT Elena Emel'janova

Living cell culture learning process to be implemented for the first time 

Lobachevsky University scientists under the supervision of Alexey Mikhailov, Head of the UNN PTRI Laboratory of Thin Film Physics and Technology, are working to develop an adaptive neural interface that combines, on the one hand, a living culture, and on the other, a neural network based on memristors. This project is one of the first attempts to combine a living biological culture with a bio-like neural network based on memristors. The memristor neural network will be linked to a multi-electrode system that records and stimulates the bioelectrical activity of a neuron culture, and will perform the function of analyzing and classifying the network dynamics of the living cells.

Compared with international competitors that set themselves the task of "connecting the living world and artificial architectures" (for example, the RAMP project), the UNN project has the advantage that highly skilled experts in all the relevant fields (including the physics and technology of memristive nanostructures, neural network modeling, electronic circuit design, neurodynamics and neurobiology) are concentrated, both physically and organizationally, within a single university.

According to Alexey Mikhailov, UNN scientists are now working to create a neural network prototype based on memristors, which is similar to a biological nervous system with regard to its internal structure and functionality.

"Due to the locality of the memristive effect (such phenomena occur at the nanoscale) and the use of modern standard microelectronic technologies, it will be possible to obtain a large number of neurons and synapses on a single chip. These are our long-term prospects. It means, in fact, that one can "grow" the human brain on a chip. At present, we are doing something on a simpler scale: we are trying to create hybrid electronic circuits where some functions are implemented on the basis of traditional electronics (transistors), and some new functions that are difficult to implement in hardware are realized on the basis of memristors", said Alexey Mikhailov.

Currently, researchers are exploring the possibility of constructing a feedback loop whereby the output signal from the memristor network is used to stimulate the biological network. This means, in effect, that a learning process will be realized in a living cell culture for the first time. The living culture used by the scientists is an artificially grown neuronal culture of brain cells; in principle, however, one could also use a slice of living tissue.

The aim of the project is to create compact electronic devices based on memristors that reproduce the property of synaptic plasticity and function as part of bio-like neural networks in conjunction with living biological cultures.

The use of hybrid neural networks based on memristors opens up amazing prospects. First, with the help of memristors it will be possible to implement the computing power of modern supercomputers on a single chip. Secondly, it will be possible to create robots that manage an artificially grown neuronal culture. Thirdly, such "brain-like" electronic systems can be used to replace parts of the living nervous system in the event of their damage or disease.

The project's tasks of creating electronic models of artificial neural networks (ANN), as well as the integration of memristive architectures into the systems for recording and processing the activity of living biological neural network structures are fully in line with the current world trends and priorities in the development of neuromorphic systems.

A balanced combination of different approaches is key to the successful development and sustainability of the project. The first (and main) of these approaches is to demonstrate the potential of a "traditional" ANN in the form of a two-layer perceptron based on programmable memristive elements. The key advantages of the artificial neural network being developed are, first, its multilayer structure, and hence its ability to solve nonlinear classification problems (based on the shape of the input signal), which is very important when dealing with complex bioelectric activity, and second, the hardware implementation of all artificial network elements on one board, including the memristive synaptic chip, control electronics and neuron circuits. In the future, this arrangement will allow the researchers to implement the adaptive neural interface "living neural network - memristive ANN" as a compact autonomous device.

The second approach, which the researchers are pursuing in parallel, is to find alternative solutions for creating non-traditional neural network architectures in which the stochastic nature and "live" dynamics of memristive devices play a key role. These features of memristors make it possible to use them for direct processing and analysis of nerve cell activity, as well as for developing plausible physical models of spiking neural networks with self-organization of memristive connections between neurons. These results make an important contribution to the achievement of the project goal and lay the groundwork for a transition to a qualitatively new level in the field of bio-like memristive systems.

 DSI Professors Tian Zheng and Shaw-Hwa Lo with DOT officials.

A team of statisticians from the Data Science Institute (DSI) received a National Science Foundation grant ($900,000) to develop a statistical method that will help researchers who work with big data make better predictions.

The team's method establishes statistical foundations for measuring "predictivity," the ability of a researcher to make predictions based on big data. The novel approach allows researchers to compare their predictions to a theoretical baseline, which will give their predictions greater accuracy. The method will also help statisticians and policy experts contend with complex social problems, for which big data sets are often difficult to assess.

The DSI team, led by DSI professors Shaw-Hwa Lo and Tian Zheng, is collaborating with the New York City Department of Transportation (DOT) on Vision Zero, an initiative to end traffic deaths in the city. DOT collects big data from collisions to analyze the multiple factors that relate to traffic crashes. The potential interactions between the variables and datasets are extremely complex, which led to DOT's interest in working with the DSI team and using its statistical approach.

Lo, a professor of statistics and an affiliate of DSI, said, "We are developing a statistical way to evaluate the performance of prediction methods that will be of immense help to DOT. Our method will help DOT identify key combinations of factors and intervention measures to predict where and when crashes are likely to occur."

Statistics can be difficult for the general reader to follow, but in broad terms the new method identifies the variables with the highest "predictivity" in large data sets, explained Lo. Current statistical models consider a large number of X variables for predicting a Y variable, and the goal is to select the small number of X variables most helpful for predicting Y. That goal is difficult to reach, however, if the X variables interact in complicated ways. The new method instead identifies groups of X variables that, once combined, have a stronger ability to predict. Statisticians thus no longer need to apply techniques such as cross-validation against the Y variable to evaluate the predictive ability of X variables.
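The value of scoring groups of variables jointly can be sketched with a toy example. The snippet below uses an illustrative partition-based score (a simplified stand-in, not the team's published method): Y depends on two binary variables only through their interaction, so each variable looks useless on its own, yet the pair scores highest.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: Y is the XOR of binary variables X0 and X1; X2 is pure noise.
n = 2000
X = rng.integers(0, 2, size=(n, 3))
Y = (X[:, 0] ^ X[:, 1]).astype(float)

def partition_score(X, Y, subset):
    """Score a subset of discrete X variables by how strongly the mean of Y
    varies across the cells of their joint partition (an illustrative
    analogue of an influence-type score)."""
    ybar = Y.mean()
    cells = {}
    for row, y in zip(X[:, subset], Y):
        cells.setdefault(tuple(row), []).append(y)
    n_total = len(Y)
    return sum(len(v) ** 2 * (np.mean(v) - ybar) ** 2
               for v in cells.values()) / n_total ** 2

# Score every single variable and every pair.
scores = {s: partition_score(X, Y, list(s))
          for k in (1, 2)
          for s in itertools.combinations(range(3), k)}

# X0 and X1 look useless individually; the pair {X0, X1} stands out.
best = max(scores, key=scores.get)
print(best)  # (0, 1)
```

No cross-validation against Y is needed here: the score is computed directly from how the mean of Y varies across the cells carved out by the candidate group.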

The DSI team will use its new method to help DOT identify risk factors for dangerous roads. It is often difficult to identify the potential risk factors and interactions that lead to the specific crash characteristics of high-crash roadways. The new statistical method, however, will allow DOT to account for all traffic variables, leading to better traffic assessments and enhanced public safety.

"We are excited to collaborate with Professors Lo and Zheng and the Data Science Institute to explore new, innovative research in statistical learning through the analysis of large and diverse transportation and safety datasets," said Seth Hostetter, Director, Safety Analytics and Mapping for DOT. "This is an excellent opportunity to explore the complex interactions between the various risk factors associated with traffic safety that may provide insights that will help us accelerate our progress in achieving the traffic safety goals of Vision Zero."

Zheng, a professor of statistics at Columbia and associate director of education at DSI, said the statistics team is happy to support the work of DOT.

"We are thrilled to be collaborating with DOT on this important project," said Zheng. "Vision Zero aims to end traffic fatalities and we are delighted that DOT is using our new statistical method to further that noble goal."

CAPTION Untangling quadrilateral meshes using locally injective mappings. CREDIT Krishnan Suresh

The supercomputer simulations used to design, optimize, test or control a vast range of objects and products in our daily lives are underpinned by finite element methods.

Finite element simulations use a mesh of geometric shapes -- triangles, tetrahedra, quadrilaterals or hexahedra, for instance. These shapes can be combined to form a mesh that approximates the geometry of a model. For example, meshes can be used to model the human knee in biomechanics simulations, create computer-animated movies or help developers bring products, like airplanes and cars, from concept to production more quickly via better prototypes, testing and development.

"When you were a kid you played with LEGOs and thought about building different projects -- like a house," said Suzanne Shontz, associate professor of electrical engineering & computer science at the University of Kansas. "You were basically stacking blocks and building an object. Meshes are a lot like that -- but they're more flexible than cubes. We're building with things like tetrahedra and hexahedra that you can combine to make different kinds of larger shapes. If you're doing an airplane simulation, you'll know the geometry of the airplane, and that determines with which shapes to build it."

But a problem arises with finite element meshes, especially when they're put into motion during a simulation: The shapes can tangle and overlap.

"The most common context for tangled meshes is a simulation involving motion," Shontz said. "Suppose you have a two-dimensional mesh made of triangles. Now focus on one triangle and its three vertices. If one vertex is moved too far to the left with respect to the vertex to its left, this causes the orientation of the triangle to be flipped and triangles to overlap. A tangled mesh is one that contains elements with a mixture of orientations."
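The flipped-orientation test Shontz describes can be made concrete with the signed area of a triangle. The sketch below is a minimal 2-D illustration (not the group's actual mesh code): a mesh is flagged as tangled when its elements have mixed orientations.

```python
def signed_area(a, b, c):
    """Twice the signed area of triangle (a, b, c): positive when the
    vertices wind counter-clockwise, negative when the triangle is flipped."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def is_tangled(triangles, vertices):
    """A mesh is tangled if its elements have a mixture of orientations."""
    orientations = {signed_area(*(vertices[i] for i in tri)) > 0
                    for tri in triangles}
    return len(orientations) > 1

vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0), (1.5, 1.0)]
triangles = [(0, 1, 2), (1, 3, 2)]
print(is_tangled(triangles, vertices))  # False: both counter-clockwise

# Drag vertex 1 far to the left, past vertex 0: triangle (0, 1, 2) flips.
vertices[1] = (-1.0, 0.0)
print(is_tangled(triangles, vertices))  # True
```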

The KU researcher said the use of a tangled mesh in a finite element simulation can lead to inaccurate results -- with potentially disastrous consequences in biomechanical design, product development or large-deformation analysis.

"If you try to run such a simulation, you'll get a physically invalid solution," Shontz said. "That will cause a host of problems. Engineers need accurate solutions in making design decisions. With an airplane, the pilot will make decisions about how to fly the plane in turbulent weather; it's crucial that these decisions are based on correct simulation results regarding the weather and the plane's response. When making important medical decisions, a doctor needs to be able to trust that the simulation results for the disease progression or treatment are correct."

For years, researchers have sought a solution to the tangled mesh problem, proposing approaches like re-meshing, meshfree methods and the finite cell method. But no definitive answer has yet been developed.

With a new $250,000 award from the National Science Foundation, Shontz and her KU colleagues are working with a team at the University of Wisconsin-Madison headed by Krishnan Suresh, a professor of mechanical engineering, to explore new methods for addressing the tangled mesh problem. Suresh's team received a similar $250,000 award from the National Science Foundation for their research.

Shontz already has developed several promising untangling algorithms, but she said it has proven difficult to "completely" untangle a mesh. Working with Suresh, she said she hopes the collaboration might yield a breakthrough.

Under the new grant, Shontz's group will create new constrained optimization methods for mesh untangling to convert "severely tangled meshes into mildly tangled meshes." In the meantime, Suresh's group will hone the finite-cell method to ensure accurate finite-element solutions over these mildly tangled meshes.

"Our part at KU is to develop a method to untangle meshes so they can be used with standard finite element methods," Shontz said. "At the University of Wisconsin-Madison, they're coming up with a finite-element solver that can work on tangled meshes. We're also looking at a hybrid solution that uses some of their research and some of ours."

Among many biomechanics applications, the researchers hope their work could lead to improved untangling of finite element meshes used to model the brains of patients with hydrocephalus. In these patients, large ventricular displacements of the brain can be modeled with finite-element simulations -- but the models often result in tangled meshes.

"With hydrocephalus, the brain has excess fluid buildup from cerebrospinal fluid," Shontz said. "The brain changes shape due to the excess pressure that usually results within the skull. The idea is to be able to run simulations which will help doctors predict which surgery to perform. However, due to the nonlinear deformations of the brain ventricles, the meshes will often become tangled."

As part of the work under the award, the investigators at KU and UW-Madison will exchange teaching modules in the form of prerecorded lectures to be used in graduate-level classes. Shontz will deliver lectures on mesh generation, smoothing, tangling and untangling to students at UW-Madison, while Suresh will provide lectures to KU students on geometric modeling and computational mechanics.

The researchers will also develop a Design, Analyze and Print Workshop to be offered to middle and high school students on the campuses of the two universities.

Graduate students also will receive support and training via the NSF grant.

"There's funding for graduate-student salaries, faculty summer salary support, and conference travel," Shontz said. "We'll also do student exchanges. I'll send a KU graduate student to Wisconsin for a few weeks, and they'll send a student here, too. They'll get exposed to new ideas, so it's a great opportunity for students at both institutions."

A computer-generated image of the building which will house the Quantum Technologies Innovation Centre

The University of Bristol has announced plans to establish the world’s first open access Quantum Technologies Innovation Centre, focusing on taking quantum research from the lab and into the commercial world.

Experts predict that harnessing the quantum world - the behaviour of matter and energy on the atomic and subatomic level - will revolutionize technology by making it faster, smaller, more secure and, ultimately, more useful for a wide variety of applications.

The Government anticipates that quantum technology will be an industry worth £1 billion to the UK economy in the next 10 years, boosting British business and making a real difference to our everyday lives.

The £43 million Quantum Technologies Innovation Centre (QTIC) has been funded in partnership by £15 million from the West of England Local Enterprise Partnership (LEP), £21 million from industrial partners and £7 million from the University of Bristol. It will be based in the University’s new enterprise campus, to be built in the heart of the city. 

More than 200 researchers at the University will work in partnership with companies to develop the prototypes of tomorrow and play a major role in establishing new quantum businesses.  Importantly, the centre will provide affordable specialist incubation facilities for businesses harnessing the quantum advantage to create new products and services.

Airbus is one of QTIC’s leading industrial partners; it seeks to develop applications in satellite communications secured with quantum physics, to use ultra-powerful quantum computing and to adopt sensing beyond the precision of today’s technology.

In its first 10 years, it’s anticipated the centre will lead to 9,000 new jobs and generate almost £300 million for the economy. It will enable the design, development and prototyping of quantum devices for secure communications, new sensors, simulators and ultra-powerful quantum computers.

These new technologies will impact upon society and all major market sectors, including defence, finance, aerospace, energy and information and communications technology (ICT) in ways that cannot yet be predicted.

The University’s Quantum Information Institute is already working on a new generation of machines that exploit quantum physics to radically transform our lives, society and economy, including:

  • Quantum secure communication systems for individuals, corporations and government.
  • Precision at the quantum limit for sensors used in environmental monitoring, biomedical applications and security.
  • Quantum simulators to design new materials, pharmaceuticals and clean energy devices.
  • Ultra-powerful quantum computers to tackle challenges in big data and machine learning.

The full-scale facility will open in 2021. Once complete, the centre will include a mixture of specialist labs, incubation facilities, office space, meeting rooms and conference facilities to co-locate industrial engineers and entrepreneurs with University researchers.

It will also boast a talent academy to support training from apprentice technicians through to PhD-qualified quantum engineers and entrepreneurs; an enterprise hub allowing for start-up and early incubation of new businesses; access to a global network of quantum-inspired engineers, scientists, venture capitalists, industrialists and entrepreneurs; and affordable access to outsourced semiconductor chip fabrication.

The facility will form a key part of the new £300 million enterprise campus next to Bristol Temple Meads train station, sitting alongside research and teaching in the fields of data analytics, cybersecurity, communications and networks, digital health, smart cities, transport, robotics and autonomous systems, and creative digital technologies.

Mustafa Rampuri, Programme Manager for QTIC, said: “QTIC is the world’s first dedicated open access innovation centre facility for developing a broad spectrum of quantum technologies. It provides pay-as-you-go incubator labs and office space, access to state of the art equipment, supported by experts in a range of business, technology and manufacturing areas.

“It’s an ideal time to take these technologies out of the lab and engineer them into commercial products and services, ensuring that the UK and the Bristol region are the epicentre of a global quantum revolution.

“The opportunities are vast and very exciting. Our aim is for the facility to be an internationally recognised centre for the engineering and commercialisation of practical integrated quantum technologies, enabling companies from any sector to co-create new products and exploit the quantum advantage.”

Paolo Bianco, R&T Co-operation Manager at Airbus, said: “We are looking forward to working with QTIC and the University of Bristol on quantum topics to support and establish a supply chain for these technologies, essential for our future ability to bring quantum enhanced platforms to market.

“QTIC’s work with the SME community aligns perfectly with Airbus’ aspiration to work with a variety of partners to develop such technologies. Our aim is to eventually adopt these in the Airbus portfolio to generate new products and supply customers with leading-edge capabilities which grow and future-proof our business.”

Professor Hugh Brady, Vice-Chancellor and President at the University of Bristol, said: “The new Quantum Technologies Innovation Centre embodies our vision for the new campus – a place where we will be working with partners, large and small, to co-create new technologies and bring exciting new ideas to fruition, while building a talent pipeline of graduates who embrace social responsibility as well as opportunity.

“The opportunities presented by quantum technology are endless, with the potential to bring far-reaching benefits to society.  With Bristol recently being named as the UK’s smartest city, I cannot think of a better city to lead the way in this exciting field of research and discovery.”

The Government has shown its commitment to making the UK a ‘go-to’ place for the development and commercialisation of quantum technologies, investing £270 million over five years into a National Quantum Technologies Programme to accelerate the translation of quantum technologies into the marketplace.

The University of Bristol is a major partner in two hubs and predicts it will help to establish over 40 new quantum businesses as a result, which in turn will benefit from the facilities on offer at QTIC.

University of Utah physics and astronomy Distinguished Professor Valy Vardeny, left, and University of Utah electrical and computer engineering professor Ajay Nahata have discovered that a special kind of perovskite, a combination of an organic and inorganic compound that has the same structure as the original mineral, can be layered on a silicon wafer to create a vital component for the communications system of the future. That system would use the terahertz spectrum, the next generation of communications bandwidth that uses light instead of electricity to shuttle data, allowing cellphone and internet users to transfer information a thousand times faster than today. CREDIT Dan Hixon/College of Engineering

A mineral discovered in Russia in the 1830s, known as perovskite, holds a key to the next step in ultra-high-speed communications and computing.

Researchers from the University of Utah's departments of electrical and computer engineering and physics and astronomy have discovered that a special kind of perovskite, a combination of an organic and inorganic compound that has the same structure as the original mineral, can be layered on a silicon wafer to create a vital component for the communications system of the future. That system would use the terahertz spectrum, the next generation of communications bandwidth that uses light instead of electricity to shuttle data, allowing cellphone and internet users to transfer information a thousand times faster than today. 

The new research, led by University of Utah electrical and computer engineering professor Ajay Nahata and physics and astronomy Distinguished Professor Valy Vardeny, was published Monday, Nov. 6 in the latest edition of Nature Communications.

The terahertz range is a band between infrared light and radio waves and utilizes frequencies that cover the range from 100 gigahertz to 10,000 gigahertz (a typical cellphone operates at just 2.4 gigahertz). Scientists are studying how to use these light frequencies to transmit data because of their tremendous potential for boosting the speeds of devices such as internet modems or cellphones.

Nahata and Vardeny uncovered an important piece of that puzzle: By depositing a special form of multilayer perovskite onto a silicon wafer, they can modulate terahertz waves passing through it using a simple halogen lamp. Modulating the amplitude of terahertz radiation is important because it is how data in such a communications system would be transmitted.

Previous attempts to do this have usually required the use of an expensive, high-power laser. What makes this demonstration different is that it is not only the lamp power that allows for this modulation but also the specific color of the light. Consequently, they can put different perovskites on the same silicon substrate, where each region could be controlled by different colors from the lamp. This is not easily possible when using conventional semiconductors like silicon.

"Think of it as the difference between something that is binary versus something that has 10 steps," Nahata explains about what this new structure can do. "Silicon responds only to the power in the optical beam but not to the color. It gives you more capabilities to actually do something, say for information processing or whatever the case may be."

Not only does this open the door to turning terahertz technologies into a reality -- resulting in next-generation communications systems and computing that is a thousand times faster -- but the process of layering perovskites on silicon is also simple and inexpensive, using a method called "spin casting," in which the material is deposited on the silicon wafer by spinning the wafer and allowing centrifugal force to spread the perovskite evenly.

Vardeny says what's unique about the type of perovskite they are using is that it is both an inorganic material like rock but also organic like a plastic, making it easy to deposit on silicon while also having the optical properties necessary to make this process possible.

"It's a mismatch," he said. "What we call a 'hybrid.'"

Nahata says it's probably at least another 10 years before terahertz technology for communications and computing is used in commercial products, but this new research is a significant milestone to getting there.

"This basic capability is an important step towards getting a full-fledged communications system," Nahata says. "If you want to go from what you're doing today using a modem and standard wireless communications, and then go to a thousand times faster, you're going to have to change the technology dramatically."

Embedding a decision support tool in the hospital electronic health record increases detection of acute kidney injury, reducing its severity and improving survival, according to new research from the University of Pittsburgh and UPMC.

The results, published today in the Journal of the American Society of Nephrology, address one of the most costly and deadly conditions affecting hospitalized patients, providing evidence that computers analyzing changes in renal function can alert doctors to acute kidney injury before the condition is clinically obvious.

"Acute kidney injury strikes one in eight hospitalized patients and, if unchecked, it can lead to serious complications, including the need for dialysis and even death," said senior author John Kellum, M.D., professor of critical care medicine and director of the Center for Critical Care Nephrology at Pitt's School of Medicine. "Our analysis shows that implementation of a clinical decision support system was associated with lower mortality, less need for dialysis and reduced length of hospital stay for patients diagnosed with acute kidney injury, among other benefits."

Acute kidney injury is common in hospitalized patients, particularly those in intensive care units and older adults, and refers to a sudden episode of kidney failure or damage that happens within a few hours or days. It causes a build-up of waste products in the blood that can affect other organs, including the brain, heart and lungs.

While kidney function is monitored using simple blood tests, subtle changes can elude or delay detection of a problem. Failure to recognize and manage acute kidney injury in the early stages can lead to devastating outcomes for patients and increased costs to the health care system. Benefits of earlier detection of acute kidney injury include earlier intervention to mitigate loss of kidney function, and reduced hospital and long-term health care costs as a result of avoiding progression to severe and permanent kidney damage.

In 2013, Kellum's team released a computer program within the electronic health record system across 14 UPMC hospitals. The program monitored levels of blood creatinine, a standard measure of kidney function, over time and analyzed changes in those levels. If the levels rose too high or too fast, the program fired an alert in the patient's electronic health record informing doctors that acute kidney injury could be present. It also helped determine the stage of injury based on changes from the patient's baseline kidney function.
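The article does not spell out the exact firing rules, only that the program alerted when creatinine rose too high or too fast. The sketch below uses the standard KDIGO consensus thresholds for serum creatinine as an assumed stand-in (simplified to omit urine-output criteria); it is not the UPMC system's actual logic.

```python
def aki_stage(baseline, current, rise_48h):
    """Stage acute kidney injury from serum creatinine (mg/dL) using
    KDIGO-style thresholds: stage 1 for a rise of >= 0.3 mg/dL within
    48 hours or >= 1.5x baseline, stage 2 for >= 2x baseline, and
    stage 3 for >= 3x baseline or creatinine >= 4.0 mg/dL.
    Simplified for illustration; urine-output criteria are omitted."""
    ratio = current / baseline
    if ratio >= 3.0 or current >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or rise_48h >= 0.3:
        return 1
    return 0  # no alert fired

print(aki_stage(baseline=0.9, current=1.0, rise_48h=0.1))  # 0
print(aki_stage(baseline=0.9, current=1.4, rise_48h=0.5))  # 1
print(aki_stage(baseline=0.9, current=2.0, rise_48h=1.1))  # 2
```

A monitoring loop of this kind would re-evaluate the stage each time a new creatinine result lands in the patient's record and write an alert whenever the returned stage is nonzero.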

To determine what effect, if any, the computer program was having on physician behavior and patient outcomes, Kellum and his colleagues analyzed records from more than half a million patients admitted to UPMC. They started a year before the alert system was deployed, and continued for two years after. Patients with acute kidney injury had a small yet sustained decrease in hospital mortality of 0.8 percent, a 0.3-day shorter length of stay and a 2.7 percent decrease in dialysis rates. Even after adjusting for age and severity of illness, these changes remained highly significant.

In absolute terms, the changes are small, but given the annual frequency of acute kidney injury in hospitalized U.S. patients of about 12 percent - or 2.2 million people - these results would translate into more than 17,000 lives and $1.2 billion saved per year.

"Ultimately, we see this as confirmation that a fairly simple clinical decision support system can make a difference," said co-author Richard Ambrosino, M.D., Ph.D., medical director of clinical decision support and reporting at UPMC's eRecord. "More sophisticated systems are possible and should have an even greater impact."

Kellum, who also is associate director for acute illness at Pitt's Institute for Precision Medicine, plans to make improvements to the clinical decision support application in the future.

"Working with pharmacists to adjust patient medications and machine-learning experts to better predict which patients will be at greatest risk for adverse events, my team and I hope to make an even greater impact on patient outcomes," said Kellum. "Incorporating protein biomarkers and even genomics into the system could one day revolutionize patient care, not just for acute kidney injury, but for other illnesses."

In physics, the conundrum known as the "few-body problem" -- how three or more interacting particles behave -- has bedeviled scientists for centuries. Equations that describe the physics of few-body systems are usually unsolvable, and the methods used to find solutions are unstable. There aren't many equations that can probe the wide spectrum of possible few-particle dynamics. A new family of mathematical models for mixtures of quantum particles could help light the way.

"These mathematical models of interacting quantum particles are like lanterns, or islands of simplicity in a sea of complexity and possible dynamics," said Nathan Harshman, American University associate professor of physics and an expert in symmetry and quantum mechanics, who along with his peers created the new models. "They give us something to grip onto to explore the surrounding chaos."

Harshman and his peers describe the work in a paper published in Physical Review X. Theoretical physicists like Harshman work at the atomic level, aiming to solve the mysteries of the building blocks of energy, motion and matter. The new models exhibit a broad array of quantum particle interactions, from stable to chaotic, simple to complex, controllable to uncontrollable, and persistent to transitory. If these models could be constructed in a laboratory, then the control and coherence provided in special, solvable cases could be used as a tool in the next generation of quantum information processing devices, like quantum sensors and quantum computers.

In the last decade or so, physicists have been able to make one-dimensional optical traps for ultracold atoms in the lab. (Only at low temperatures do quantum dynamics emerge.) This led to a flurry of theoretical analyses, as researchers discovered they could make progress on understanding three-dimensional problems by thinking about solutions in terms of simpler, one-dimensional systems.

The researchers' key insight is working in abstract, higher dimensions. The models describe a few ultracold atoms trapped and bouncing back and forth in a one-dimensional trap. The equation describing four quantum particles in one dimension is mathematically equivalent to the equation describing one particle in four dimensions. Each position of this fictional single particle actually corresponds to a specific arrangement of the four real particles. The breakthrough is to use these mathematical results about symmetry to find new, solvable few-body systems, Harshman explained.

By moving particles to a higher dimensional space and choosing the right coordinates, some symmetries become more obvious and more useful. Then, these symmetries can be used to map a system from the higher dimension back into a simpler model in a lower (but abstract) dimension.
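In schematic terms (the notation below is mine, not drawn from the paper), the re-reading of N particles on a line as one particle in N dimensions looks like this:

```latex
% Sketch: N particles on a line, re-read as a single particle in N
% dimensions. Mass-scaled coordinates make the kinetic term isotropic.
\begin{align}
  H &= \sum_{i=1}^{N} \frac{p_i^{2}}{2 m_i} + V(x_1, \dots, x_N)
    && \text{($N$ particles in one dimension)} \\
  y_i &= \sqrt{\frac{m_i}{\mu}}\, x_i
    && \text{(mass-scaled coordinates, reference mass $\mu$)} \\
  H &= \frac{1}{2\mu} \sum_{i=1}^{N} q_i^{2} + \tilde{V}(y_1, \dots, y_N)
    && \text{(one particle in $N$ dimensions)}
\end{align}
```

Here $q_i$ is the momentum conjugate to $y_i$. With the right mass ratios, swapping two real particles acts as a reflection of the single fictitious particle across a hyperplane, and the groups generated by such reflections are exactly the reflection groups that Coxeter classified.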

Coxeter models, as Harshman calls these symmetric few-body systems, named for the mathematician H.S.M. Coxeter, can be defined for any number of particles. The particles can have different masses, unlike previous models, which could only describe particles of equal mass. In particular, when the particle masses and their ordering are chosen correctly, the system shows integrable (exactly solvable) dynamics, with as many conserved quantities, such as energy and momentum, as degrees of freedom.

So far, solvable few-body systems have rarely had experimental applications. What comes next is to implement the Coxeter models in a lab. Harshman and his colleagues are talking with experimentalists about how to construct systems with mixed-mass particles as close as possible to integrable systems. Because integrable systems allow for greater coherence, the systems they construct could help unravel some of the most complex concepts in physics, like quantum entanglement. Other proposals include using chains of solitons (stable clumps of atoms), because the masses of solitons can be controlled in an experiment.

CAPTION This artist’s impression shows how ULAS J1120+0641, a very distant quasar powered by a black hole with a mass 2 billion times that of the sun, may have looked. This quasar is the most distant yet found and is seen as it was just 770 million years after the Big Bang. CREDIT European Southern Observatory/M. Kornmesser

A University of Wyoming researcher played a key role in a study that suggests a newly developed supercomputer model can more accurately explain the diversity of quasar broad emission line regions, which are the clouds of hot, ionized gas that surround the supermassive black holes feeding in the centers of galaxies.

“We are trying to get at more detailed questions about spectral broad-line regions that help us diagnose the black hole mass,” says Michael Brotherton, a UW professor in the Department of Physics and Astronomy. “People don’t know where these broad emission line regions come from or the nature of this gas.” 

The new study, titled “Tidally Disrupted Dusty Clumps as the Origin of Broad Emission Lines in Active Galactic Nuclei,” was published earlier this month in Nature Astronomy, a monthly, online-only, multidisciplinary journal that publishes the most significant research, review and comment at the cutting edge of astronomy, astrophysics and planetary science. 

Jian-Min Wang, from the Chinese Academy of Sciences, was the paper’s lead author. Other contributing authors were from the Key Laboratory for Particle Astrophysics at the Institute of High Energy Physics, the National Astronomical Observatories of China and the School of Astronomy and Space Science, all at the Chinese Academy of Sciences; and the School of Astronomy and Space Science at Nanjing University in Nanjing, China.

Brotherton says most current computer models look at symmetrical lines in the spectral broad emission line region in active galactic nuclei (AGN), whereas the new model he helped develop looks at real lines, which are often asymmetrical.

“We see and try to reach a deeper understanding of the broad emission line region, where it comes from, its structure and how it can lead to a better understanding of quasars themselves,” he says. “Our model tries to explain the full range of quasars,” which Brotherton describes with humor as “the fire-breathing, bat-winged, vampire rainbow zebra unicorns of astronomical phenomena.”

The black hole’s gravity accelerates the surrounding gas from these quasars to extremely high velocities, Brotherton explains. The gas heats up and, in turn, outshines the entire surrounding galaxy. 

“People think, ‘It’s a black hole. Why is it so bright?’ A black hole is still dark,” he says. “The discs reach such high temperatures that they put out radiation across the electromagnetic spectrum, which includes gamma rays, X-rays, UV, infrared and radio waves. The accreting gas the black hole is feeding on is the fuel that turns on the quasar.”

The gases, like wispy fires, put out colors of light, described by Brotherton as similar to “giant neon signs in space.” The gases move at thousands of kilometers per second, with the blue-shifted gases moving toward us and the red-shifted gases moving away from us. This effect broadens the lines but doesn’t actually make the gases red or blue, he says.
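To illustrate the scale of this broadening (the line and velocities below are generic examples, not figures from the study), a non-relativistic Doppler shift at a few thousand kilometres per second spreads an optical emission line by roughly a hundred angstroms:

```python
# Illustrative sketch of Doppler line broadening; values are generic,
# not taken from the Nature Astronomy study.
C_KM_S = 299_792.458  # speed of light in km/s

def doppler_shifted_wavelength(rest_wavelength, velocity_km_s):
    """Non-relativistic Doppler shift; positive velocity = receding (redshift)."""
    return rest_wavelength * (1 + velocity_km_s / C_KM_S)

H_BETA = 4861.35  # rest wavelength of the hydrogen H-beta line, in angstroms

red = doppler_shifted_wavelength(H_BETA, +3000.0)   # gas moving away from us
blue = doppler_shifted_wavelength(H_BETA, -3000.0)  # gas moving toward us
width = red - blue  # rough full width of the broadened line, ~97 angstroms
```

The same rest-frame color thus arrives smeared across a range of wavelengths, which is the broadening Brotherton describes; the gas itself is neither reddened nor blued.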

At the broad emission line region, those separate colors become a spiral of colors, a measure of the velocity of surrounding dust clouds.

The model includes what Brotherton terms “a swarming donut of dusty gas.” Dusty clouds or clumps are contained in this donut that surrounds the quasar discs.

“What we propose happens is these dusty clumps are moving. Some bang into each other and merge, and change velocity,” he says. “Maybe they move into the quasar, where the black hole lives. Some of the clumps spin in from the broad-line region. Some get kicked out.”

The research was supported by the National Key Program for Science and Technology Research and Development, and the Key Research Program of Frontier Sciences at the Chinese Academy of Sciences.

“It is an important first step forward in looking at these emission lines that inform the black hole mass,” Brotherton says.

A new early warning system to alert farmers to the risk of disease among their young cattle stock is being developed by experts at The University of Nottingham.

The innovation, dubbed Y-Ware, could save the UK farming industry millions of pounds while improving the health and welfare of animals and reducing the use of antimicrobials to treat these diseases.

The £1.13million project is a partnership with farming digitalisation specialists PrognostiX and BT, and is supported by a grant from Innovate UK, the UK Government-funded innovation agency.

Dr Jasmeet Kaler, Associate Professor of Epidemiology and Farm Animal Health currently leading Ruminant Population research in the University’s School of Veterinary Medicine and Science, is the academic lead on the project.

She said: “Improving youngstock health on cattle farms is a key priority for the cattle industry, and has also been identified by the industry task force RUMA (Responsible Use of Medicines in Agriculture Alliance) as one of the key targets, released last week, for antibiotic reduction on cattle farms, especially beef. Innovative precision health technologies offer a great solution in this direction. Whilst there has been an increase in the availability of technologies for livestock over the past decade, none target youngstock health, and overall very few precision livestock technologies have been validated in the field and combine various sources of data with multiple transmission protocols to develop algorithms for livestock health and welfare. Our group does impactful, cutting-edge research into the health and welfare of UK cattle and sheep, with a special focus on endemic disease in populations.

“In this project, we are leading the data analytics, working alongside our partners. We will utilise our domain knowledge of disease biology and epidemiology together with various machine learning approaches on the data gathered via sensors. Our overall aim is to develop an innovative technology that combines different formats of data, uses the Internet of Things and advanced analytics for early detection of disease in young stock, and thus allows targeted use of antibiotics.”

Cattle farmers are facing major challenges in remaining profitable while maintaining the high standards of animal welfare demanded by retailers and consumers.

Every year, of the 2.5 million calves that are born, eight per cent are born dead or die within 24 hours, and a further 15 per cent die in rearing from diarrhoea and pneumonia, costing the UK cattle industry £80 million. A pneumonia outbreak costs £81 per calf, and a diarrhoea outbreak £57 per calf.

Bolus sensors, which sit in an animal’s gut and monitor body temperature or pH, are in widespread use in cattle – but are currently only available for adult cows. In addition, many technologies on farms don’t talk to each other, which limits the predictive value of their data.

The Y-Ware project aims to develop a bolus sensor that could be used in calves as young as 14 weeks, as well as a dashboard that will use machine learning techniques to give farmers an early warning system for health. The dashboard will combine bolus sensor readings with comprehensive information about the animal collected from a range of additional sources, including building temperature, humidity, farm and vet records, and weight.

All the information would be used to produce baseline data and a specific ‘signature’ for the animal. Unusual changes to this signature, for example an unexpected rise in body temperature, could allow farmers to spot the signs of disease, treat early and quarantine the animal to prevent wider outbreaks among the herd.
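The article does not specify the algorithm behind the ‘signature’, but a minimal sketch of one common approach, flagging readings that stray several standard deviations from an animal’s own historical baseline, might look like this (all names, values and thresholds here are hypothetical):

```python
# Hypothetical sketch of per-animal baseline anomaly detection;
# the actual Y-Ware algorithms are not described in the article.
from statistics import mean, stdev

def build_baseline(temps):
    """Summarise an animal's historical readings as (mean, std dev)."""
    return mean(temps), stdev(temps)

def is_anomalous(reading, baseline, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from baseline."""
    mu, sigma = baseline
    return abs(reading - mu) > threshold * sigma

history = [38.6, 38.5, 38.7, 38.6, 38.8, 38.5, 38.7]  # body temps in Celsius
baseline = build_baseline(history)

is_anomalous(38.7, baseline)  # an ordinary reading, within the signature
is_anomalous(40.5, baseline)  # a fever-like spike, flagged for the farmer
```

A production system would of course fuse many more inputs (activity, weight, building temperature and humidity), which is where the machine learning the project describes comes in.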

The development will allow farmers to target antibiotic use more effectively, tackling the overuse of the drugs that is contributing to antibiotic resistance both in animals and in humans, who are exposed to increasing levels through the food chain.

Y-Ware will develop an Internet of Things (IoT)-based data collection solution including:

  • Specific real-time 24/7 temperature sensor with combined tamper-proof animal ID verification
  • Easy collection of data from a range of incompatible sensors (both wearable and non-wearable) in young stock via wireless technology
  • A fully automated weighing platform to collect data on cattle weight without the need for human intervention
  • A communications hub to collect and process the remote data
  • A web dashboard offering access to customisable reports that will provide farmers and vets with essential information on individuals and groups of animals. This will provide an early warning system for disease, a ‘welfare score’ and detailed data on antibiotic usage.

The consortium is made up of specialists in engineering technology, software development, vet epidemiology, cattle health and data science, cloud supercomputing and data analytics.

Alan Beynon, who is a Director of PrognostiX, Director of St David’s Poultry Team and Managing Director of Molecare Farm Vets, said: “This is a very exciting time for veterinarians in practice in all sectors of agriculture, as the pressure to reduce antimicrobials is current and pressing. The use of real-time data to make clinical decisions is an integral part of where the future lies, alongside better diagnostic facilities. We are delighted to be working alongside our dynamic partners, Nottingham University and British Telecom.”

Martin Tufft, IoT Director at BT, said: “We’re providing expertise around data science and analytics, exploring the data generated from multiple sensors with a view to developing unique algorithms and machine learning techniques to support the project. The application of advanced data analytics is key to the success of IoT solutions, and we look forward to helping this project provide valuable information for the farming industry.”

