The same class of algorithms used by Google and Netflix can also tell us whether distant planetary systems are stable

Machine learning is a powerful tool used for a variety of tasks in modern life, from detecting fraud and filtering spam at Google to making movie recommendations on Netflix. 

Now a team of researchers from the University of Toronto Scarborough has developed a novel way of using it to determine whether planetary systems are stable. 

"Machine learning offers a powerful way to tackle a problem in astrophysics, and that's predicting whether planetary systems are stable," says Dan Tamayo, lead author of the research and a postdoctoral fellow in the Centre for Planetary Science at U of T Scarborough.

Machine learning is a form of artificial intelligence that gives computers the ability to learn without being explicitly programmed for each specific task. The benefit is that models improve as they are exposed to new data, and they are also very efficient. 

The method developed by Tamayo and his team is 1,000 times faster than traditional methods in predicting stability.

"In the past we've been hamstrung in trying to figure out whether planetary systems are stable by methods that couldn't handle the amount of data we were throwing at them," he says. 

It's important to know whether planetary systems are stable or not because it can tell us a great deal about how these systems formed. It can also offer valuable new information about exoplanets that is not offered by current methods of observation. 

There are several current methods of detecting exoplanets that provide information such as a planet's size and orbital period, but they may not reveal its mass or how elliptical its orbit is, both of which are factors that affect stability, notes Tamayo. 
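
As a rough illustration of the kind of classifier involved, the sketch below labels hypothetical systems as stable or unstable from two made-up orbital features (planet spacing and eccentricity) and classifies new systems by a nearest-neighbour vote. The features, thresholds and method here are illustrative assumptions, not the team's actual model.

```python
import math
import random

random.seed(1)

# Hypothetical training set: each system is described by two illustrative
# features -- planet spacing (in mutual Hill radii) and orbital
# eccentricity -- with a toy stability label.  Real models use many more
# dynamical features; these are assumptions for illustration only.
def make_system():
    spacing = random.uniform(2.0, 20.0)
    ecc = random.uniform(0.0, 0.3)
    stable = spacing > 10.0 and ecc < 0.2   # toy ground-truth rule
    return (spacing, ecc), stable

train = [make_system() for _ in range(500)]

def predict(features, k=5):
    """Classify a system by majority vote of its k nearest neighbours."""
    # Scale eccentricity so both features span a comparable range.
    dist = lambda a, b: math.hypot(a[0] - b[0], (a[1] - b[1]) * 50)
    nearest = sorted(train, key=lambda ex: dist(ex[0], features))[:k]
    votes = sum(1 for _, label in nearest if label)
    return votes > k // 2

# A widely spaced, low-eccentricity system should classify as stable.
print(predict((15.0, 0.05)))   # True
print(predict((3.0, 0.25)))    # False
```

Once trained, each prediction is just a cheap lookup, which is the source of the speed-up over integrating the orbits directly.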

The method developed by Tamayo and his team is the result of a series of workshops at U of T Scarborough covering how machine learning could help tackle specific scientific problems. The research is published online in the Astrophysical Journal Letters. 

"What's encouraging is that our findings tell us that investing weeks of computation to train machine learning models is worth it because not only is this tool accurate, it also works much faster," he adds. 

It may also come in handy when analysing data from NASA's Transiting Exoplanet Survey Satellite (TESS) set to launch next year. The two-year mission will focus on discovering new exoplanets by focusing on the brightest stars near our solar system. 

"It could be a useful tool because predicting stability would allow us to learn more about the system, from the upper limits of mass to the eccentricities of these planets," says Tamayo. 

"It could be a very useful tool in better understanding those systems." 

CAPTION Artist's depiction of a collision between two planetary bodies. CREDIT NASA/JPL-Caltech

Researchers from North Carolina State University and the U.S. Army Research Office have developed a way to integrate novel functional materials onto a computer chip, allowing the creation of new smart devices and systems.

The novel functional materials are oxides, including several types of materials that, until now, could not be integrated onto silicon chips: multiferroic materials, which have both ferroelectric and ferromagnetic properties; topological insulators, which act as insulators in bulk but have conductive properties on their surface; and novel ferroelectric materials. These materials are thought to hold promise for applications including sensors, non-volatile computer memory and microelectromechanical systems, which are better known as MEMS.

“These novel oxides are normally grown on materials that are not compatible with computing devices,” says Jay Narayan, the John C. Fan Distinguished Chair Professor of Materials Science and Engineering at NC State and co-author of a paper describing the work. “We are now able to integrate these materials onto a silicon chip, allowing us to incorporate their functions into electronic devices.”

The approach developed by the researchers allows them to integrate the materials onto two platforms, both of which are compatible with silicon: a titanium nitride platform, for use with nitride-based electronics; and yttria-stabilized zirconia, for use with oxide-based electronics.

Specifically, the researchers developed a suite of thin films that serve as a buffer, connecting the silicon chip to the relevant novel materials. The exact combination of thin films varies, depending on which novel materials are being used.

For example, if using multiferroic materials, researchers use a combination of four different thin films: titanium nitride, magnesium oxide, strontium oxide and lanthanum strontium manganese oxide. But for topological insulators, they would use a combination of only two thin films: magnesium oxide and titanium nitride.
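
The mapping from material class to buffer stack can be pictured as a simple lookup, as in the sketch below. The stack ordering and the idea of encoding it this way are illustrative assumptions; the paper gives the exact epitaxial sequences.

```python
# Buffer-layer stacks described in the release, keyed by the class of
# novel oxide being integrated onto silicon.  Layer order as listed in
# the text; the actual growth sequence is specified in the paper.
BUFFER_STACKS = {
    "multiferroic": [
        "titanium nitride",
        "magnesium oxide",
        "strontium oxide",
        "lanthanum strontium manganese oxide",
    ],
    "topological insulator": [
        "magnesium oxide",
        "titanium nitride",
    ],
}

def buffer_for(material_class):
    """Return the thin-film buffer sequence for a given oxide class."""
    return BUFFER_STACKS[material_class]

print(len(buffer_for("multiferroic")))            # 4
print(len(buffer_for("topological insulator")))   # 2
```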

These thin film buffers align with the planes of the crystalline structure in the novel oxide materials, as well as with the planes of the underlying substrate – effectively serving as a communicating layer between the materials.

This approach, called thin film epitaxy, is based on the concept of domain-matching epitaxy, and was first proposed by Narayan in a 2003 paper.

“Integrating these novel materials onto silicon chips makes many things possible,” Narayan says. “For example, this allows us to sense or collect data; to manipulate that data; and to calculate a response – all on one compact chip. This makes for faster, more efficient, lighter devices.”

Another possible application, Narayan says, is the creation of LEDs on silicon chips, to make “smart lights.” Currently, LEDs are made using sapphire substrates, which aren’t directly compatible with computing devices.

“We’ve already patented this integration technology, and are currently looking for industry partners to license it,” Narayan says.

The paper, “Multifunctional epitaxial systems on silicon substrates,” is published online in the journal Applied Physics Reviews. Lead author of the paper is Srinivasa Singamaneni, a postdoctoral researcher at NC State who is also affiliated with the Army Research Office. The paper was co-authored by John Prater of the Army Research Office, who is also an adjunct professor of materials science and engineering at NC State. The work was supported by the Army Research Office under grant number W911NF-04-D-0003, and was done with technical support from NC State’s Analytical Instrumentation Facility.

CAPTION The 3-D model of mode-2 internal waves.

Garbage, nutrients and tiny animals are pushed around, suspended in the world’s oceans by waves invisible to the naked eye according to a new 3-D model developed by mathematicians at the University of Waterloo.

David Deepwell, a graduate student, and Professor Marek Stastna in Waterloo’s Faculty of Mathematics have created a 3-D simulation that showcases how materials such as phytoplankton, contaminants, and nutrients move within aquatic ecosystems via underwater bulges called mode-2 internal waves.

The simulation can help researchers understand how internal waves can carry materials over long distances. Their model was presented in the American Institute of Physics’ journal Physics of Fluids last week. 

In the simulation, fluids of different densities are layered like the layers of a cake, creating an environment similar to that found in large aquatic bodies such as oceans and lakes. A middle layer, known as a pycnocline, is created where the density changes rapidly between the lighter fluid above and the heavier fluid below, and it is in this layer that materials tend to be caught. 
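
The layered setup can be sketched as an idealized density profile: nearly uniform top and bottom layers joined by a smooth, rapid transition (the pycnocline). All parameter values below are illustrative assumptions, not taken from the paper.

```python
import math

# Idealized stratification: light fluid over heavy fluid, joined by a
# smooth tanh transition whose centre is the pycnocline.
RHO_TOP, RHO_BOTTOM = 1000.0, 1020.0   # kg/m^3
PYCNOCLINE_DEPTH = 0.15                # m, centre of the transition
PYCNOCLINE_THICKNESS = 0.02            # m, sharpness of the transition

def density(z):
    """Density at depth z (metres, positive downward): steps smoothly
    from the top-layer value to the bottom-layer value."""
    mid = 0.5 * (RHO_TOP + RHO_BOTTOM)
    half = 0.5 * (RHO_BOTTOM - RHO_TOP)
    return mid + half * math.tanh((z - PYCNOCLINE_DEPTH) / PYCNOCLINE_THICKNESS)

print(round(density(0.00), 1))   # 1000.0 (top layer)
print(round(density(0.15), 1))   # 1010.0 (middle of pycnocline)
print(round(density(0.30), 1))   # 1020.0 (bottom layer)
```

Mixed fluid released into such a profile settles at the pycnocline because it is heavier than the top layer and lighter than the bottom one, which is exactly where the dyed material in the experiment collects.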

"When the fluid behind the gate is mixed and then the gate is removed, the mixed fluid collapses into the stratification because it is both heavier than the top layer and lighter than the bottom one," explained Deepwell. "Adding dye to the mixed fluid while the gate is in place simulates the material we want the mode-2 waves – the bulges in the pycnocline formed once the gate is taken away – to transport. We can then measure the size of the wave, how much dye remains trapped within it, and how well the wave carries its captured material."

Deepwell and Stastna found that the larger the bulge within the pycnocline, the larger the amount of material carried by the mode-2 wave.

While they have discovered an optimal scenario in which the mode-2 internal wave survives and then transports material for as long a distance as possible, the internal waves can also break down due to small regions of instability, called lee instabilities, that form behind the wave. When the mode-2 wave breaks down, material is lost behind the wave.  Ongoing experimental work and simulations are exploring how this type of wave interacts with underwater topography like sea mounts.

DSCOVR is the linchpin of next-generation space weather forecasts

NOAA’s first space weather satellite, DSCOVR, has completed instrument validation and will go operational on July 27, when it will take over the role of monitoring potentially damaging space weather storms as they approach Earth.

DSCOVR, which stands for Deep Space Climate Observatory, provides improved measurements and higher quality data than currently available, giving forecasters better information with which to issue critical space weather warnings and alerts. 

“Even though the sun is 93 million miles away, activity on the surface of the sun can have significant impacts here on Earth,” said Tom Berger, director of NOAA’s Space Weather Prediction Center. “Severe space weather can disrupt power grids, marine and aviation navigation, satellite operations, GPS systems and communication technologies. DSCOVR will allow us to deliver more timely, accurate, and actionable geomagnetic storm warnings, giving people time to prevent damage and disruption of important technological systems.” 

DSCOVR’s primary space weather sensors are the Faraday Cup plasma sensor, which measures the speed, density and temperature of the solar wind, and a magnetometer, which measures the strength and direction of the solar wind magnetic field. Together, the instruments provide forecasters with the necessary information to issue geomagnetic storm warnings. 

Data from DSCOVR’s instrumentation will provide better information to forecasters and allow existing and future forecast models to run more reliably. The improvements also will open new opportunities for researchers to better understand coronal mass ejections along with high-speed solar wind and shocks, and to find ways to improve space weather forecasting.

DSCOVR data will be used in a new forecast model – the Geospace Model – due to come online this year. The Geospace Model will enable forecasters to issue regional, short-term space weather forecasts for the first time, including predictions on the timing and strength of a solar storm that will impact Earth. Currently, the Space Weather Prediction Center issues a single forecast for the entire planet. 

DSCOVR replaces NASA’s aging research satellite, the Advanced Composition Explorer, also known as ACE. DSCOVR was launched on February 11, 2015, and reached final orbit on June 8. It is located one million miles from Earth, where the gravitational influences of the sun and the Earth are in equilibrium. DSCOVR serves as a distant early-warning sentinel, like a tsunami buoy in space, to alert NOAA of incoming eruptions from the sun. 

Data from DSCOVR, which will be available to the public in real-time online, will allow forecasters to provide space weather warnings and alerts up to an hour before a surge of particles and magnetic field generated by solar storms hit Earth. The Space Weather Prediction Center (SWPC) provides forecast products and data to infrastructure operators and the general public through its email alert service to more than 47,000 subscribers. Owners or operators of potentially vulnerable technologies can then take steps to protect their equipment or services. 
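
The "up to an hour" figure follows from simple arithmetic: the time for the solar wind to cross the one million miles between DSCOVR and Earth. The wind speeds below are typical illustrative values, not DSCOVR measurements.

```python
# Back-of-envelope warning lead time: distance from DSCOVR's orbit to
# Earth divided by the solar wind speed.
MILES_TO_KM = 1.609344
distance_km = 1_000_000 * MILES_TO_KM

for speed_km_s in (400.0, 800.0):   # quiet solar wind vs a fast storm
    lead_minutes = distance_km / speed_km_s / 60.0
    print(f"{speed_km_s:.0f} km/s -> {lead_minutes:.0f} min warning")
```

At a typical 400 km/s the wind takes about 67 minutes to arrive, while a fast storm at 800 km/s halves the warning window, consistent with the "up to an hour" figure quoted above.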

SWPC, the nation’s official source for space weather forecasts, watches, warnings, and alerts, operates 24 hours a day, seven days a week from Boulder, Colo. SWPC is one of nine centers under NOAA’s National Centers for Environmental Prediction.

We are witnessing an increased interest in intelligent systems; the quest to build machines that help, replace, or act like humans has been around for some time. 

However, only now has the technology developed far enough that we are already surrounded by products with real elements of "intelligence," and will be even more so in the near future. How is this done, what can we achieve, and where are the next steps? The answers to these questions can be found in this unique set of two volumes, five parts and more than 25 chapters blended into a single Handbook -- something any student, graduate, researcher and expert must have. 

Computational Intelligence is a set of computational techniques, methodologies, algorithms and systems that borrow from nature (the human brain, individual self-development, population-level evolution, human reasoning, etc.) to solve complex problems that are not fully addressed by traditional first-principles approaches, statistics, and the like. Its main branches are fuzzy sets and systems, artificial neural networks and evolutionary computation.
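
One branch mentioned above, fuzzy sets, can be illustrated in a few lines: membership in a set such as "warm" is a matter of degree rather than a crisp yes/no. The temperature breakpoints below are illustrative assumptions.

```python
# A triangular fuzzy membership function for the set "warm":
# 0 below 10 C, rising to 1 at 22 C, falling back to 0 at 34 C.
def warm_membership(temp_c):
    """Degree (0..1) to which temp_c belongs to the fuzzy set 'warm'."""
    if temp_c <= 10 or temp_c >= 34:
        return 0.0
    if temp_c <= 22:
        return (temp_c - 10) / 12
    return (34 - temp_c) / 12

print(warm_membership(22))   # 1.0
print(warm_membership(16))   # 0.5
print(warm_membership(5))    # 0.0
```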

"This is a one stop shop written by experts in a very accessible way yet presenting the latest cutting edge results," said Professor Plamen Angelov, who is the editor of the Handbook. 
