Chinese University of Hong Kong-built machine learning accelerates the search for promising Moon sites for energy and mineral resources

A Moon-scanning method that can automatically classify important lunar features from telescope images could significantly improve the efficiency of selecting sites for exploration.

There is more than meets the eye to picking a landing or exploration site on the Moon. The visible area of the lunar surface is larger than Russia and is pockmarked by thousands of craters and crisscrossed by canyon-like rilles. The choice of future landing and exploration sites may come down to the most promising prospective locations for construction, minerals, or potential energy resources. However, scanning such a large area by eye for features that may be only a few hundred meters across is laborious and often inaccurate, making it difficult to pick optimal areas for exploration.

Siyuan Chen, Xin Gao, and Shuyu Sun, along with colleagues from The Chinese University of Hong Kong, have now applied machine learning and artificial intelligence (AI) to automate the identification of prospective lunar landing and exploration areas.

“We are looking for lunar features like craters and rilles, which are thought to be hotspots for energy resources like uranium and helium-3 — a promising resource for nuclear fusion,” says Chen. “Both have been detected in Moon craters and could be useful resources for replenishing spacecraft fuel.”

Machine learning is a very effective technique for training an AI model to look for certain features on its own. The first problem faced by Chen and his colleagues was that there was no labeled dataset for rilles that could be used to train their model.

“We overcame this challenge by constructing our own training dataset with annotations for both craters and rilles,” says Chen. “To do this, we used an approach called transfer learning to pretrain our rille model on a surface crack dataset with some fine-tuning using actual rille masks. Previous approaches require manual annotation for at least part of the input images — our approach does not require human intervention and so allowed us to construct a large, high-quality dataset.”
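The pretrain-then-fine-tune idea can be sketched in miniature. The toy below is not the team's pipeline (they fine-tune deep segmentation networks on images); it pretrains a simple logistic-regression "crack" classifier on plentiful synthetic data, fine-tunes it on a small "rille" set, and compares against training from scratch. All data, names, and hyperparameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_logistic(X, y, w=None, lr=0.5, steps=200):
    """Plain gradient-descent logistic regression; `w` seeds the weights."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def loss(X, y, w):
    """Mean cross-entropy of the logistic model on (X, y)."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

# "Crack" task: plenty of labeled data with a similar decision structure.
true_w = np.array([2.0, -1.0, 0.5])
X_crack = rng.normal(size=(500, 3))
y_crack = (X_crack @ true_w > 0).astype(float)

# "Rille" task: only a handful of labels, same underlying structure.
X_rille = rng.normal(size=(20, 3))
y_rille = (X_rille @ true_w > 0).astype(float)

w_pre = train_logistic(X_crack, y_crack)                           # pretrain
w_ft = train_logistic(X_rille, y_rille, w=w_pre.copy(), steps=20)  # fine-tune
w_scratch = train_logistic(X_rille, y_rille, steps=20)             # baseline

print("fine-tuned loss:", loss(X_rille, y_rille, w_ft))
print("from-scratch loss:", loss(X_rille, y_rille, w_scratch))
```

Because the pretrained weights already encode the shared structure, a few fine-tuning steps on the small dataset beat the same budget spent training from scratch — the same leverage the researchers get from the surface-crack dataset.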

The next challenge was developing a computational approach that could be used to identify both craters and rilles at the same time, something that had not been done before.

“This is a pixel-to-pixel problem for which we need to accurately mask the craters and rilles in a lunar image,” says Chen. “We solved this problem by constructing a deep learning framework called high-resolution-moon-net, which has two independent networks that share the same network architecture to identify craters and rilles simultaneously.”

The team’s approach achieved precision as high as 83.7 percent, higher than existing state-of-the-art methods for crater detection.
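Precision here is a pixel-level measure: of all the pixels the model marks as crater or rille, the fraction that truly are. A minimal sketch with invented 4×4 masks (not the team's data):

```python
import numpy as np

# Toy 4x4 ground-truth and predicted binary masks (1 = crater/rille pixel).
truth = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 1]])
pred  = np.array([[1, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 1]])

tp = np.sum((pred == 1) & (truth == 1))   # correctly masked pixels
fp = np.sum((pred == 1) & (truth == 0))   # spurious pixels
precision = tp / (tp + fp)
print(precision)  # 4 true positives out of 5 predicted pixels -> 0.8
```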

Japanese-built machine learning decodes noisy data to predict cell growth

Japanese scientists from The University of Tokyo Institute of Industrial Science have designed a machine-learning algorithm to predict the size of an individual cell as it grows and divides. By using an artificial neural network that does not impose the assumptions commonly employed in biology, the computer was able to make more complex and accurate forecasts than previously possible. This work may help advance the field of quantitative biology as well as improve the industrial production of medications or fermented products.

As in all of the natural sciences, biology has developed mathematical models to help fit data and make predictions. However, many of these equations rely on simplifying assumptions that do not always reflect the actual underlying biological processes because of the inherent complexities of living systems. Now, researchers at The University of Tokyo Institute of Industrial Science have implemented a machine learning algorithm that can use the measured size of single cells over time to predict their future size. Because the computer automatically recognizes patterns in the data, it is not constrained like conventional methods.

"In biology, simple models are often used based on their capacity to reproduce the measured data," first author Atsushi Kamimura says. "However, the models may fail to capture what is really going on because of human preconceptions."

The data for this latest study were collected from either an Escherichia coli bacterium or a Schizosaccharomyces pombe yeast cell held in a microfluidic channel at various temperatures. The plot of size over time looked like a "sawtooth" as exponential growth was interrupted by division events. Human biologists usually use a "sizer" model, based on the absolute size of the cell, or an "adder" model, based on the increase in size since birth, to predict when divisions will occur. The computer algorithm found support for the "adder" principle and a complex web of biochemical reactions and signaling.
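The sawtooth and the two division rules are easy to simulate. The sketch below uses invented growth-rate and threshold values purely to illustrate how a "sizer" rule (divide at an absolute size) and an "adder" rule (divide after a fixed size increment) both generate sawtooth traces; it is not the study's model.

```python
import numpy as np

def simulate(rule, s0=1.0, rate=0.03, steps=400, threshold=1.0):
    """Exponential growth interrupted by division under a toy rule.

    sizer: divide when absolute size exceeds 2 * threshold
    adder: divide when size added since birth exceeds threshold
    """
    size, birth_size, trace = s0, s0, []
    for _ in range(steps):
        size *= np.exp(rate)                      # exponential growth
        if rule == "sizer":
            divide = size > 2 * threshold
        else:
            divide = size - birth_size > threshold
        if divide:
            size /= 2                             # symmetric division
            birth_size = size
        trace.append(size)
    return np.array(trace)

sizer_trace = simulate("sizer")
adder_trace = simulate("adder")
# Each downward jump in the trace is a division event.
print("sizer divisions:", int(np.sum(np.diff(sizer_trace) < 0)))
print("adder divisions:", int(np.sum(np.diff(adder_trace) < 0)))
```

The point of the study is that a neural network, fed only noisy sawtooth traces like these, can recover which rule (if either) the cell actually follows without a human choosing the model in advance.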

"Our deep-learning neural network can effectively separate the history-dependent deterministic factors from the noise in given data," senior author Tetsuya Kobayashi says.

This method can be extended to many other aspects of biology besides predicting cell size. In the future, life science may be driven more by objective artificial intelligence than human models. This may lead to more efficient control of microorganisms we use to ferment products and produce drugs.

Harvard-led physicists take big step in race to quantum supercomputing

The team develops a simulator with 256 qubits, the largest of its kind ever created

A team of physicists from the Harvard-MIT Center for Ultracold Atoms and other universities has developed a special type of quantum supercomputer known as a programmable quantum simulator capable of operating with 256 quantum bits, or "qubits."

The system marks a major step toward building large-scale quantum machines that could be used to shed light on a host of complex quantum processes and eventually help bring about real-world breakthroughs in material science, communication technologies, finance, and many other fields, overcoming research hurdles that are beyond the capabilities of even the fastest supercomputers today. Qubits are the fundamental building blocks on which quantum supercomputers run and the source of their massive processing power.

"This moves the field into a new domain where no one has ever been to thus far," said Mikhail Lukin, the George Vasmer Leverett Professor of Physics, co-director of the Harvard Quantum Initiative, and one of the senior authors of the study published today in the journal Nature. "We are entering a completely new part of the quantum world."

According to Sepehr Ebadi, a physics student in the Graduate School of Arts and Sciences and the study's lead author, it is the combination of the system's unprecedented size and programmability that puts it at the cutting edge of the race for a quantum supercomputer, which harnesses the mysterious properties of matter at extremely small scales to greatly advance processing power. Under the right circumstances, the increase in qubits means the system can store and process exponentially more information than the classical bits on which standard computers run.

"The number of quantum states that are possible with only 256 qubits exceeds the number of atoms in the solar system," Ebadi said, explaining the system's vast size.
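The arithmetic behind that comparison is straightforward: 256 qubits span 2^256 ≈ 1.2 × 10^77 basis states, while common order-of-magnitude estimates put the number of atoms in the solar system (dominated by the Sun) near 10^57.

```python
# 256 qubits span a Hilbert space with 2**256 basis states.
n_states = 2 ** 256
print(f"{n_states:.3e}")          # ~1.158e+77

# A common order-of-magnitude estimate for atoms in the solar system,
# dominated by the Sun, is ~10**57 -- twenty orders of magnitude smaller.
atoms_estimate = 10 ** 57
print(n_states > atoms_estimate)  # True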

Already, the simulator has allowed researchers to observe several exotic quantum states of matter that had never before been realized experimentally, and to perform a quantum phase transition study so precise that it could serve as a textbook example of how magnetism works at the quantum level.

These experiments provide powerful insights into the quantum physics underlying material properties and can help show scientists how to design new materials with exotic properties.

The project uses a significantly upgraded version of a platform the researchers developed in 2017, which was capable of reaching a size of 51 qubits. That older system allowed the researchers to capture ultra-cold rubidium atoms and arrange them in a specific order using a one-dimensional array of individually focused laser beams called optical tweezers.

This new system allows the atoms to be assembled in two-dimensional arrays of optical tweezers. This increases the achievable system size from 51 to 256 qubits. Using the tweezers, researchers can arrange the atoms in defect-free patterns and create programmable shapes like square, honeycomb, or triangular lattices to engineer different interactions between the qubits.
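Generating the target site coordinates for such lattices is a small geometry exercise. The sketch below builds 256-site square and triangular arrangements; the spacings are arbitrary units, not the experiment's parameters.

```python
import numpy as np

def square_lattice(n, spacing=1.0):
    """n x n grid of tweezer target sites."""
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.stack([i.ravel() * spacing, j.ravel() * spacing], axis=1)

def triangular_lattice(n, spacing=1.0):
    """n x n sites with every other row offset by half a spacing,
    so nearest neighbors form equilateral triangles."""
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    x = j.ravel() * spacing + (i.ravel() % 2) * spacing / 2
    y = i.ravel() * spacing * np.sqrt(3) / 2
    return np.stack([x, y], axis=1)

sq = square_lattice(16)        # 16 x 16 = 256 sites, as in the new system
tri = triangular_lattice(16)
print(len(sq), len(tri))       # 256 256
```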

"The workhorse of this new platform is a device called the spatial light modulator, which is used to shape an optical wavefront to produce hundreds of individually focused optical tweezer beams," said Ebadi. "These devices are essentially the same as what is used inside a computer projector to display images on a screen, but we have adapted them to be a critical component of our quantum simulator."

The initial loading of the atoms into the optical tweezers is random, and the researchers must move the atoms around to arrange them into their target geometries. The researchers use a second set of moving optical tweezers to drag the atoms to their desired locations, eliminating the initial randomness. Lasers give the researchers complete control over the positioning of the atomic qubits and their coherent quantum manipulation.
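Conceptually, that rearrangement step is an assignment problem: each randomly loaded atom must be mapped to a target site. A one-dimensional toy makes the idea concrete (the real system works in two dimensions with more sophisticated path planning): matching loaded sites to a contiguous block in left-to-right order guarantees the move paths never cross.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D toy: 20 tweezer sites, each loaded with ~50% probability.
n_sites = 20
loaded = np.flatnonzero(rng.random(n_sites) < 0.5)  # occupied site indices

# Target: a defect-free block of contiguous sites on the left.
targets = np.arange(len(loaded))

# Order-preserving matching: the k-th loaded atom (left to right) moves
# to the k-th target site, so 1-D move paths never cross.
moves = list(zip(loaded, targets))
print(moves)
```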

Other senior authors of the study include Harvard Professors Subir Sachdev and Markus Greiner, who worked on the project along with Massachusetts Institute of Technology Professor Vladan Vuletić, and scientists from Stanford, the University of California Berkeley, the University of Innsbruck in Austria, the Austrian Academy of Sciences, and QuEra Computing Inc. in Boston.

"Our work is part of a really intense, high-visibility global race to build bigger and better quantum computers," said Tout Wang, a research associate in physics at Harvard and one of the paper's authors. "The overall effort [beyond our own] has top academic research institutions involved and major private-sector investment from Google, IBM, Amazon, and many others."

The researchers are currently working to improve the system through finer laser control over the qubits and greater programmability. They are also actively exploring how the system can be used for new applications, ranging from probing exotic forms of quantum matter to solving challenging real-world problems that can be naturally encoded on the qubits.

"This work enables a vast number of new scientific directions," Ebadi said. "We are nowhere near the limits of what can be done with these systems."