Japanese-built machine learning decodes noisy data to predict cell growth

Japanese scientists from The University of Tokyo Institute of Industrial Science have designed a machine-learning algorithm to predict the size of an individual cell as it grows and divides. By using an artificial neural network that does not impose the assumptions commonly employed in biology, the computer was able to make more complex and accurate forecasts than previously possible. This work may help advance the field of quantitative biology as well as improve the industrial production of medications or fermented products.

As in all of the natural sciences, biology has developed mathematical models to help fit data and make predictions. However, many of these equations rely on simplifying assumptions that do not always reflect the actual underlying biological processes because of the inherent complexities of living systems. Now, researchers at The University of Tokyo Institute of Industrial Science have implemented a machine learning algorithm that can use the measured size of single cells over time to predict their future size. Because the computer automatically recognizes patterns in the data, it is not constrained like conventional methods.

"In biology, simple models are often used based on their capacity to reproduce the measured data," first author Atsushi Kamimura says. "However, the models may fail to capture what is really going on because of human preconceptions."

The data for this latest study were collected from either an Escherichia coli bacterium or a Schizosaccharomyces pombe yeast cell held in a microfluidic channel at various temperatures. The plot of size over time looked like a "sawtooth" as exponential growth was interrupted by division events. Human biologists usually use a "sizer" model, based on the absolute size of the cell, or an "adder" model, based on the increase in size since birth, to predict when divisions will occur. The computer algorithm found support for the "adder" principle, but as part of a complex web of biochemical reactions and signaling.
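To make the two competing division rules concrete, here is a minimal toy simulation (not the authors' code; the growth rate, division thresholds, and noise level are hypothetical values chosen only to reproduce the sawtooth shape described above):

```python
# Toy simulation of single-cell growth traces under the "sizer" and
# "adder" division rules. Illustrative only: all parameter values are
# hypothetical and chosen just to produce a noisy sawtooth trace.
import numpy as np

rng = np.random.default_rng(0)

def simulate(rule="adder", n_steps=2000, dt=0.01, growth_rate=1.0,
             size_threshold=2.0, added_threshold=1.0, noise=0.05):
    sizes = []
    size = birth_size = 1.0
    for _ in range(n_steps):
        size *= np.exp(growth_rate * dt)                  # exponential growth
        if rule == "sizer":
            divide = size > size_threshold                # absolute-size trigger
        else:
            divide = size - birth_size > added_threshold  # added-size trigger
        if divide:
            size /= 2.0                                   # binary fission
            birth_size = size
        sizes.append(size * (1 + noise * rng.standard_normal()))
    return np.array(sizes)

trace = simulate("adder")   # noisy sawtooth, as in the measured data
```

Under the "sizer" rule the cell divides on reaching an absolute size; under the "adder" rule it divides after adding a fixed amount since birth.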

"Our deep-learning neural network can effectively separate the history-dependent deterministic factors from the noise in given data," senior author Tetsuya Kobayashi says.

This method can be extended to many other aspects of biology besides predicting cell size. In the future, life science may be driven more by objective artificial intelligence than human models. This may lead to more efficient control of microorganisms we use to ferment products and produce drugs.

Harvard-led physicists take big step in race to quantum computing

The team develops a simulator with 256 qubits, the largest of its kind ever created

A team of physicists from the Harvard-MIT Center for Ultracold Atoms and other universities has developed a special type of quantum computer known as a programmable quantum simulator capable of operating with 256 quantum bits, or "qubits."

The system marks a major step toward building large-scale quantum machines that could be used to shed light on a host of complex quantum processes and eventually help bring about real-world breakthroughs in material science, communication technologies, finance, and many other fields, overcoming research hurdles that are beyond the capabilities of even the fastest supercomputers today. Qubits are the fundamental building blocks on which quantum computers run and the source of their massive processing power.

"This moves the field into a new domain where no one has ever been to thus far," said Mikhail Lukin, the George Vasmer Leverett Professor of Physics, co-director of the Harvard Quantum Initiative, and one of the senior authors of the study published today in the journal Nature. "We are entering a completely new part of the quantum world."

According to Sepehr Ebadi, a physics student in the Graduate School of Arts and Sciences and the study's lead author, it is the combination of the system's unprecedented size and programmability that puts it at the cutting edge of the race for a quantum computer, which harnesses the mysterious properties of matter at extremely small scales to greatly advance processing power. Under the right circumstances, the increase in qubits means the system can store and process exponentially more information than the classical bits on which standard computers run.

"The number of quantum states that are possible with only 256 qubits exceeds the number of atoms in the solar system," Ebadi said, explaining the system's vast size.

Already, the simulator has allowed researchers to observe several exotic quantum states of matter that had never before been realized experimentally, and to perform a quantum phase transition study so precise that it serves as the textbook example of how magnetism works at the quantum level.

These experiments provide powerful insights into the quantum physics underlying material properties and can help show scientists how to design new materials with exotic properties.

The project uses a significantly upgraded version of a platform the researchers developed in 2017, which was capable of reaching a size of 51 qubits. That older system allowed the researchers to capture ultra-cold rubidium atoms and arrange them in a specific order using a one-dimensional array of individually focused laser beams called optical tweezers.

This new system allows the atoms to be assembled in two-dimensional arrays of optical tweezers. This increases the achievable system size from 51 to 256 qubits. Using the tweezers, researchers can arrange the atoms in defect-free patterns and create programmable shapes like square, honeycomb, or triangular lattices to engineer different interactions between the qubits.
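As a purely geometric illustration (hypothetical spacing, not the experiment's control software), the target site patterns for two of those lattices can be generated like this:

```python
# Hypothetical sketch: idealized target-site coordinates for square and
# triangular tweezer arrays. The real system programs such patterns via
# a spatial light modulator; here we only compute site positions.
import numpy as np

def square_lattice(n, spacing=1.0):
    i, j = np.meshgrid(np.arange(n), np.arange(n))
    return np.column_stack([i.ravel(), j.ravel()]).astype(float) * spacing

def triangular_lattice(n, spacing=1.0):
    i, j = np.meshgrid(np.arange(n), np.arange(n))
    x = (i + 0.5 * (j % 2)) * spacing            # offset every other row
    y = j * (np.sqrt(3) / 2) * spacing           # row pitch for 60-degree packing
    return np.column_stack([x.ravel(), y.ravel()])

sites = square_lattice(16)                       # 16 x 16 = 256 target sites
print(sites.shape)                               # (256, 2)
```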

"The workhorse of this new platform is a device called the spatial light modulator, which is used to shape an optical wavefront to produce hundreds of individually focused optical tweezer beams," said Ebadi. "These devices are essentially the same as what is used inside a computer projector to display images on a screen, but we have adapted them to be a critical component of our quantum simulator."

The initial loading of the atoms into the optical tweezers is random, and the researchers must move the atoms around to arrange them into their target geometries. The researchers use a second set of moving optical tweezers to drag the atoms to their desired locations, eliminating the initial randomness. Lasers give the researchers complete control over the positioning of the atomic qubits and their coherent quantum manipulation.
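One common way to frame that rearrangement step is as a minimum-cost assignment between loaded positions and target sites. The sketch below illustrates the idea with SciPy; it is not the experiment's actual rearrangement protocol, and the positions are random stand-ins.

```python
# Sketch: match randomly loaded atoms to target lattice sites by solving
# a minimum-total-distance assignment problem. Illustration only.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
loaded = rng.uniform(0.0, 16.0, size=(300, 2))   # random initial positions
i, j = np.meshgrid(np.arange(16), np.arange(16))
targets = np.column_stack([i.ravel(), j.ravel()]).astype(float)  # 256 sites

cost = cdist(loaded, targets)                    # travel distance per pairing
rows, cols = linear_sum_assignment(cost)         # optimal atom -> site match
print("total move distance:", cost[rows, cols].sum())
```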

Other senior authors of the study include Harvard Professors Subir Sachdev and Markus Greiner, who worked on the project along with Massachusetts Institute of Technology Professor Vladan Vuletić, and scientists from Stanford, the University of California Berkeley, the University of Innsbruck in Austria, the Austrian Academy of Sciences, and QuEra Computing Inc. in Boston.

"Our work is part of a really intense, high-visibility global race to build bigger and better quantum computers," said Tout Wang, a research associate in physics at Harvard and one of the paper's authors. "The overall effort [beyond our own] has top academic research institutions involved and major private-sector investment from Google, IBM, Amazon, and many others."

The researchers are currently working to improve the system by sharpening laser control over the qubits and making the platform more programmable. They are also actively exploring how the system can be used for new applications, ranging from probing exotic forms of quantum matter to solving challenging real-world problems that can be naturally encoded on the qubits.

"This work enables a vast number of new scientific directions," Ebadi said. "We are nowhere near the limits of what can be done with these systems."

Japanese researcher introduces a new theoretical model of high-temperature superconductivity, in which electrical current flows with zero resistance, potentially enabling extremely efficient energy generation and transmission

A scientist from the Division of Quantum Condensed Matter Physics at the University of Tsukuba in Japan has formulated a new theory of superconductivity. Based on the calculation of the "Berry connection," this model helps explain new experimental results better than the current theory. The work may allow future electrical grids to send energy without losses.

Superconductors are fascinating materials that may look unremarkable at ambient conditions, but when cooled to very low temperatures, allow electrical current to flow with zero resistance. There are several obvious applications of superconductivity, such as lossless energy transmission, but the physics underlying this process is still not clearly understood. The established way of thinking about the transition from normal to superconducting is called the Bardeen-Cooper-Schrieffer (BCS) theory. In this model, as long as thermal excitations are kept small enough, particles can form "Cooper pairs" which travel together and resist scattering. However, the BCS model does not adequately explain all types of superconductors, which limits our ability to create more robust superconducting materials that work at room temperature.

Now, a scientist from the University of Tsukuba has come up with a new model for superconductivity that better reveals the underlying physical principles. Instead of focusing on the pairing of charged particles, this new theory uses the mathematical tool called the "Berry connection." This quantity measures a twisting of the space in which electrons travel. "In the standard BCS theory, the origin of superconductivity is electron pairing. In this theory, the supercurrent is identified as the dissipationless flow of the paired electrons, while single electrons still experience resistance," says study author Professor Hiroyasu Koizumi.
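For reference, the Berry connection is a standard quantum-mechanical object; the definition below is the textbook one, which the press release itself does not spell out. For a state |u(R)⟩ that depends on parameters R, the connection and its loop integral, the Berry phase, are:

```latex
% Standard definition of the Berry connection and the Berry phase
\[
  \mathbf{A}(\mathbf{R}) = i\,\langle u(\mathbf{R}) \mid \nabla_{\mathbf{R}}\, u(\mathbf{R}) \rangle,
  \qquad
  \gamma = \oint_{\mathcal{C}} \mathbf{A}(\mathbf{R}) \cdot d\mathbf{R}
\]
```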

As an illustration, Josephson junctions are formed when two superconductor layers are separated by a thin barrier made of a normal metal or an insulator. Although widely used in high-precision magnetic field detectors and quantum computers, Josephson junctions also do not fit neatly inside the BCS theory. "In the new theory, the role of the electron pairing is to stabilize the Berry connection, as opposed to being the cause of superconductivity by itself, and the supercurrent is the flow of single and paired electrons generated due to the twisting of the space where electrons travel caused by the Berry connection," Professor Koizumi says. Thus, this research may lead to advancements in quantum computing as well as energy conservation.
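For background, a Josephson junction's behavior is captured by two textbook relations (standard results, not part of the new theory), linking the supercurrent I and the voltage V to the phase difference φ across the barrier:

```latex
% The two Josephson relations
\[
  I = I_c \sin\varphi,
  \qquad
  \frac{d\varphi}{dt} = \frac{2 e V}{\hbar}
\]
```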