Moons may yield clues to what makes planets habitable

In the search for Earth-like planets, University of Rochester scientist Miki Nakajima turns to supercomputer simulations of moon formation.

Earth’s moon is vitally important in making Earth the planet we know today: the moon controls the length of the day and ocean tides, which affect the biological cycles of lifeforms on our planet. The moon also contributes to Earth’s climate by stabilizing Earth’s spin axis, offering an ideal environment for life to develop and evolve.

Because the moon is so important to life on Earth, scientists conjecture that a moon may be a potentially beneficial feature in harboring life on other planets. Many planets have moons, but Earth’s moon is distinct in that it is large compared with Earth itself: its radius is more than a quarter of Earth’s radius, a much larger moon-to-planet size ratio than that of most other moons.

Miki Nakajima, an assistant professor of earth and environmental sciences at the University of Rochester, finds that distinction significant. And in a new study that she led, she and her colleagues at the Tokyo Institute of Technology and the University of Arizona examine moon formation and conclude that only certain types of planets can form moons that are large relative to their host planets.

“By understanding moon formations, we have a better constraint on what to look for when searching for Earth-like planets,” Nakajima says. “We expect that exomoons [moons orbiting planets outside our solar system] should be everywhere, but so far we haven’t confirmed any. Our constraints will be helpful for future observations.”

The origin of Earth’s moon

Many scientists have historically believed Earth’s large moon was generated by a collision between proto-Earth—Earth at its early stages of development—and a large, Mars-sized impactor, approximately 4.5 billion years ago. The collision resulted in the formation of a partially vaporized disk around Earth, which eventually coalesced into the moon.

To find out whether other planets can form similarly large moons, Nakajima and her colleagues ran computer simulations of impacts onto several hypothetical Earth-like rocky planets and icy planets of varying masses. They hoped to identify whether the simulated impacts would result in partially vaporized disks, like the disk that formed Earth’s moon.

The researchers found that rocky planets larger than six Earth masses (6 M⊕) and icy planets larger than one Earth mass (1 M⊕) produce fully—rather than partially—vaporized disks, and these fully vaporized disks are not capable of forming fractionally large moons.

“We found that if the planet is too massive, these impacts produce completely vapor disks because impacts between massive planets are generally more energetic than those between small planets,” Nakajima says.

After an impact that results in a vaporized disk, the disk cools over time and liquid moonlets—a moon’s building blocks—emerge. In a fully vaporized disk, the growing moonlets experience strong gas drag from the vapor and quickly fall onto the planet. In contrast, if the disk is only partially vaporized, moonlets do not feel such strong gas drag.

“As a result, we conclude that a complete vapor disk is not capable of forming fractionally large moons,” Nakajima says. “Planetary masses need to be smaller than those thresholds we identified to produce such moons.”
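The mass thresholds reported above can be restated as a simple decision rule. The sketch below is only an illustrative restatement of the study’s published thresholds (rocky planets above ~6 Earth masses and icy planets above ~1 Earth mass yield fully vaporized disks), not the team’s simulation code; real outcomes also depend on the details of the impact itself.

```python
# Toy restatement of the mass thresholds from the Nakajima et al. study:
# above these masses, an impact-generated disk is fully vaporized and
# cannot form a fractionally large moon. Illustrative only.

def can_form_large_moon(planet_mass_earth: float, composition: str) -> bool:
    """Return True if the planet is below the study's mass threshold,
    so an impact disk could remain only partially vaporized."""
    thresholds = {"rocky": 6.0, "icy": 1.0}  # in Earth masses
    if composition not in thresholds:
        raise ValueError("composition must be 'rocky' or 'icy'")
    return planet_mass_earth <= thresholds[composition]

print(can_form_large_moon(1.0, "rocky"))  # the Earth-like case
print(can_form_large_moon(7.0, "rocky"))  # too massive a rocky planet
print(can_form_large_moon(2.0, "icy"))    # too massive an icy planet
```

Under this rule, Earth sits comfortably below the rocky-planet threshold, consistent with its fractionally large moon.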

The search for Earth-like planets

The constraints outlined by Nakajima and her colleagues are important for astronomers investigating our universe; researchers have detected thousands of exoplanets and possible exomoons, but have yet to definitively spot a moon orbiting a planet outside our solar system.

This research may give them a better idea of where to look.

As Nakajima says: “The exoplanet search has typically been focused on planets larger than six Earth masses. We are proposing that instead, we should look at smaller planets because they are probably better candidates to host fractionally large moons.”

QuSoft, IoP researchers propose a new method for quantum computing in trapped ions

Physicists from the University of Amsterdam have proposed a new architecture for a scalable quantum computer. Making use of the collective motion of the constituent particles, they were able to construct new building blocks for quantum computing that pose fewer technical difficulties than current state-of-the-art methods. The results were recently published in Physical Review Letters.

Image caption: Two trapped ions (in blue) are selected by optical tweezers (in red). A quantum gate between the ions can be implemented using electric fields.

The researchers work at QuSoft and the Institute of Physics, in the groups of Rene Gerritsma and Arghavan Safavi-Naini. The effort, led by Ph.D. candidate Matteo Mazzanti, combines two important ingredients. One is the so-called trapped-ion platform, one of the most promising candidates for quantum computing, which makes use of ions – atoms that have either a surplus or a shortage of electrons and as a result are electrically charged. The other is a clever method of controlling the ions, supplied by optical tweezers and oscillating electric fields.

As the name suggests, trapped-ion quantum computers use a crystal of trapped ions. These ions can move individually, but more importantly, also as a whole. As it turns out, the possible collective motions of the ions facilitate the interactions between individual pairs of ions. In the proposal, this idea is made concrete by applying a uniform electric field to the whole crystal to mediate interactions between two specific ions in that crystal. The two ions are selected by applying tweezer potentials to them – see the image above. Because the electric field is homogeneous, it couples only to the motion in which the two ions move together with all the other ions in the crystal. As a result, the interaction strength between the two selected ions is fixed, regardless of how far apart they are.
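The key point – a homogeneous field couples only to the collective, center-of-mass motion – can be checked in a toy normal-mode calculation. The sketch below uses a dimensionless chain of identical ions with a common trap potential and a Laplacian-style coupling (a standard simplification, not the real Coulomb potential of the UvA proposal); the uniform force vector then overlaps with exactly one normal mode.

```python
# Toy normal-mode model of a 1D crystal of identical trapped ions,
# illustrating why a spatially uniform force couples only to the
# center-of-mass (COM) mode. All quantities are dimensionless.
import numpy as np

N = 5
c = 0.5       # nearest-neighbour coupling strength (toy value)
trap = 2.0    # common trap "spring constant" (toy value)

# Graph-Laplacian coupling: it depends only on relative ion positions,
# so it exerts no net force under a uniform displacement of the crystal.
L = np.zeros((N, N))
for i in range(N - 1):
    L[i, i] += c
    L[i + 1, i + 1] += c
    L[i, i + 1] -= c
    L[i + 1, i] -= c

K = trap * np.eye(N) + L                 # dynamical matrix
_, modes = np.linalg.eigh(K)             # columns = orthonormal modes
com = np.ones(N) / np.sqrt(N)            # uniform-push direction

# Projection of a uniform force onto each normal mode:
overlaps = np.abs(modes.T @ com)
print(np.round(overlaps, 6))  # exactly one mode has overlap 1, rest 0
```

Since the uniform force is itself an eigenvector of the dynamical matrix (the Laplacian annihilates it), it is orthogonal to every other mode – which is the distance-independence mechanism described above, in miniature.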

A quantum computer consists of ‘gates’, small computational building blocks that perform quantum analogs of operations like ‘and’ and ‘or’ that we know from ordinary computers. In trapped-ion quantum computers, these gates act on the ions, and their operation depends on the interactions between these particles. In the above setup, because those interactions do not depend on the distance between the ions, the duration of a gate operation is also independent of that distance. As a result, this scheme for quantum computing is inherently scalable, and compared with other state-of-the-art schemes it poses fewer technical challenges to achieving a comparably well-operating quantum computer.

Rice chemists build machine learning that fine-tunes flash graphene

Rice University scientists are using machine-learning techniques to streamline the process of synthesizing graphene from waste through flash Joule heating.

Image caption: Machine learning is fine-tuning Rice University’s flash Joule heating method for making graphene from a variety of carbon sources, including waste materials. (Credit: Jacob Beckham/Rice University)

The process discovered two years ago by the Rice lab of chemist James Tour has expanded beyond making graphene from various carbon sources to extracting other materials like metals from urban waste, with the promise of more environmentally friendly recycling to come. 

The technique is the same for all of the above: blasting a jolt of high energy through the source material to eliminate all but the desired product. But the details for flashing each feedstock are different. 

The researchers describe in Advanced Materials how machine-learning models that adapt to experimental variables, and show the team how to optimize procedures, are helping to push the work forward.

“Machine-learning algorithms will be critical to making the flash process rapid and scalable without negatively affecting the graphene product’s properties,” Tour said.  

“In the coming years, the flash parameters can vary depending on the feedstock, whether it’s petroleum-based, coal, plastic, household waste, or anything else,” he said. “Depending on the type of graphene we want -- small flake, large flake, high turbostratic, level of purity -- the machine can discern by itself what parameters to change.”

Because flashing makes graphene in hundreds of milliseconds, it’s difficult to tease out the details of the chemical process. So Tour and company took a cue from materials scientists who have worked machine learning into their everyday process of discovery.

“It turned out that machine learning and flash Joule heating had really good synergy,” said Rice graduate student and lead author Jacob Beckham. “Flash Joule heating is a really powerful technique, but it’s difficult to control some of the variables involved, like the rate of current discharge during a reaction. And that’s where machine learning can shine. It’s a great tool for finding relationships between multiple variables, even when it’s impossible to do a complete search of the parameter space.”

Image caption: Rice University chemists are employing machine learning to fine-tune the flash Joule heating process used to make graphene. (Credit: Jeff Fitlow/Rice University)

“That synergy made it possible to synthesize graphene from scrap material based entirely on the models’ understanding of the Joule heating process,” he said. “All we had to do was carry out the reaction -- which can eventually be automated.”

The lab used its custom optimization model to improve graphene crystallization from four starting materials -- carbon black, plastic pyrolysis ash, pyrolyzed rubber tires, and coke -- over 173 trials, using Raman spectroscopy to characterize the starting materials and graphene products. 

The researchers then fed more than 20,000 spectroscopy results to the model and asked it to predict which starting materials would provide the best yield of graphene. The model also took the effects of charge density, sample mass, and material type into account in its calculations.
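The optimization loop described above – fit a model to past flash trials, then let it propose the next parameter setting – can be sketched with a simple surrogate model. Everything below is a hypothetical stand-in: the `run_flash` function, its parameters, and the quadratic surrogate are synthetic illustrations of the general approach, not the Rice lab’s actual model or data.

```python
# Minimal surrogate-model optimization sketch: fit a regression model
# to past (parameters -> quality) trials, then propose the grid point
# the model predicts to be best. Synthetic data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def run_flash(current, duration):
    """Synthetic 'experiment': a quality score (standing in for a
    Raman-derived graphene quality) that peaks at an unknown optimum."""
    return -((current - 3.0) ** 2) - ((duration - 0.2) * 10) ** 2 \
           + rng.normal(0, 0.1)

# Past trials: random parameter settings and their measured quality.
X = np.column_stack([rng.uniform(0, 6, 40), rng.uniform(0.05, 0.5, 40)])
y = np.array([run_flash(c, d) for c, d in X])

def features(P):
    """Quadratic feature map for the surrogate model."""
    c, d = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(c), c, d, c * d, c**2, d**2])

# Fit the surrogate by least squares, then search a parameter grid.
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
grid = np.array([(c, d) for c in np.linspace(0, 6, 61)
                 for d in np.linspace(0.05, 0.5, 46)])
best = grid[np.argmax(features(grid) @ coef)]
print("proposed current, duration:", np.round(best, 2))
```

In practice each proposed setting would be flashed, characterized, and fed back into the training set – the closed loop Beckham describes, which “can eventually be automated.”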

Co-authors are Rice graduate students Kevin Wyss, Emily McHugh, Paul Advincula, and Weiyin Chen; Rice alumnus John Li; and postdoctoral researcher Yunchao Xie and Jian Lin, an associate professor of mechanical and aerospace engineering, both of the University of Missouri. Tour is the T.T. and W.F. Chao Chair in Chemistry as well as a professor of computer science and of materials science and nanoengineering.