Kobe University's impact crater data rendered onto supercomputer model of Ryugu asteroid illuminates a complicated geological history

Analysis of the impact craters on Ryugu using the spacecraft Hayabusa 2’s remote sensing image data has illuminated the geological history of the Near-Earth asteroid.

A research group led by Assistant Professor Naoyuki Hirata of the Department of Planetology at Kobe University’s Graduate School of Science revealed 77 craters on Ryugu. Through analyzing the location patterns and characteristics of the craters, they determined that the asteroid’s eastern and western hemispheres were formed at different periods of time.

It is hoped that the collected data can be used as a basis for future asteroid research and analysis.

These results were first published in the American scientific journal Icarus on November 5, 2019.

The Japan Aerospace Exploration Agency (JAXA)’s Hayabusa 2 has been used to carry out various missions to increase our understanding of the spinning top-shaped, near-Earth asteroid Ryugu. Since arriving at the asteroid in June 2018, the unmanned spacecraft has taken samples and a great number of images. It is hoped that these can reveal more about Ryugu’s formation and history.

This research group focused on using the image data to determine the number and location of impact craters on the asteroid. Impact craters are formed when a smaller asteroid or a comet hits the surface of the asteroid. Analyzing the spatial distribution and the number of impact craters can reveal the frequency of collisions and aid researchers in determining the age of different surface areas.

Figure 1: Size and location of craters on Ryugu (figure from the journal paper). The craters are numbered in order of size.
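The standard tool for this kind of analysis is the cumulative crater size-frequency distribution: the density of craters at or above each diameter, per unit area. A minimal sketch of the calculation follows; the diameters and surface area below are illustrative values, not figures from the Ryugu catalog.

```python
import numpy as np

def cumulative_sfd(diameters_m, area_km2):
    """Cumulative size-frequency distribution: number of craters
    with diameter >= D per km^2, for each observed diameter D."""
    d = np.sort(np.asarray(diameters_m, dtype=float))[::-1]  # largest first
    n_cumulative = np.arange(1, len(d) + 1)  # craters at least this large
    return d, n_cumulative / area_km2

# Hypothetical example: six craters counted on a 2.8 km^2 patch.
diams, density = cumulative_sfd([220, 150, 90, 60, 35, 20], area_km2=2.8)

# All else being equal, a surface with a higher crater density at every
# size has been exposed to impacts for longer -- i.e. it is older.
```

This is why a region with very few craters, like Ryugu’s western hemisphere, is read as a younger surface.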

Research Methodology

First of all, the image data from Hayabusa 2 was analyzed. Hayabusa 2 carries several different types of cameras, including the Optical Navigation Cameras (ONC). The ONC team has been able to take around 5,000 images of Ryugu, which have revealed many surface features, including impact craters. For this study, image data obtained by the ONC-T camera between July 2018 and February 2019 was used. The research group had to determine which of these images showed craters; 340 images were used for crater counting, with stereo-pair images making it easier to identify the craters. A global image mosaic map was constructed from the ONC images and rendered onto the supercomputer model of Ryugu’s shape. Small Body Mapping Tool software was then used to measure the size, latitude, and longitude of the craters. LiDAR (Light Detection and Ranging, a pulsed laser) was also used to determine the overall size of Ryugu.

The depressions identified on Ryugu were divided into four categories, depending on how distinct their circular appearance was. Category I to III depressions were classified as definite craters. Category IV depressions had only quasi-circular features, so it was hard to determine whether they were craters or not. Many craters were filled with boulders or lacked a distinct shape; depressions that were too vague to classify were left out of the results.

Research Results

The research team was able to identify every impact crater larger than roughly 10 to 20 m in diameter across Ryugu’s entire surface, a total of 77 craters. Furthermore, a pattern was discovered in their distribution. The section of the eastern hemisphere near the meridian was found to have the most craters. This is the area near the large crater named Cendrillon, one of Ryugu’s biggest. In contrast, there are hardly any craters in the western hemisphere, suggesting that this part of the asteroid’s surface was formed later. The analysis also revealed that there are more craters at lower latitudes than at higher latitudes on Ryugu. In other words, there are very few craters in Ryugu’s polar regions.

The equatorial ridge in the eastern hemisphere was determined to be a fossil structure. When asteroids like Ryugu rotate at high speeds, this can alter their shape. It is thought that this ridge formed in the distant past, during a period when Ryugu rotated once every 3 hours or so. Because the eastern and western hemispheres were formed at different periods of the asteroid’s history, this suggests that there have been at least two instances in which Ryugu’s rotational speed increased.
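The ~3-hour figure can be sanity-checked with a back-of-the-envelope calculation: the rotation period at which centrifugal acceleration at the equator balances surface gravity, at which point loose material begins to migrate and the shape can deform. The mass and radius below are approximate published values for Ryugu, used here only for illustration, and the body is treated as a sphere.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 4.5e11      # approximate mass of Ryugu, kg (assumed value)
R = 450.0       # approximate equatorial radius, m (assumed value)

# Balance centrifugal acceleration against gravity at the equator:
#   (2*pi / T)^2 * R = G*M / R^2   =>   T = 2*pi * sqrt(R^3 / (G*M))
T = 2 * math.pi * math.sqrt(R**3 / (G * M))
print(f"critical rotation period: {T / 3600:.1f} hours")
```

With these inputs the critical period comes out close to 3 hours, consistent with the spin-up scenario described above.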

Further Research

The results of this study were compiled into a global impact crater catalog for Ryugu. It is hoped that this database can be used as a basis for future research and that comparing these results with those of a similar asteroid will lead to a greater understanding of these astronomical objects.

Hayabusa 2 is scheduled to drop the capsule containing samples of Ryugu’s surface into Earth’s atmosphere in late 2020. Analysis of these samples should provide further insight into the asteroid and how it was formed.

University of Arizona scientists outline 10 simple rules for the computational modeling of behavioral data

The guidelines are designed to help researchers avoid many potential pitfalls in the computational modeling of cognitive and neuroscience data

New guidelines for scientists who use computational modeling to analyze behavioral data have been published today in the open-access journal eLife.

The goal of computational modeling in the behavioral sciences is to use precise mathematical models to make better sense of data concerning behaviors. These data often come in the form of choices, but can also include reaction times, eye movements and other behaviors that are easy to observe, and even neural data. The mathematical models consist of equations that link the variables behind the data, such as stimuli and past experiences, to behavior in the immediate future. In this way, computational models provide a kind of hypothesis about how behavior is generated.
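A minimal sketch of one such model, in the reinforcement learning domain the authors focus on, is a two-armed bandit learner that updates action values from reward prediction errors (a Rescorla-Wagner rule) and turns values into choice probabilities with a softmax. All parameter values and reward probabilities below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.3, 5.0             # learning rate, inverse temperature
p_reward = np.array([0.8, 0.2])    # hypothetical reward probability per arm
Q = np.zeros(2)                    # learned action values

choices, rewards = [], []
for t in range(200):
    # Softmax: higher-valued actions are chosen more often.
    p_choose = np.exp(beta * Q) / np.exp(beta * Q).sum()
    a = rng.choice(2, p=p_choose)
    r = float(rng.random() < p_reward[a])
    Q[a] += alpha * (r - Q[a])     # prediction-error update
    choices.append(a)
    rewards.append(r)

# After learning, the model should prefer the richer arm (index 0).
```

Fitting the free parameters (here `alpha` and `beta`) to a participant’s observed choices is what lets researchers probe the algorithm behind the behavior.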

"Using computers to simulate and study behavior has revolutionized psychology and neuroscience research," explains co-author Robert Wilson, Assistant Professor in Cognition/Neural Systems and Director of the Neuroscience of Reinforcement Learning Lab at the University of Arizona, US. "Fitting computational models to experimental data allows us to achieve a number of objectives, which can include probing the algorithms underlying behavior and better understanding the effects of drugs, illnesses, and interventions."

There are four key uses of computational modeling across the scientific literature, according to Wilson and his co-author Anne Collins, Principal Investigator at the Computational Cognitive Neuroscience (CCN) Lab, part of the Department of Psychology and the Helen Wills Neuroscience Institute at the University of California, Berkeley, US. Each of these practices has its own strengths and weaknesses, and each can be mishandled in ways that lead to incorrect or misleading conclusions, which highlights the need for them to be carried out responsibly.

To address this need, Wilson and Collins offer their 10 simple rules, designed for both beginners and seasoned researchers, to ensure that computational modeling is used with care and yields meaningful insights into what a model is saying about the mind.

Their rules encompass a number of principles that include: designing effective experiments with computational modeling in mind; generating, simulating, comparing and validating models; extracting variables from models to compare with physiological data; reporting on the analyses; and, finally, advice on the next steps once the reporting is completed.

While these guidelines cover the simplest modeling techniques accessible to beginners, they are also applicable more generally. For clarity, the authors decided to focus on a single narrow domain, reinforcement learning models applied to choice data, as the same techniques can be applied more widely to other observable behaviors.
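One sanity check in the simulate-and-validate spirit of the guidelines is parameter recovery: generate choices from a learner with known parameters, then verify that fitting the model to its own simulated data recovers them. The sketch below does this for the learning rate of a simple Rescorla-Wagner/softmax learner by maximum likelihood over a grid; every numeric value is illustrative, and the specific procedure is an assumption for demonstration rather than a recipe from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(alpha, beta, p_reward, n_trials):
    """Generate choices and rewards from a Rescorla-Wagner/softmax learner."""
    Q = np.zeros(2)
    choices, rewards = [], []
    for _ in range(n_trials):
        p = np.exp(beta * Q) / np.exp(beta * Q).sum()
        a = rng.choice(2, p=p)
        r = float(rng.random() < p_reward[a])
        Q[a] += alpha * (r - Q[a])
        choices.append(a)
        rewards.append(r)
    return choices, rewards

def neg_log_likelihood(alpha, beta, choices, rewards):
    """Negative log-likelihood of an observed choice sequence."""
    Q, nll = np.zeros(2), 0.0
    for a, r in zip(choices, rewards):
        p = np.exp(beta * Q) / np.exp(beta * Q).sum()
        nll -= np.log(p[a])
        Q[a] += alpha * (r - Q[a])
    return nll

true_alpha = 0.35
choices, rewards = simulate(true_alpha, 5.0, np.array([0.7, 0.3]), 500)

grid = np.linspace(0.05, 0.95, 19)   # candidate learning rates
fit_alpha = grid[np.argmin([neg_log_likelihood(a, 5.0, choices, rewards)
                            for a in grid])]
# fit_alpha should land near true_alpha if the parameter is recoverable.
```

If the fitted value lands far from the generating one, parameter estimates from real data cannot be trusted either, which is exactly the kind of pitfall the rules are meant to catch.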

"Our work highlights how to avoid common pitfalls and misinterpretations that can arise with computational modeling," Collins explains. "We learned many of these lessons the hard way, by actually making these mistakes for ourselves over a combined 20-plus years in the field.

"By following these guidelines, we hope other scientists will avoid some of the errors that slowed down our own research," she adds. "We would also hope to start seeing improvements in the quality of computational modeling in the behavioral sciences."

University of São Paulo researcher presents supercomputer simulations for understanding the transport of aerosols at FAPESP Week France

A study produced in Brazil and presented during FAPESP Week France aims to elucidate the behavior of aerosols, which have an important influence on climate, agriculture, and human health.

One of the speakers at FAPESP Week France, held in Lyon and Paris until November 27th, Livia Freire, a researcher at the Institute of Mathematical and Computing Sciences of the University of São Paulo (ICMC-USP), has been developing supercomputer simulations to broaden knowledge of the transport of aerosols.

"Our interest lies in knowing how these particles are transported by atmospheric flows, which are very complicated movements to simulate and understand because they are turbulent. We are developing numerical models that simulate flows in the atmosphere and how they transport particles. The aim is to obtain simple equations that researchers in other areas can use to understand particle concentrations in the atmosphere," she said. 

The tiny solid or liquid particles suspended in the air are called particulate matter. This material is only micrometers (thousandths of a millimeter) in size. Particulate matter in the air, whether in the form of dust, fog, or smoke, is called an aerosol. The sources of aerosols can be natural or anthropogenic, such as pollution. The smoke from truck exhausts, factory chimneys, or fires in fields and forests is easily visible, and sand storms can be observed from kilometers away. However, the individual particles in these formations usually go unnoticed, and a large share of the particles that circulate everywhere are unseen.

These suspended microscopic particles play an important role in climate and rainfall and can affect human health. Because of this, they are the object of studies by scientists in various countries, and Brazil is at the front line of this research, with projects such as GoAmazon, supported by the São Paulo Research Foundation (FAPESP) (read more at agencia.fapesp.br/29665).

"I'm developing a study funded by FAPESP whose aim is to understand the behavior of the particles that circulate in the atmosphere. These are various types of tiny particles, which we cannot see but which can have a significant impact on our lives, on health, on agriculture, on climate," she said.

The researcher explained that to predict the behavior of particles, such as their concentration at a particular time and location, it is necessary to understand the turbulent flows present in the region that corresponds to the first hundreds of meters of the atmosphere, known as the atmospheric boundary layer. "This is a region that concentrates all the energy, gas, and particle exchanges between the atmosphere and the elements that compose the planet's surface," she said.

"The problem of turbulent flows is very complex since it involves various scales, ranging from the scale of the atmosphere itself to other very small ones, such as the turbulent vortices that transport particles. To simulate that on a computer, we need to represent all these scales, which means a significant increase in computing costs. It is a big challenge to represent all the different components in the atmosphere in a [supercomputer] system with a viable cost," said Freire.

Large-Eddy Simulation

The researcher mentioned that the atmospheric boundary layer has turbulent flows whose most faithful computational representation is obtained using a technique called Large-Eddy Simulation (LES).

"Due to its complex nature, the study of turbulence is based on the use of numerical simulations combined with experimental data analysis. For atmospheric flows, the use of the LES technique provides important indicators regarding the unique behavior of turbulence, and significant progress has been made in developing models for material and energy transport in the atmosphere in simplified conditions," she said.
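The core idea behind LES can be illustrated with a toy example: a spatial filter splits a signal into large, resolved scales that the simulation grid can carry, and small, subgrid scales that must be represented by a model. The synthetic "velocity" signal below is purely illustrative, not an atmospheric simulation.

```python
import numpy as np

# Synthetic 1-D velocity signal: one large eddy plus one small eddy.
x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
u = np.sin(x) + 0.2 * np.sin(20 * x)

def box_filter(u, width):
    """Moving-average (box) filter with periodic boundaries,
    implemented by tiling the signal and keeping the middle copy."""
    kernel = np.ones(width) / width
    n = len(u)
    return np.convolve(np.tile(u, 3), kernel, mode="same")[n:2 * n]

u_resolved = box_filter(u, width=32)   # large scales kept by the grid
u_subgrid = u - u_resolved             # small scales left to the model
```

The filtered field retains the large eddy almost unchanged while the small eddy is pushed into the subgrid part; closing the equations for that unresolved part is where the modeling effort, and the computing cost, lies.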

"For example, the average concentration of fine particles emitted from a flat, bare soil region can be represented by a simple flux-profile relationship, a result obtained using LES," she said.
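A sketch of the kind of flux-profile relation described above: in a neutral surface layer with eddy diffusivity K(z) = kappa * u_star * z, a constant upward particle flux implies a logarithmic mean concentration profile. The numerical values below are assumptions chosen for illustration, not results from the study.

```python
import math

kappa = 0.4                 # von Karman constant
u_star = 0.3                # friction velocity, m/s (assumed)
F = 1e-6                    # surface emission flux, kg m^-2 s^-1 (assumed)
z_ref, C_ref = 2.0, 5e-5    # reference height (m) and concentration (kg/m^3)

def concentration(z):
    """Mean concentration at height z, from integrating the
    flux-gradient relation F = -K dC/dz with K = kappa * u_star * z."""
    return C_ref - F / (kappa * u_star) * math.log(z / z_ref)

# Concentration decreases logarithmically with height above the
# emitting surface.
```

A relation this simple is exactly the kind of output the group aims for: an equation researchers in other fields can apply without running a turbulence simulation themselves.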

According to Freire, advances in computational power capacities offer an opportunity to investigate more complex problems, such as particle transport in the presence of forests and cities.

"We are using LES, an advanced numerical tool, to develop new models that enable us to explain the transport of particles in the atmosphere. This can increase our understanding and our ability to predict their role in the environment," she said.

In her research with LES, Freire has also been working with professors Leandro Franco de Souza, from ICMC-USP, and Amauri Pereira de Oliveira, from USP's Institute of Astronomy, Geophysics, and Atmospheric Sciences, and with David Richter, from the University of Notre Dame.