ALMA discovers earliest gigantic black hole storm

Researchers using the Atacama Large Millimeter/submillimeter Array (ALMA) have discovered a titanic galactic wind driven by a supermassive black hole 13.1 billion years ago. This is the earliest example of such a wind observed to date and a telltale sign that huge black holes have had a profound effect on the growth of galaxies from the very early history of the Universe.

At the center of many large galaxies hides a supermassive black hole that is millions to billions of times more massive than the Sun. Interestingly, the mass of the black hole is roughly proportional to the mass of the central region (bulge) of the galaxy in the nearby Universe. At first glance, this may seem obvious, but it is actually very strange. The reason is that the sizes of galaxies and black holes differ by about ten orders of magnitude. Based on this proportional relationship between the masses of two objects that are so different in size, astronomers believe that galaxies and black holes grew and evolved together (coevolution) through some kind of physical interaction.

A galactic wind can provide this kind of physical interaction between black holes and galaxies. A supermassive black hole swallows a large amount of matter. As that matter begins to move at high speed due to the black hole’s gravity, it emits intense energy, which can push the surrounding matter outward. This is how a galactic wind is created.

ALMA image of the distant galaxy J1243+0100 hosting a supermassive black hole at its center. The distribution of the quiet gas in the galaxy is shown in yellow, and the distribution of the high-speed galactic wind is shown in blue. The wind is located at the galaxy's center, which indicates that the supermassive black hole drives the wind. Credit: ALMA (ESO/NAOJ/NRAO), Izumi et al.

“The question is when did galactic winds come into existence in the Universe?” says Takuma Izumi, the lead author of the research paper and a researcher at the National Astronomical Observatory of Japan (NAOJ). “This is an important question because it is related to an important problem in astronomy: How did galaxies and supermassive black holes coevolve?”

The research team first used NAOJ’s Subaru Telescope to search for supermassive black holes. Thanks to its wide-field observation capability, they found more than 100 galaxies with supermassive black holes in the Universe more than 13 billion years ago.

Then, the research team utilized ALMA’s high sensitivity to investigate the gas motion in the host galaxies of these black holes. ALMA observed the galaxy HSC J124353.93+010038.5 (hereafter J1243+0100), discovered by the Subaru Telescope, and captured radio waves emitted by the dust and carbon ions in the galaxy.

Detailed analysis of the ALMA data revealed a high-speed gas flow moving at 500 km per second in J1243+0100. This gas flow has enough energy to push away the stellar material in the galaxy and halt its star formation activity. The flow found in this study is a true galactic wind, and it is the oldest observed example of a galaxy hosting a wind of galactic scale. The previous record-holder was a galaxy seen as it was about 13 billion years ago, so this observation pushes the start back another 100 million years.

The team also measured the motion of the quiet gas in J1243+0100 and estimated the mass of the galaxy’s bulge, based on its gravitational balance, to be about 30 billion times that of the Sun. The mass of the galaxy’s supermassive black hole, estimated by another method, was about 1% of that. The mass ratio of the bulge to the supermassive black hole in this galaxy is almost identical to the mass ratio of black holes to galaxies in the modern Universe. This implies that the coevolution of supermassive black holes and galaxies has been occurring since less than a billion years after the birth of the Universe.
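The roughly 1% figure above is easy to verify; a quick back-of-the-envelope check in Python (the unit choice of solar masses is ours, values from the text):

```python
# Back-of-the-envelope check of the J1243+0100 mass ratio (values from the text).
bulge_mass = 3.0e10          # bulge mass, in solar masses (~30 billion)
bh_mass = 0.01 * bulge_mass  # black hole mass is ~1% of the bulge mass

print(f"black hole mass: {bh_mass:.1e} solar masses")    # 3.0e+08
print(f"BH-to-bulge ratio: {bh_mass / bulge_mass:.0%}")  # 1%
```

A ratio of order 1% between black hole and bulge mass is close to the proportionality seen in nearby galaxies, which is the basis of the coevolution claim.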

“Our observations support recent high-precision computer simulations which have predicted that coevolutionary relationships were in place even about 13 billion years ago,” comments Izumi. “We are planning to observe a large number of such objects in the future and hope to clarify whether or not the primordial coevolution seen in this object is an accurate picture of the general Universe at that time.”

UK astronomers demo the value of supercomputing the fates of stars

  • Astronomers from the University of Warwick and the University of Exeter used a supercomputer to model the future of an unusual planetary system, finding a solar system of planets that will 'pinball' off one another
  • Today, the system consists of four massive planets locked in a perfect rhythm
  • Study shows that this perfect rhythm is likely to hold for 3 billion years - but the death of its sun will cause a chain reaction and set the interplanetary pinball game in motion

Four planets locked in a perfect rhythm around a nearby star are destined to be pinballed around their solar system when their sun eventually dies, according to a study led by the University of Warwick that peers into its future.

Astronomers have modeled how the change in gravitational forces in the system resulting from the star becoming a white dwarf will cause its planets to fly loose from their orbits and bounce off each other's gravity, like balls bouncing off a bumper in a game of pinball.

In the process, they will knock nearby debris into their dying sun, offering scientists new insight into how the white dwarfs with polluted atmospheres that we see today originally evolved.

Artist's impression of the four planets of the HR 8799 system and its star. Credit: University of Warwick/Mark Garlick

The HR 8799 system is 135 light-years away and comprises a 30-40 million-year-old A-type star and four unusually massive planets, all over five times the mass of Jupiter, orbiting very close to each other. The system also contains two debris discs, one inside the orbit of the innermost planet and another outside the outermost. Recent research has shown that the four planets are locked in a perfect rhythm in which each one completes double the orbit of its neighbor: for every orbit the furthest completes, the next closest completes two, the next completes four, and the closest completes eight.
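The resonance chain described above can be counted out directly. A small sketch, using the innermost planet's orbital period as the unit (relative values only, not the measured HR 8799 periods):

```python
# The 1:2:4:8 resonance chain, with the innermost planet's orbital period
# taken as the unit of time (relative values only, not measured periods).
periods = [1, 2, 4, 8]  # innermost to outermost

# Orbits each planet completes during a single orbit of the outermost planet:
orbits_per_outer_orbit = [periods[-1] // p for p in periods]
print(orbits_per_outer_orbit)  # [8, 4, 2, 1]
```

Each planet's period is exactly double its inner neighbor's, which is what makes the chain so delicately balanced.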

The team from Warwick and Exeter decided to learn the ultimate fate of the system by creating a model that allowed them to play 'planetary pinball' with the planets, investigating what may cause the perfect rhythm to destabilize.

They determined that the resonance that locks the four planets is likely to hold firm for the next 3 billion years, despite the effects of Galactic tides and close flybys of other stars. However, it always breaks once the star enters the phase in which it becomes a red giant, when it will expand to several hundred times its current size and eject nearly half its mass, ending up as a white dwarf.
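Why mass loss breaks the resonance can be sketched with the standard adiabatic mass-loss relation (this relation and the numbers below are our illustration, not figures from the study): for slow mass loss, a planet's semi-major axis grows in proportion to the ratio of the star's initial to final mass.

```python
# Standard adiabatic mass-loss relation (illustrative, not from the study):
# for slow mass loss, semi-major axis scales as a * (M_initial / M_final).

def expanded_orbit_au(a_au, m_initial, m_final):
    """Semi-major axis (AU) after slow stellar mass loss."""
    return a_au * m_initial / m_final

# A star that ejects nearly half its mass roughly doubles its planets'
# orbits, shifting the delicately balanced resonance chain.
print(expanded_orbit_au(70.0, 1.5, 0.8))
```

Because all four orbits expand together but the planets' mutual gravity grows relatively stronger against the diminished star, small deviations are enough to tip the system into scattering.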

The planets will then start to pinball and become a highly chaotic system where their movements become very uncertain. Even changing a planet's position by a centimeter at the start of the process can dramatically change the outcome.

Lead author Dr. Dimitri Veras from the University of Warwick Department of Physics said: "The planets will gravitationally scatter off of one another. In one case, the innermost planet could be ejected from the system. Or, in another case, the third planet may be ejected. Or the second and fourth planets could switch positions. Any combination is possible just with little tweaks.

"They are so big and so close to each other the only thing that's keeping them in this perfect rhythm right now is the locations of their orbits. All four are connected in this chain. As soon as the star loses mass their locations will deviate, then two of them will scatter off one another, causing a chain reaction amongst all four."

Dr. Veras was supported by an Ernest Rutherford Fellowship from the Science and Technology Facilities Council, part of UK Research and Innovation.

Regardless of the precise movements of the planets, one thing that the team is certain of is that the planets will move around enough to dislodge material from the system's debris discs into the atmosphere of the star. It is this type of debris that astronomers are analyzing today to discover the histories of other white dwarf systems.

Dr. Veras adds: "These planets move around the white dwarf at different locations and can easily kick whatever debris is still there into the white dwarf, polluting it.

"The HR 8799 planetary system represents a foretaste of the polluted white dwarf systems that we see today. It's a demonstration of the value of computing the fates of planetary systems, rather than just looking at their formation."

Co-author Professor Sasha Hinkley of the University of Exeter said: "The HR 8799 system has been so iconic for exoplanetary science since its discovery nearly 13 years ago, and so it is fascinating to see into the future and watch it evolve from a harmonious collection of planets into a chaotic scene."

NASA's LHASA 2.0 uses a machine learning model to better represent landslide hazards globally

Every year, landslides - the movement of rock, soil, and debris down a slope - cause thousands of deaths, billions of dollars in damages, and disruptions to roads and power lines. Because terrain, characteristics of the rocks and soil, weather, and climate all contribute to landslide activity, accurately pinpointing areas most at risk of these hazards at any given time can be a challenge. Early warning systems are generally regional - based on region-specific data provided by ground sensors, field observations, and rainfall totals. But what if we could identify at-risk areas anywhere in the world at any time?

Enter NASA's Global Landslide Hazard Assessment (LHASA) model and mapping tool.

LHASA Version 2, released last month along with corresponding research, is a machine-learning-based model that analyzes a collection of individual variables and satellite-derived datasets to produce customizable "nowcasts." These timely and targeted nowcasts are estimates of potential landslide activity in near-real-time for each 1-square-kilometer area between the poles. The model factors in the slope of the land (higher slopes are more prone to landslides), distance to geologic faults, the makeup of rock, past and present rainfall, and satellite-derived soil moisture and snow mass data.

Image shows a map of potential landslide risk output by NASA's Landslide Hazard Assessment Model (LHASA) in June 2021. Red indicates the highest risk and dark blue indicates the lowest risk.

"The model processes all of this data and outputs a probabilistic estimate of landslide hazard in the form of an interactive map," said Thomas Stanley, Universities Space Research Association scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland, who led the research. "This is valuable because it provides a relative scale of landslide hazard, rather than just saying there is or is not landslide risk. Users can define their area of interest and adjust the categories and probability threshold to suit their needs."

To "teach" the model, researchers input a table with all of the relevant landslide variables and many locations that have recorded landslides in the past. The machine learning algorithm takes the table and tests out different possible scenarios and outcomes, and when it finds the one that fits the data most accurately, it outputs a decision tree. It then identifies the errors in the decision tree and calculates another tree that fixes those errors. This process continues until the model has "learned" and improved 300 times.

"The result is that this version of the model is roughly twice as accurate as the first version of the model, making it the most accurate global nowcasting tool available," said Stanley. "While the accuracy is highest - often 100% - for major landslide events triggered by tropical cyclones, it improved significantly across all inventories."

Version 1, released in 2018, was not a machine learning model. It combined satellite precipitation data with a global landslide susceptibility map to produce its nowcasts. It made its predictions using one decision tree largely based on rainfall data from the preceding week and categorized each grid cell as low, moderate, or high risk.

This image shows a landslide "nowcast" for Nov. 18, 2020 during the passage of Hurricane Iota through Nicaragua and Honduras.

"In this new version, we have 300 trees of better and better information compared with the first version, which was based on just one decision tree," Stanley said. "Version 2 also incorporates more variables than its predecessor, including soil moisture and snow mass data."

Generally speaking, soil can only absorb so much water before becoming saturated and, combined with other conditions, posing a landslide risk. By incorporating soil moisture data, the model can discern how much water is already present in the soil and how much additional rainfall would push it past that threshold. Likewise, if the model knows the amount of snow present in a given area, it can factor in the additional water entering the soil as the snow melts. This data comes from the Soil Moisture Active Passive (SMAP) satellite, which is managed by NASA's Jet Propulsion Laboratory in Southern California. It launched in 2015 and provides continuous soil moisture coverage.
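The saturation bookkeeping described above amounts to simple accounting of water in and capacity left. A toy illustration (the function, numbers, and thresholds are made up for clarity, not LHASA's actual inputs):

```python
# Illustrative saturation bookkeeping (made-up values, not LHASA's actual
# inputs): soil absorbs only so much water before additional rain tips the
# balance toward landslide risk.

def remaining_capacity_mm(capacity_mm, soil_moisture_mm, snowmelt_mm):
    """Rain (mm) the soil column can still absorb before saturating."""
    return max(capacity_mm - soil_moisture_mm - snowmelt_mm, 0.0)

# With 50 mm of capacity, 35 mm already present in the soil, and 10 mm
# arriving as snowmelt, just 5 mm of rain saturates the column.
print(remaining_capacity_mm(50.0, 35.0, 10.0))  # 5.0
```

This is why soil moisture and snow mass data sharpen the forecast: the same rainfall total poses very different risks depending on how much water the ground already holds.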

LHASA Version 2 also adds a new exposure feature that analyzes the distribution of roads and population in each grid cell to calculate the number of people or infrastructure exposed to landslide hazards. The exposure data is downloadable and has been integrated into the interactive map. Adding this type of information about exposed roads and populations vulnerable to landslides helps improve situational awareness and actions by stakeholders from international organizations to local officials.

Building on years of research and applications, LHASA Version 2 was tested by the NASA Disasters program and stakeholders in real-world situations leading up to its formal release. In November 2020, when hurricanes Eta and Iota struck Central America within a span of two weeks, researchers working with NASA's Earth Applied Sciences Disasters program used LHASA Version 2 to generate maps of predicted landslide hazard for Guatemala and Honduras. The researchers overlaid the model with district-level population data so they could better assess the proximity between potential hazards and densely populated communities. Disasters program coordinators shared the information with national and international emergency response agencies to provide better insight into the hazards to personnel on the ground.

While it is a useful tool for planning and risk mitigation purposes, Stanley says the model is meant to be used with a global perspective in mind rather than as a local emergency warning system for any specific area. However, future research may expand that goal.

"We are working on incorporating a precipitation forecast into LHASA Version 2, and we hope it will provide further information for advanced planning and actions prior to major rainfall events," said Stanley. One challenge, Stanley notes, is obtaining a long enough archive of forecasted precipitation data from which the model can learn.

In the meantime, governments, relief agencies, emergency responders, and other stakeholders (as well as the general public) have access to a powerful risk assessment tool in LHASA Version 2.