American Museum of Natural History astrophysicist Shara shines new light on sleeping cataclysmic binaries

The new simulation-based study supports contested 35-year-old predictions, shows that observable novae are just the "tip of the iceberg"

Almost 35 years ago, scientists made the then-radical proposal that the binary star systems powering colossal hydrogen-bomb-like explosions called novae go through a very long-term life cycle after erupting, fading to obscurity for hundreds of thousands of years and then building back up to become full-fledged novae once more. A new study is the first to fully simulate this life cycle, incorporating all of the feedback factors now known to control these systems, backing up the original prediction while bringing new details to light. Published this week, the study confirms that the novae we observe flashing throughout the universe represent just a few percent of these cataclysmic variables, as they are known, with the rest "hiding" in hibernation.

"We've now quantified the suggestion from decades ago that most of these systems are deeply hibernating, waiting to wake up, and we haven't yet identified them," said Michael Shara, a curator in the Department of Astrophysics at the American Museum of Natural History who was the lead author on the original study and is one of the coauthors of the new work. "The novae we observe are just the tip of the iceberg. We've been wrong in thinking that the novalike binaries and dwarf novae that make novae represent everything out there. The systems that make novae are much more common than we've thought."

Cataclysmic binary systems occur when a red dwarf--a star smaller than our Sun--is cannibalized by a white dwarf, a dead star. The white dwarf builds up a critical layer of hydrogen that it steals from the red dwarf, and that hydrogen explodes like a gigantic bomb. The explosion produces a burst of light that makes the white dwarf up to 1 million times brighter than the Sun for anywhere from a few days to a few months.

Shara's original work proposed that, after an eruption, a nova becomes "nova-like," then a dwarf nova, and then, after hibernation as a so-called detached binary, it comes back to being a dwarf nova, nova-like, and then a nova, repeating the cycle over and over again, up to 100,000 times over billions of years. "In the same way that an egg, a caterpillar, a pupa, and a butterfly are all life stages of the same organism, these binaries are all the same objects seen at different phases of their lives," Shara said.

For the new study, Shara and his colleagues at Ariel University and Tel-Aviv University in Israel built a set of supercomputer simulations to follow thousands of nova eruptions and their effects on the red dwarf companions. The goal was to show, quantitatively, that the evolution of cataclysmic binary systems is cyclical and driven by feedback between the two stars.

"There just wasn't the computing power to do this 30 years ago, or 20 years ago, or even 10 years ago," Shara said.

They found that cataclysmic binaries do not simply alternate through each of the four states--nova, nova-like, dwarf nova, and detached binary--for their whole lives. Newborn binaries, during the first few percent of a system's life, alternate only between nova and nova-like states. Then, for roughly the next 10 percent of their lifetimes, the binaries alternate through three states: nova, nova-like, and dwarf nova. For the remaining 90 percent of their lifetimes, they continuously cycle through all four states.
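As a rough illustration (not the authors' actual simulation code), the staged behavior described above can be sketched as a simple lookup from a binary's fractional age to the states it cycles through; the boundary fractions below are loose assumptions based on the figures quoted in this article:

```python
# Illustrative sketch only: maps a cataclysmic binary's fractional age
# (0.0 = birth, 1.0 = end of life) to the states it cycles through,
# using rough stage boundaries taken from the article.

def available_states(life_fraction: float) -> list[str]:
    """Return the states a binary of the given fractional age cycles through."""
    if not 0.0 <= life_fraction <= 1.0:
        raise ValueError("life_fraction must be between 0 and 1")
    if life_fraction < 0.02:      # "first few percent": two states only
        return ["nova", "nova-like"]
    elif life_fraction < 0.12:    # the next ~10 percent: three states
        return ["nova", "nova-like", "dwarf nova"]
    else:                         # the remaining ~90 percent: all four states
        return ["nova", "nova-like", "dwarf nova", "detached binary"]

print(available_states(0.01))   # a newborn, frequently erupting system
print(available_states(0.50))   # an older system that also hibernates
```

The 0.02 and 0.12 cutoffs are placeholders for the "few percent" and "next 10 percent" stages; the study itself derives these transitions from the simulated feedback between the stars.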

Further, the study showed that almost all the novae we observe today occur near the beginning of a binary system's life as opposed to the end--at a rate of about once every 10,000 years rather than once every few million years.

"Statistically, that means that the systems we observe--the ones that are popping off all of the time--are the newborn ones," Shara said. "And that's just about 5 percent of the total binaries out there. The vast majority are in the detached state, and we've been ignoring them because they're so faint and common. We know that they're there. Now we just have to work hard to find them and connect them to novae."
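The statistics Shara describes can be illustrated with a back-of-the-envelope calculation (the specific numbers below are assumptions for illustration, not values from the study): if young systems erupt about once every 10,000 years while older systems erupt once every few million years, the young minority dominates the observed nova rate even though it is a small fraction of the population.

```python
# Back-of-the-envelope illustration with assumed numbers: why observed novae
# are dominated by young systems even if those are a small minority.

young_fraction = 0.05      # assumed share of binaries in the young, active phase
young_rate = 1 / 10_000    # eruptions per year per young system (per the article)
old_rate = 1 / 3_000_000   # "once every few million years" for older systems

# Expected eruption rate contributed by each sub-population
young_contribution = young_fraction * young_rate
old_contribution = (1 - young_fraction) * old_rate

observed_from_young = young_contribution / (young_contribution + old_contribution)
print(f"Fraction of observed novae from young systems: {observed_from_young:.0%}")
```

With these assumed inputs, well over 90 percent of observed eruptions come from the small young population, consistent with the article's claim that the erupting systems we see are "the newborn ones."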

Goddard's new 3D view of methane tracks sources, movement around the globe

NASA’s new 3-dimensional portrait of methane concentrations shows the world’s second-largest contributor to greenhouse warming, the diversity of its sources on the ground, and the behavior of the gas as it moves through the atmosphere. Combining multiple data sets (emissions inventories covering fossil fuels, agriculture, biomass burning, and biofuels, along with simulations of wetland sources) into a high-resolution supercomputer model, researchers now have an additional tool for understanding this complex gas and its role in Earth’s carbon cycle, atmospheric composition, and climate system.

Since the Industrial Revolution, methane concentrations in the atmosphere have more than doubled. After carbon dioxide, methane is the second most influential greenhouse gas, responsible for 20 to 30% of Earth’s rising temperatures to date.

“There’s an urgency in understanding where the sources are coming from so that we can be better prepared to mitigate methane emissions where there are opportunities to do so,” said research scientist Ben Poulter at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

A single molecule of methane is more efficient at trapping heat than a molecule of carbon dioxide, but because methane’s lifetime in the atmosphere is shorter and carbon dioxide concentrations are much higher, carbon dioxide still remains the main contributor to climate change. Methane also has many more types of sources than carbon dioxide; these include the energy and agricultural sectors, as well as natural sources such as various types of wetlands and water bodies.

“Methane is a gas that’s produced under anaerobic conditions, so that means when there’s no oxygen available, you’ll likely find methane being produced,” said Poulter. In addition to fossil fuel activities, primarily from the coal, oil and gas sectors, sources of methane also include the ocean, flooded soils in vegetated wetlands along rivers and lakes, agriculture, such as rice cultivation, and the stomachs of ruminant livestock, including cattle.

“It is estimated that up to 60% of the current methane flux from land to the atmosphere is the result of human activities,” said Abhishek Chatterjee, a carbon cycle scientist with Universities Space Research Association based at Goddard. “Similar to carbon dioxide, human activity over long time periods is increasing atmospheric methane concentrations faster than the removal from natural ‘sinks’ can offset it. As human populations continue to grow, changes in energy use, agriculture, rice cultivation, and livestock raising will influence methane emissions. However, it’s difficult to predict future trends due to both a lack of measurements and an incomplete understanding of the carbon-climate feedbacks.”


Researchers are using supercomputer models to try to build a more complete picture of methane, said research meteorologist Lesley Ott with the Global Modeling and Assimilation Office at Goddard. “We have pieces that tell us about the emissions, we have pieces that tell us something about the atmospheric concentrations, and the models are basically the missing piece tying all that together and helping us understand where the methane is coming from and where it’s going.”

To create a global picture of methane, Ott, Chatterjee, Poulter, and their colleagues used methane data from emissions inventories reported by countries, NASA field campaigns like the Arctic Boreal Vulnerability Experiment (ABoVE), and observations from the Japanese Space Agency’s Greenhouse Gases Observing Satellite (GOSAT “Ibuki”) and the Tropospheric Monitoring Instrument aboard the European Space Agency’s Sentinel-5P satellite. They combined the data sets with a supercomputer model that estimates methane emissions based on known processes for certain land-cover types, such as wetlands. The model also simulates the atmospheric chemistry that breaks down methane and removes it from the air. Then they used a weather model to see how methane traveled and behaved over time while in the atmosphere.
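The balance between emissions and chemical destruction that the model simulates can be illustrated with a simple one-box budget (the emission total and atmospheric lifetime below are rough round numbers assumed for illustration, not NASA model values): the atmospheric methane burden grows with emissions and decays with a characteristic lifetime set mainly by reaction with OH radicals.

```python
# Illustrative one-box methane budget with assumed round numbers.
# dC/dt = E - C / tau : emissions add methane; chemical loss (mainly OH
# oxidation) removes it with a characteristic lifetime tau.

emissions = 570.0   # Tg CH4 per year, assumed global total
lifetime = 9.0      # years, assumed atmospheric lifetime

burden = 0.0        # Tg CH4 in the atmosphere, starting from zero for illustration
dt = 0.1            # years per integration step

for _ in range(int(100 / dt)):   # integrate forward for 100 years
    burden += dt * (emissions - burden / lifetime)

equilibrium = emissions * lifetime   # analytic steady state of the box model
print(f"Burden after 100 yr: {burden:.0f} Tg (steady state {equilibrium:.0f} Tg)")
```

The point of the sketch is the structure, not the numbers: emissions and the chemical sink set a steady-state burden, which is why the researchers needed both the inventory/wetland emission estimates and the simulated atmospheric chemistry before the weather model could transport the gas.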

The data visualization of their results shows methane’s ethereal movements and illuminates its complexities both in space over various landscapes and with the seasons. Once methane emissions are lofted up into the atmosphere, high-altitude winds can transport the gas far beyond its sources.

When they first saw the data visualized, several locations stood out.

In South America, the Amazon River basin and its adjacent wetlands flood seasonally, creating an oxygen-deprived environment that is a significant source of methane. Globally, about 60% of methane emissions come from the tropics, so it’s important to understand the various human and natural sources, said Poulter.

Over Europe, the methane signal is not as strong as over the Amazon. European methane sources are influenced by the human population and the exploration and transport of oil, gas, and coal from the energy sector.

In India, rice cultivation and livestock are the two driving sources of methane. “Agriculture is responsible for about 20% of global methane emissions and includes enteric fermentation, which is the processing of food in the guts of cattle, mainly, but also includes how we manage the waste products that come from livestock and other agricultural activities,” said Poulter.

China’s economic expansion and large population drive high demand for oil, gas, and coal for industry, as well as agricultural production; these are its underlying sources of methane.

The Arctic and high-latitude regions are responsible for about 20% of global methane emissions. “What happens in the Arctic doesn’t always stay in the Arctic,” Ott said. “There’s a massive amount of carbon that’s stored in the northern high latitudes. One of the things scientists are really concerned about is whether or not, as the soils warm, more of that carbon could be released to the atmosphere. Right now, what you’re seeing in this visualization is not very strong pulses of methane, but we’re watching that very closely because we know that’s a place that is changing rapidly and that could change dramatically over time.”

“One of the challenges with understanding the global methane budget has been to reconcile the atmospheric perspective on where we think methane is being produced versus the bottom-up perspective, or how we use country-level reporting or land surface models to estimate methane emissions,” said Poulter. “The visualization that we have here can help us understand this top-down and bottom-up discrepancy and help us also reduce the uncertainties in our understanding of the global methane budget by giving us visual cues and a qualitative understanding of how methane moves around the atmosphere and where it’s produced.”

The model data of methane sources and transport will also help in the planning of both future field and satellite missions. Currently, NASA has a planned satellite called GeoCarb that will launch around 2023 to provide geostationary space-based observations of methane in the atmosphere over much of the western hemisphere.

 

Intel chips are still vulnerable to attacks

Computer scientists at KU Leuven have once again exposed a security flaw in Intel processors. Jo Van Bulck, Frank Piessens, and their colleagues in Austria, the United States, and Australia gave the manufacturer one year to fix the problem.

Plundervolt, Zombieload, Foreshadow: in the past couple of years, Intel has had to issue quite a few patches for vulnerabilities that computer scientists at KU Leuven have helped to expose. "All measures that Intel has taken so far to boost the security of its processors have been necessary, but they were not enough to ward off our new attack," says Jo Van Bulck from the Department of Computer Science at KU Leuven.

Like the previous attacks, the new technique - dubbed Load Value Injection - targets the 'vault' of computer systems with Intel processors: SGX enclaves (see below).

"To a certain extent, this attack picks up where our Foreshadow attack of 2018 left off. A particularly dangerous version of this attack exploited the vulnerability of SGX enclaves, so that the victim's passwords, medical information, or other sensitive information was leaked to the attacker. Load Value Injection uses that same vulnerability, but in the opposite direction: the attacker's data are smuggled - 'injected' - into a software program that the victim is running on their computer. Once that is done, the attacker can take over the entire program and acquire sensitive information, such as the victim's fingerprints or passwords."

The vulnerability was discovered as early as 4 April 2019. Nevertheless, the researchers and Intel agreed to keep it a secret for almost a year. Responsible disclosure embargoes are not unusual in cybersecurity, although they are usually lifted sooner. "We wanted to give Intel enough time to fix the problem. In certain scenarios, the vulnerability we exposed is very dangerous and extremely difficult to deal with because, this time, the problem did not just pertain to the hardware: the solution also had to take software into account. Therefore, hardware updates like the ones issued to resolve the previous flaws were no longer enough. This is why we agreed upon an exceptionally long embargo period with the manufacturer."

"Intel ended up taking extensive measures that force the developers of SGX enclave software to update their applications. However, Intel notified them in time. End users of the software have nothing to worry about: they only need to install the recommended updates."

"Our findings show, however, that the measures taken by Intel make SGX enclave software 2 to as much as 19 times slower."

What are SGX enclaves?

Computer systems are made up of different layers, making them very complex. Every layer also contains millions of lines of computer code. As this code is still written manually, the risk of errors is significant. If such an error occurs, the entire computer system is left vulnerable to attacks. You can compare it to a skyscraper: if one of the floors becomes damaged, the entire building might collapse.

Viruses exploit such errors to gain access to sensitive or personal information on the computer, from holiday pictures and passwords to business secrets. To protect its processors against this kind of intrusion, Intel introduced an innovative technology in 2015: Intel Software Guard eXtensions (Intel SGX). This technology creates isolated environments in the computer's memory, so-called enclaves, where data and programs can be used securely.

"If you look at a computer system as a skyscraper, the enclaves form a vault", researcher Jo Van Bulck explains. "Even when the building collapses the vault should still guard its secrets - including passwords or medical data."

The technology seemed watertight until August 2018, when researchers at KU Leuven discovered a breach. Their attack was dubbed Foreshadow. In 2019, the Plundervolt attack revealed another vulnerability. Intel has released updates to resolve both flaws.