Errors in how current global models partition tropical precipitation into convective and large-scale components may impact climate feedback

Heavy precipitation can cause severe economic and ecological damage and loss of life, and both its frequency and intensity have increased under climate change. Accurately modeling and predicting heavy precipitation events is therefore becoming increasingly critical. However, current global climate models (GCMs) struggle to simulate tropical precipitation correctly, particularly heavy rainfall. Atmospheric scientists are working to identify and minimize the model biases that arise in simulating its large-scale and convective components.

"Unrealistic convective and large-scale precipitation components essentially contribute to the biases of simulated precipitation," said Prof. Jing Yang, a faculty member in the Geographical Science Department at Beijing Normal University.

Prof. Yang and her postgraduate student Sicheng He, along with Qing Bao from the Institute of Atmospheric Physics at the Chinese Academy of Sciences, explored the challenges and barriers to achieving realistic rainfall modeling from the perspective of convective and large-scale precipitation.

[Image: Heavy rain in Shenzhen on April 19, 2019 caused extensive flight delays, affecting thousands of passengers. Credit: Sicheng He]

"Although sometimes total rainfall amounts can be simulated well, the convective and large-scale precipitation partitions are incorrect in the models," remarked Yang.

To clarify the status of convective and large-scale precipitation components within current GCMs, the researchers comprehensively classified 16 CMIP6 models, focusing on tropical heavy rainfall. In most models, the results show much more of this rainfall coming from the large-scale component than from the convective component, which is not realistic.

Based on the percentage of large-scale precipitation, the research team divided the models into three distinct groups: (1) a whole mid-to-lower-tropospheric wet bias (60%-80% large-scale rainfall); (2) a mid-tropospheric wet peak (roughly 50% convective and 50% large-scale rainfall); and (3) a lower-tropospheric wet peak (90%-100% large-scale rainfall).
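To make the grouping concrete, the snippet below is a minimal sketch (not the authors' analysis code) of how a model's large-scale rainfall fraction might be computed and mapped onto the three groups; the model names, rainfall values, and exact cut points are hypothetical illustrations.

```python
def large_scale_fraction(pr_large_scale: float, pr_convective: float) -> float:
    """Fraction of total precipitation resolved as large-scale (0-1)."""
    total = pr_large_scale + pr_convective
    return pr_large_scale / total if total > 0 else 0.0

def classify(frac: float) -> str:
    """Map a large-scale rainfall fraction to the study's three groups
    (cut points are illustrative, chosen to match the quoted ranges)."""
    if frac >= 0.9:
        return "Group 3: lower-tropospheric wet peak (90%-100% large-scale)"
    if frac >= 0.6:
        return "Group 1: mid-to-lower-tropospheric wet bias (60%-80% large-scale)"
    return "Group 2: mid-tropospheric wet peak (~50% convective / 50% large-scale)"

# Hypothetical per-model heavy-rainfall partitions (large-scale, convective), mm/day:
models = {"Model-A": (6.0, 2.0), "Model-B": (4.1, 3.9), "Model-C": (7.6, 0.4)}
for name, (large_scale, convective) in models.items():
    f = large_scale_fraction(large_scale, convective)
    print(f"{name}: {f:.0%} large-scale -> {classify(f)}")
```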

These classifications are closely associated with the vertical distribution of moisture and clouds in the tropical atmosphere. Because low and high clouds have different radiative effects, the associated differences in vertical cloud distribution can produce different climate responses, and therefore considerable uncertainty in climate projections.

The study was recently published in Advances in Atmospheric Sciences. "The associated vertical distribution of unique clouds potentially causes different climate feedback, suggesting accurate convective/large-scale rainfall partitions are necessary for reliable climate projection," noted Yang.

Future sparkles for diamond-based quantum technology

Two research breakthroughs are poised to accelerate the development of synthetic diamond-based quantum technology

Marilyn Monroe famously sang that diamonds are a girl's best friend, but they are also very popular with quantum scientists - with two new research breakthroughs poised to accelerate the development of synthetic diamond-based quantum technology, improve scalability, and dramatically reduce manufacturing costs.

While silicon is traditionally used for computer and mobile phone hardware, diamond has unique properties that make it particularly useful as a base for emerging quantum technologies such as quantum supercomputers, secure communications, and sensors.

However, there are two key problems: cost, and the difficulty of fabricating the single-crystal diamond layer, which is thinner than one-millionth of a meter.

A research team from the ARC Centre of Excellence for Transformative Meta-Optics at the University of Technology Sydney (UTS), led by Professor Igor Aharonovich, has just published two research papers, in Nanoscale and Advanced Quantum Technologies, that address these challenges.

[Image: An artist's impression of a diamond building block in a future photonic circuit. Credit: Igor Aharonovich]

"For a diamond to be used in quantum applications, we need to precise engineer 'optical defects' in the diamond devices - cavities and waveguides - to control, manipulate and read out information in the form of qubits - the quantum version of classical computer bits," said Professor Aharonovich.

"It's akin to cutting holes or carving gullies in a super-thin sheet of diamond, to ensure light travels and bounces in the desired direction," he said.

To overcome the "etching" challenge, the researchers developed a new hard masking method, which uses a thin metallic tungsten layer to pattern the diamond nanostructure, enabling the creation of one-dimensional photonic crystal cavities.

"The use of tungsten as a hard mask addresses several drawbacks of diamond fabrication. It acts as a uniform restraining conductive layer to improve the viability of electron beam lithography at nanoscale resolution," said the lead author of a paper in Nanoscale, UTS Ph.D. candidate Blake Regan.

"It also allows the post-fabrication transfer of diamond devices onto the substrate of choice under ambient conditions. And the process can be further automated, to create modular components for diamond-based quantum photonic circuitry," he said.

The tungsten layer is 30 nm thick - thousands of times thinner than a human hair - yet it enabled a diamond etch of over 300 nm, a record selectivity for diamond processing.
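As a quick worked check of that figure (a back-of-the-envelope illustration, not a calculation from the paper): etch selectivity can be taken as the depth etched into the target material per unit of mask consumed, so a 300 nm diamond etch through a 30 nm tungsten mask implies a selectivity of roughly 10:1.

```python
def etch_selectivity(etch_depth_nm: float, mask_thickness_nm: float) -> float:
    """Selectivity = depth etched into the target per unit of mask consumed."""
    return etch_depth_nm / mask_thickness_nm

# Figures reported above: >300 nm diamond etch through a 30 nm tungsten mask.
print(f"selectivity > {etch_selectivity(300, 30):.0f}:1 (diamond : tungsten)")
```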

A further advantage is that removal of the tungsten mask does not require the use of hydrofluoric acid - one of the most dangerous acids currently in use - so this also significantly improves the safety and accessibility of the diamond nanofabrication process.

To address cost and improve scalability, the team further developed an innovative method to grow single-crystal diamond photonic structures with embedded quantum defects from a polycrystalline substrate.

"Our process relies on a lower-cost large polycrystalline diamond, which is available as large wafers, unlike the traditionally used high-quality single crystal diamond, which is limited to a few mm2," said UTS Ph.D. candidate Milad Nonahal, lead author of the study in Advanced Quantum Technologies.

"To the best of our knowledge, we offer the first evidence of the growth of a single crystal diamond structure from a polycrystalline material using a bottom-up approach - like growing flowers from seed," he added.

"Our method eliminates the need for expensive diamond materials and the use of ion implantation, which is key to accelerating the commercialization of diamond quantum hardware," said UTS Dr. Mehran Kianinia, a senior author on the second study.

Astronomers use Subaru Telescope data to chart the expansion history of the universe with supernovae

An international research team analyzed a database of more than 1000 supernova explosions and found that models for the expansion of the Universe best match the data when a new time-dependent variation is introduced. If proven correct with future, higher-quality data from the Subaru Telescope and other observatories, these results could indicate still unknown physics working on the cosmic scale.

Edwin Hubble's observations over 90 years ago showing the expansion of the Universe remain a cornerstone of modern astrophysics. But when it comes to calculating how fast the Universe was expanding at different times in its history, scientists have difficulty getting theoretical models to match observations.

[Image: Schematic representation of the expansion of the Universe over the course of its history. Credit: NAOJ]

To solve this problem, a team led by Maria Dainotti (Assistant Professor at the National Astronomical Observatory of Japan and the Graduate University for Advanced Studies, SOKENDAI in Japan and an affiliated scientist at the Space Science Institute in the U.S.A.) analyzed a catalog of 1048 supernovae which exploded at different times in the history of the Universe. The team found that the theoretical models can be made to match the observations if one of the constants used in the equations, appropriately called the Hubble constant, is allowed to vary with time.
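To illustrate the underlying calculation, here is a minimal sketch in Python, assuming a flat Lambda-CDM background (an illustration of the method's logic, not the team's analysis code): the distance modulus of a supernova at redshift z is mu(z) = 5 log10(d_L(z) / 10 pc), with luminosity distance d_L(z) = (1 + z)(c / H0) * the integral from 0 to z of dz'/E(z'), where E(z) = sqrt(Om (1 + z)^3 + 1 - Om). Re-fitting H0 independently in different redshift bins shows whether the best-fit value drifts with cosmic epoch; all data below are hypothetical placeholders.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

C_KM_S = 299792.458  # speed of light, km/s
OMEGA_M = 0.3        # assumed matter density parameter (illustrative)

def distance_modulus(z: float, h0: float) -> float:
    """mu(z) for flat Lambda-CDM with Hubble constant h0 in km/s/Mpc."""
    integrand = lambda zp: 1.0 / np.sqrt(OMEGA_M * (1 + zp) ** 3 + 1 - OMEGA_M)
    comoving, _ = quad(integrand, 0.0, z)          # dimensionless integral
    d_l_mpc = (1 + z) * (C_KM_S / h0) * comoving   # luminosity distance, Mpc
    return 5 * np.log10(d_l_mpc * 1e6 / 10)        # zero point at 10 pc

def fit_h0(zs, mus, mu_errs):
    """Best-fit H0 for one redshift bin via chi-square minimization."""
    def chi2(h0):
        model = np.array([distance_modulus(z, h0) for z in zs])
        return np.sum(((mus - model) / mu_errs) ** 2)
    return minimize_scalar(chi2, bounds=(50.0, 100.0), method="bounded").x

# Hypothetical low-redshift bin: synthetic "observations" generated at H0 = 70.
zs = np.array([0.05, 0.10, 0.20])
mus = np.array([distance_modulus(z, 70.0) for z in zs])
errs = np.full_like(zs, 0.1)
print(f"best-fit H0 in this bin: {fit_h0(zs, mus, errs):.1f} km/s/Mpc")  # ~70.0
```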

There are several possible explanations for this apparent change in the Hubble constant. A likely but boring possibility is that observational biases exist in the data sample. To help correct for potential biases, astronomers are using Hyper Suprime-Cam on the Subaru Telescope to observe fainter supernovae over a wide area. Data from this instrument will increase the sample of observed supernovae in the early Universe and reduce the uncertainty in the data.

But if the current results hold up under further investigation, and the Hubble constant is in fact changing, that opens the question of what is driving the change. Answering that question could require a new, or at least modified, version of astrophysics.

These results will appear as M.G. Dainotti et al. "On the Hubble Constant Tension in the SNe Ia Pantheon Sample" in the Astrophysical Journal on May 17, 2021.