Dark Energy Survey releases most precise look at the Universe's evolution

The first three years of survey data draw on observations of 226 million galaxies over one-eighth of the sky

In 29 new academic papers, the Dark Energy Survey examines the largest-ever maps of galaxy distribution and shapes, extending more than 7 billion light-years across the Universe. The extraordinarily precise analysis, which includes data from the survey's first three years, contributes to the most powerful test of the current best model of the Universe, the standard cosmological model. However, hints remain from earlier DES data and other experiments that the matter in the Universe today is a few percent less clumpy than predicted.

New results from the Dark Energy Survey (DES) use the largest-ever sample of galaxies observed over nearly one-eighth of the sky to produce the most precise measurements to date of the Universe's composition and growth.

[Image: The Dark Energy Camera (DECam) at the SiDet clean room. The camera was designed specifically for the Dark Energy Survey, funded by the Department of Energy (DOE), and built and tested at DOE's Fermilab. Credit: DOE/FNAL/DECam/R. Hahn/CTIO/NOIRLab/NSF/AURA]

DES imaged the night sky using the 570-megapixel Dark Energy Camera on the National Science Foundation's Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory (CTIO) in Chile, a Program of NSF's NOIRLab. One of the most powerful digital cameras in the world, the Dark Energy Camera was designed specifically for DES. It was funded by the Department of Energy (DOE) and was built and tested at DOE's Fermilab.

Over the course of six years, from 2013 to 2019, DES used 30% of the time on the Blanco Telescope and surveyed 5000 square degrees -- almost one-eighth of the entire sky -- in 758 nights of observation, cataloging hundreds of millions of objects. The results announced today draw on data from the first three years -- 226 million galaxies observed over 345 nights -- to create the largest and most precise maps yet of the distribution of galaxies in the Universe at relatively recent epochs. The DES data were processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

"NOIRLab is a proud host for and member of the DES collaboration," said Steve Heathcote, CTIO Associate Director. "Both during and after the survey, the Dark Energy Camera has been a popular choice for community and Chilean astronomers."

At present, the Dark Energy Camera is used for programs covering a huge range of science including cosmology. The Dark Energy Camera science archive, including DES Data Release 2 on which these results are based, is curated by the Community Science and Data Center (CSDC), a Program of NSF's NOIRLab. CSDC provides software systems, user services, and development initiatives to connect and support the scientific missions of NOIRLab's telescopes, including the Blanco telescope at CTIO.

Since DES studied nearby galaxies as well as those billions of light-years away, its maps provide both a snapshot of the current large-scale structure of the Universe and a view of how that structure has evolved over the past 7 billion years. Ten areas in the sky were selected as "deep fields" that the Dark Energy Camera imaged several times during the survey, providing a glimpse of distant galaxies and helping determine their 3D distribution in the cosmos.

[Image: A Dark Energy Camera deep-field view teeming with galaxies; nearly every object in the image is a galaxy, apart from a couple of dozen asteroids and a few handfuls of foreground stars in our own Milky Way. Credit: Dark Energy Survey/DOE/FNAL/DECam/CTIO/NOIRLab/NSF/AURA. Acknowledgments: T.A. Rector (University of Alaska Anchorage/NSF's NOIRLab), M. Zamani (NSF's NOIRLab) & D. de Martin (NSF's NOIRLab)]

Ordinary matter makes up only about 5% of the Universe. Dark energy, which cosmologists hypothesize drives the accelerating expansion of the Universe by counteracting the force of gravity, accounts for about 70%. The remaining 25% is dark matter, whose gravitational influence binds galaxies together. Both dark matter and dark energy remain invisible. DES seeks to illuminate their nature by studying how the competition between them shapes the large-scale structure of the Universe over cosmic time.

To quantify the distribution of dark matter and the effect of dark energy, DES relied mainly on two phenomena. First, on large scales galaxies are not distributed randomly throughout space but rather form a weblike structure that is due to the gravity of the dark matter. DES measured how this cosmic web has evolved over the history of the Universe. The galaxy clustering that forms the cosmic web, in turn, revealed regions with a higher density of dark matter.
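To illustrate the kind of statistic involved, here is a minimal sketch of an angular two-point correlation function using the standard Landy-Szalay estimator. It is a flat-sky toy with brute-force pair counting, not the DES pipeline (which uses optimized correlation codes and curved-sky geometry); all positions and sample sizes below are invented for illustration.

```python
import numpy as np

def pair_counts(a, b, bins):
    # Pairwise separations in a flat-sky approximation; fine for a toy
    # sample, while real surveys use tree-based correlation codes.
    d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1))
    return np.histogram(d.ravel(), bins=bins)[0]

def landy_szalay(data, randoms, bins):
    # w(theta) = (DD - 2DR + RR) / RR, with pair counts normalized by
    # the number of ordered pairs in each catalog combination.
    nd, nr = len(data), len(randoms)
    dd = pair_counts(data, data, bins) / (nd * (nd - 1))
    rr = pair_counts(randoms, randoms, bins) / (nr * (nr - 1))
    dr = pair_counts(data, randoms, bins) / (nd * nr)
    return (dd - 2 * dr + rr) / rr

rng = np.random.default_rng(1)
data = rng.uniform(0, 10, size=(500, 2))      # toy "galaxy" positions (degrees)
randoms = rng.uniform(0, 10, size=(2000, 2))  # unclustered comparison catalog
bins = np.linspace(0.1, 2.0, 11)
print(landy_szalay(data, randoms, bins))      # ~0 here: the toy is unclustered
```

A clustered catalog would yield positive values of w(theta) on small scales, which is the excess probability of finding galaxy pairs relative to a random distribution.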

Second, DES detected the signature of dark matter through weak gravitational lensing. As light from a distant galaxy travels through space, the gravity of both ordinary and dark matter in the foreground can bend its path, as if through a lens, resulting in a distorted image of the galaxy as seen from Earth. By studying how the apparent shapes of distant galaxies are aligned with each other and with the positions of nearby galaxies along the line of sight, DES scientists were able to infer the clumpiness of the dark matter in the Universe.
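The lensing measurement can be illustrated in a similarly simplified way: average the tangential component of background-galaxy ellipticities around foreground positions. The sketch below is a flat-sky toy with pure shape noise and no injected lensing signal, so the expected output is near zero; it shows only the geometry of the statistic, not DES's shear calibration or tomographic analysis, and every number in it is invented.

```python
import numpy as np

def mean_tangential_shear(lenses, sources, e1, e2, r_max=1.0):
    # Average tangential ellipticity of sources around each lens:
    # e_t = -(e1*cos(2*phi) + e2*sin(2*phi)), phi = position angle.
    signals = []
    for lx, ly in lenses:
        dx, dy = sources[:, 0] - lx, sources[:, 1] - ly
        r = np.hypot(dx, dy)
        sel = (r > 0) & (r < r_max)
        phi = np.arctan2(dy[sel], dx[sel])
        signals.append(np.mean(-(e1[sel] * np.cos(2 * phi)
                                 + e2[sel] * np.sin(2 * phi))))
    return np.mean(signals)

rng = np.random.default_rng(2)
lenses = rng.uniform(0, 5, size=(50, 2))      # foreground positions (degrees)
sources = rng.uniform(0, 5, size=(5000, 2))   # background positions (degrees)
e1, e2 = rng.normal(0, 0.26, size=(2, 5000))  # shape noise only, no lensing
print(mean_tangential_shear(lenses, sources, e1, e2))  # ~0 in this toy
```

With a real lensing signal present, the mean tangential ellipticity would rise above zero close to the lenses, tracing the matter (including dark matter) around them.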

To test cosmologists' current model of the Universe, DES scientists compared their results with measurements from the European Space Agency's orbiting Planck observatory. Planck used light known as the cosmic microwave background to peer back to the early Universe, just 400,000 years after the Big Bang. The Planck data give a precise view of the Universe 13 billion years ago, and the standard cosmological model predicts how the dark matter should evolve to the present.

Combined with earlier results, DES provides the most powerful test of the current best model of the Universe to date, and the results are consistent with the predictions of the standard model of cosmology. However, hints remain from DES and several previous galaxy surveys that the Universe today is a few percent less clumpy than predicted.

Ten regions of the sky were chosen as "deep fields" that the Dark Energy Camera imaged repeatedly throughout the survey. Stacking those images together allowed the scientists to glimpse more distant galaxies. The team then used the redshift information from the deep fields to calibrate the rest of the survey region. This and other advancements in measurements and modeling, coupled with a threefold increase in data compared to the first year, enabled the team to pin down the density and clumpiness of the Universe with unprecedented precision.
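A rough sketch of the calibration idea: galaxies in the deep fields have well-measured colors and reliable redshifts, which can be transferred to wide-survey galaxies by matching in color space. The nearest-neighbor transfer below is a crude stand-in for the cell-based schemes used in practice, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical deep-field calibration sample: well-measured colors plus
# reliable redshifts (stand-ins for the repeatedly imaged deep fields).
deep_colors = rng.normal(0, 1, size=(1000, 3))
deep_z = np.clip(0.8 + 0.3 * deep_colors[:, 0]
                 + rng.normal(0, 0.05, 1000), 0, None)

# Wide-survey galaxies: colors only, no redshifts.
wide_colors = rng.normal(0, 1, size=(2000, 3))

# Transfer redshifts by nearest neighbor in color space -- a crude
# stand-in for the cell-based calibration used in real analyses.
d2 = ((wide_colors[:, None, :] - deep_colors[None, :, :]) ** 2).sum(axis=-1)
est_z = deep_z[d2.argmin(axis=1)]
print("estimated mean redshift of wide sample:", round(float(est_z.mean()), 3))
```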

DES concluded its observations of the night sky in 2019. With the experience gained from analyzing the first half of the data, the team is now prepared to handle the complete dataset. The final DES analysis is expected to paint an even more precise picture of the dark matter and dark energy in the Universe.

The DES collaboration consists of over 400 scientists from 25 institutions in seven countries.

"The collaboration is remarkably young. It's tilted strongly in the direction of postdocs and graduate students who are doing a huge amount of this work," said DES Director and spokesperson Rich Kron, who is a Fermilab and University of Chicago scientist. "That's really gratifying. A new generation of cosmologists is being trained using the Dark Energy Survey."

The methods developed by the team have paved the way for future sky surveys such as the Rubin Observatory Legacy Survey of Space and Time. "DES shows that the era of big survey data has well and truly begun," notes Chris Davis, NSF's Program Director for NOIRLab. "DES on NSF's Blanco telescope has set the scene for the remarkable discoveries to come with Rubin Observatory over the coming decade."

Spanish researchers study one of the largest databases of neuronal types

An international collaboration between the Institute Cajal in Madrid, Spain, and George Mason University in Virginia, USA maps critical measurements of activity in vivo to more than 120 types of neurons from the brain region responsible for autobiographical memory

The study, which is published in the journal PLOS Biology, represents the most comprehensive mapping performed to date between the neural activity recorded in vivo and identified neuron types. This major breakthrough may enable biologically meaningful computer modeling of the full neuronal circuit of the hippocampus, a region of the brain involved in memory function.

Circuits of the mammalian cerebral cortex are made up of two types of neurons: excitatory neurons, which release a neurotransmitter called glutamate, and inhibitory neurons, which release GABA (gamma-aminobutyric acid), the main inhibitory neurotransmitter of the central nervous system. "A balanced dialogue between the 'excitatory' and 'inhibitory' activities is critical for brain function. Identifying the contribution from the several types of excitatory and inhibitory cells is essential to better understand brain operation," explains Liset Menendez de la Prida, the Director of the Laboratorio de Circuitos Neuronales at the Institute Cajal, who leads the study at the CSIC.

[Image: Inhibitory neuron (white) recorded and labeled in vivo, together with other inhibitory cell types (blue and yellow). Credit: Elena Cid, Instituto Cajal (CSIC)]

In the case of the hippocampus, a brain region involved in memory function, there are 39 known types of excitatory principal cells and 85 types of inhibitory neurons. The activity patterns of these cell types are highly specific. All this information is now compiled in Hippocampome.org, a database created five years ago by the Center for Neural Informatics at George Mason University. This database integrates all current knowledge about the morphology, biophysics, genetic identity, connectivity, and firing patterns of more than 120 types of neurons identified in the rodent hippocampus.

This upgrade, made possible by the careful collection, identification, and classification of neurons at the Institute Cajal, will allow the annotation and classification of high-density brain recordings, which is critical for brain-machine interfaces. "Much of our knowledge about nerve cells to date comes from laboratory preparations that separate tissue sections of interest from the rest of the brain," says Giorgio Ascoli, a George Mason University Professor who directs the Center for Neural Informatics. "This new linkage to activity recorded in live animals is a game-changer towards real-scale computer models of brain and memory functions," adds Ascoli.

Novel computational models and machine learning applications

New information provided by Hippocampome.org may have an impact on the development of more realistic predictive models that consider neural diversity as a source of information. The results of the work will help to decode brain signals associated with complex cognitive processes for which the information of single-cell activity is essential.

This is the case for the hippocampus, which builds a neural representation of sequential experiences that is later reactivated in a very specific way for encoding, storing, and retrieving memories. In order to better understand this code, we need to decompose mixed neuronal representations. The additional data included in Hippocampome.org may now provide the labels needed to begin deconstructing the code using modern tools from artificial intelligence.
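As a toy illustration of such label transfer, the sketch below assigns a recorded unit to the nearest catalogued type in a small feature space of in vivo firing properties. The feature table is entirely hypothetical; real entries would come from Hippocampome.org, and a real classifier would handle circular phase variables and many more dimensions.

```python
import numpy as np

# Hypothetical feature table: each catalogued type summarized by in vivo
# firing rate (Hz), burst index, and theta-phase preference (radians).
# Real values would come from Hippocampome.org; these are illustrative.
type_features = {
    "pyramidal_CA1": (1.5, 0.40, 3.0),
    "PV_basket":     (21.0, 0.10, 5.5),
    "OLM":           (9.0, 0.20, 2.0),
}

def classify_unit(rate, burst, phase):
    # Assign a recorded unit to the nearest catalogued type in z-scored
    # feature space (phase treated as linear here for simplicity).
    names = list(type_features)
    feats = np.array([type_features[n] for n in names])
    mu, sd = feats.mean(axis=0), feats.std(axis=0)
    x = (np.array([rate, burst, phase]) - mu) / sd
    dists = np.linalg.norm((feats - mu) / sd - x, axis=1)
    return names[int(np.argmin(dists))]

print(classify_unit(rate=18.0, burst=0.12, phase=5.2))  # -> PV_basket
```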

UK scientists call for international investment to tackle major wheat losses

Urgent investment in new tools is needed to address major global losses of wheat crops, which cost £22 billion per year.

Leading scientific experts are calling for governments around the world to come together and fund a new international research platform, to reduce the impact of major wheat pathogens and improve global food security.

The John Innes Centre is calling for an internationally coordinated approach to deliver a new 'R-Gene Atlas', which would help identify genes conferring disease resistance that could be bred into commercial wheat varieties.

Globally, we lose one-fifth of the projected wheat yield annually to pests and pathogens, totaling losses of 209 million tonnes, worth £22 billion ($31 billion). The climate emergency has the capacity to bring further disruption to global food supplies, as a changing environment brings new types of pests and diseases and increases their spread.

[Image: Wheat Atlas - investment call for genomic tools. Credit: John Innes Centre]

To minimize these losses, and to reduce reliance on chemical controls, the team calls for broader use of the disease resistance found in the genomes of wheat and its wild relatives. The aim is to provide long-lasting molecular protection against wheat's major pathogens, including wheat rusts, blotch diseases, powdery mildew, and wheat blast.

In 2016, global trade brought the wheat blast fungus, previously confined largely to South America, to Bangladesh, where it destroyed 15,000 hectares of wheat, causing yield losses of 25-30% and threatening wheat production across South Asia.

Wheat R genes work by recognizing corresponding molecules in the pathogen, called effectors. By identifying the effectors present in pathogen and pest populations, more durable combinations, or "stacks," of R genes could be designed.

The R-Gene Atlas will be a free online portal containing this genetic information, enabling breeders to design gene stacks using computer modeling before starting their breeding in the field.
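A minimal sketch of what such in silico stack design could look like: given a hypothetical map of which effectors each R gene recognizes and which effectors each pathogen isolate carries, enumerate the gene combinations that cover the whole population. All gene, effector, and isolate names below are invented.

```python
from itertools import combinations

# Hypothetical recognition map: which effectors each R gene detects.
recognizes = {
    "R1": {"AvrA"},
    "R2": {"AvrB", "AvrC"},
    "R3": {"AvrC"},
}
# Hypothetical pathogen population: effectors carried by each isolate.
isolates = {
    "iso1": {"AvrA", "AvrC"},
    "iso2": {"AvrB"},
    "iso3": {"AvrA", "AvrB", "AvrC"},
}

def stack_covers(stack):
    # A stack protects against an isolate if at least one of its R genes
    # recognizes at least one effector carried by that isolate.
    detected = set().union(*(recognizes[r] for r in stack))
    return all(effectors & detected for effectors in isolates.values())

# Keep the two-gene stacks that cover every isolate in the population.
good = [s for s in combinations(recognizes, 2) if stack_covers(s)]
print(good)  # [('R1', 'R2'), ('R2', 'R3')]
```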

It will also enable users to design molecular markers to find out which resistance genes are already present in their breeding programs or wheat populations.
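The marker lookup itself can be as simple as checking diagnostic alleles, as in this sketch (all marker names and alleles are invented):

```python
# Hypothetical diagnostic markers: each R gene tagged by a marker allele.
marker_for = {"R1": ("M101", "A"), "R2": ("M217", "T"), "R3": ("M430", "G")}

# Observed marker genotype of one breeding line (marker -> allele).
line = {"M101": "A", "M217": "C", "M430": "G"}

# Report which catalogued resistance genes the line appears to carry.
present = [gene for gene, (marker, allele) in marker_for.items()
           if line.get(marker) == allele]
print(present)  # ['R1', 'R3']
```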

The idea builds upon the recent surge in genomic resources available to researchers in wheat, facilitated by advancements in sequencing technologies and bioinformatics. In the past few years, researchers at the John Innes Centre and The Sainsbury Laboratory have rapidly identified and cloned resistance genes in wheat and its wild relatives using technologies such as AgRenSeq, MutRenSeq, and MutChromSeq.

The new proposal details how the molecular components involved in disease resistance - R genes and effectors - could be captured from both the host and pathogen. Whole-genome sequencing would be carried out on diversity panels of wheat, its progenitors, and domesticated and wild relatives.

Association genetics, a method of seeking useful genetic variation, could then be used to look for correlations between host genotype and disease resistance or susceptibility, and the genes responsible for these traits could be identified. The researchers calculate it would cost around £41 million ($58.6 million) to establish the new platform at the required scale. Costed, detailed proposals for the R-Gene Atlas are set out in a new article in Molecular Plant.
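As a schematic of the association step, the toy scan below regresses a simulated resistance score on each marker of a simulated diversity panel and reports the strongest association. Real analyses additionally correct for population structure and relatedness; every number here is invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_lines, n_markers = 200, 500

# Hypothetical diversity panel: allele dosages (0/1/2) at each marker
# and a disease-resistance score per wheat line.
genotypes = rng.integers(0, 3, size=(n_lines, n_markers))
causal = 42  # the one marker truly associated with resistance in this toy
resistance = genotypes[:, causal] * 0.8 + rng.normal(0, 1, n_lines)

# Single-marker scan: regress the trait on each marker and record the
# p-value, then report the strongest hit.
pvals = np.array([stats.linregress(genotypes[:, m], resistance).pvalue
                  for m in range(n_markers)])
print("strongest association at marker", int(pvals.argmin()))  # typically 42
```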

This would include sequencing diversity panels of the pathogens and 10 host species of wheat, as well as funding 75 scientists across the world to carry out the work.

This, they suggest, could be funded by contributions of £2 million ($2.9 million) per G20 country spread over five years - a minor investment considering the current financial losses across the world to wheat diseases. This extensively collaborative funding model would spread the risk on a project which would have a global reward.

"Compared to the scale of the problem in yield losses to pests and pathogens, this represents excellent value for money," says first author Amber Hafeez. "It is unsustainable to continue feeding 20 percent of our wheat production to pathogens. Our enterprise applies cutting edge science to a global challenge that is increasing due to the climate emergency."

The proposal involves bringing together an international consortium to allow the project to draw upon existing expertise and resources.

"A lot of the pieces of the puzzle already exist, the idea is to bring them together to make sure we don't duplicate efforts," says Dr. Brande Wulff, corresponding author of the article. "We see it as a centrally coordinated model distributed around different countries, using existing capacity.

"Current projections suggest there will be 2.1 billion more people to feed by 2050 and developing disease-resistant crops will be a key part of sustainably feeding us all. We're determined to develop new ways to increase our genetic understanding and deploy it for the benefit of sustainable agriculture, but we cannot do this without investment."

"We are urging the G20 governments to invest in the consortium, which will bring disease resistance genes from the lab to field at a scale and speed needed to deal with the current crisis."

The idea was trialed earlier this year and drew an enthusiastic response from the international wheat research community. Amber Hafeez said: "We have been delighted with the initial enthusiastic response to our proposals - many research groups and collaborators have welcomed the idea and we feel this confirms our belief that the time is right for this proposal."