Nevada's Ren, Hanan build models predicting future wildfire frequency, size, intensity

Researchers working to predict how wildfire hazard will change over the next several decades have some good news and some bad news. The good news: wildfire occurrence and intensity will likely decrease in several locations in the future. The bad news: those decreases may not arrive for another 50 years, and wildfire hazard will likely get worse before it gets better.

A photo of the Big Wood River Basin, adjacent to the study site. Photo by Erin Hanan.

“There are so many factors that we need to consider and better understand if we want to predict how the frequency, size, and intensity of wildfires will change over time,” said Erin Hanan, a University of Nevada, Reno researcher with the University’s Experiment Station and an assistant professor in the College of Agriculture, Biotechnology & Natural Resources. “Our two studies looked at how changes in temperature, rainfall, and atmospheric carbon dioxide may interact with and influence plant growth, turnover, and decomposition, and how those processes, in turn, affect fuel loading and fuel moisture in different plant communities, which are two key factors driving wildfire regimes in the West.”

Making the case for more detailed research on plant decomposition

Hanan is a coauthor of the two related journal articles about the research. She is the lead author of the first, published in the Journal of Advances in Modeling Earth Systems, which focuses on how plants decompose, or break down (think composting), under different climate scenarios, thereby affecting the fuel load, or amount of litter on the ground that can burn.

“Many of the decomposition algorithms in models that have come from small experiments and specific locations just aren’t going to be accurate all the time,” Hanan said. “Accumulation of fine fuels and the rate at which those fuels or plant parts break down is highly sensitive to several factors, such as temperature and rainfall. That’s what this research verified. So, unless we get better at estimating fuel load, or accumulation, and decomposition of fine fuels under different climate scenarios, it is going to be very difficult to build accurate models predicting future wildfire regimes.”
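
To see why those sensitivities matter, consider a minimal sketch of a first-order litter-decay model. This is a generic textbook formulation, not the specific algorithms evaluated in the study, and every parameter value below is invented for illustration: a base decay rate is scaled by a temperature factor (Q10) and a moisture factor.

    def litter_decay_rate(k_base, temp_c, moisture_frac, q10=2.0, t_ref=20.0):
        """Toy first-order fine-fuel decomposition rate (per year).

        k_base: decay rate at the reference temperature with ample moisture
        temp_c: mean temperature in deg C; q10: rate multiplier per 10 C warming
        moisture_frac: 0-1 factor that slows decomposition when litter is dry
        """
        temp_factor = q10 ** ((temp_c - t_ref) / 10.0)
        return k_base * temp_factor * moisture_frac

    def step_fuel_load(load, litterfall, k):
        """One-year fuel budget: new litter in, first-order decay out."""
        return load + litterfall - k * load

    # The same site, 4 C warmer, loses litter roughly a third faster.
    print(litter_decay_rate(0.3, 20.0, 0.8))   # ~0.24 per year
    print(litter_decay_rate(0.3, 24.0, 0.8))   # ~0.32 per year

Because the temperature and moisture factors multiply, modest errors in either sensitivity compound quickly, which is one reason a decomposition algorithm calibrated at a single site can misestimate fuel accumulation elsewhere.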

A case study for semi-arid watersheds in central Idaho leads to the good news-bad news forecast


Armed with this information, Jianning Ren, a postdoctoral scholar in Hanan’s lab group, set out to examine different future climate scenarios for semi-arid watersheds, more accurately accounting for the various ways that higher temperatures, changes in moisture, and increasing atmospheric CO2 can influence fuel load, fuel moisture, and wildfire regimes.

Ren, Hanan, and other researchers integrated climate data from complex General Circulation Models with data from a representative semi-arid watershed in central Idaho, Trail Creek, which is characterized by cold, wet winters and warm, dry summers. Elevations in the watershed range from 1,760 to 3,478 meters, creating several different plant communities – grasses, shrubs, forests, mixed vegetation, and areas with little vegetation at all.

The article detailing the research, published in Earth’s Future with Ren as lead author, contains detailed graphs modeling probable fire-regime outcomes for various plant communities. Those outcomes were highly influenced by four interacting effects:

  • increased plant growth, and thus fuel loading, resulting from increased atmospheric carbon dioxide (plants take in carbon dioxide and convert it into the sugars that fuel growth)
  • decreased plant growth, and thus fuel loading, due to climatic warming (plants struggle to grow when the environment becomes too arid)
  • increased plant decomposition rates, which decrease fuel loading, also due to climatic warming (plant materials break down more quickly with heat)
  • drier fuels, or plants, due to increased temperatures

“We found that these effects can sometimes work together to create a synergistic effect, or they can counteract each other, based on different scenarios,” Ren said. “In a nutshell, our models project:

  • In the 2040s, elevated CO2 promotes a net increase in plant growth despite the possible decreases that can occur with warming and associated drought, so the result is an increased fuel load and increased fire hazard.
  • By the 2070s, climatic warming and drying become so intense that they outstrip the elevated CO2 levels, in effect shutting down CO2’s ability to increase plant growth. So burned area and burn probability decrease in the models for the 2070s. And although fire weather becomes more severe in that period, decreases in fuel loading, caused by faster decomposition and lower plant productivity, ultimately reduce wildfire.”

“While decreasing wildfire hazard is potentially good news, this decrease results from ecological and hydrological degradation,” Hanan said.
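
A back-of-the-envelope sketch shows how such a flip can happen. The numbers below are invented for illustration and are not taken from the study’s models, which simulate these processes in far more detail:

    def net_fuel_change(npp_base, co2_boost, warming_penalty, k_decay, fuel_load):
        """Annual change in fine-fuel load (arbitrary units).

        Litter input is baseline productivity, boosted by CO2 fertilization and
        reduced by warming/drying stress; losses are first-order decomposition.
        """
        litterfall = npp_base * (1 + co2_boost) * (1 - warming_penalty)
        return litterfall - k_decay * fuel_load

    # 2040s-like case: the CO2 boost dominates, so fuels (and hazard) build up.
    print(net_fuel_change(1.0, co2_boost=0.25, warming_penalty=0.10,
                          k_decay=0.30, fuel_load=3.0))   # about +0.23
    # 2070s-like case: warming/drying dominates despite a larger CO2 boost,
    # so fuels decline and modeled burn probability falls with them.
    print(net_fuel_change(1.0, co2_boost=0.30, warming_penalty=0.45,
                          k_decay=0.40, fuel_load=3.0))   # about -0.49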

Ren and Hanan noted that within each of the major plant communities – grasslands, shrubs, forests – results were quite consistent, adding validity to the findings.

“Across the grasslands we modeled, the change in warming didn’t matter nearly as much as the fuel loading,” Ren said. “It was pretty much entirely dependent on fuel loading, which makes sense. The grasslands in this area will always die and dry out. That’s their cycle. For the grasslands, it’s all about how much fuel you have to burn.”

In contrast, Ren noted that changes in fuel aridity and fuel loading both heavily influenced the wildfire predictions for areas dominated by shrubs and areas dominated by trees. But both Hanan and Ren stressed that much more research is needed to make the models even more reliable.

“This is really just a start,” Hanan said. “And, the further out the predictions get, the less reliable they become, naturally. We are hoping to do more research on decomposition and to expand the research we did up at Trail Creek to other watersheds, improve the models, and scale them over larger areas. What we really hope is that all this stimulates more integrated research and modeling, and gets people talking. For a long time, the fire community and the biogeochemistry community weren’t necessarily talking. I think that’s starting to change. We’re seeing that it’s really important to think about, talk about, and quantify all these different factors as multidisciplinary teams.”

Russian researchers develop a new method for analyzing genetic admixture of populations

Researchers at the HSE International Laboratory of Statistical and Computational Genomics (National Research University Higher School of Economics, Moscow), together with their international colleagues, have proposed a new statistical method for analyzing population admixture that makes it possible to determine the time and number of migration waves more accurately. The history of Colombians and Mexicans (descendants of Native Americans, Spaniards, and Africans) features two episodes of admixture, which occurred about 350 and 200 years ago for Mexicans and 400 and 100 years ago for Colombians. The results were published in the journal PLOS Genetics.

When Francis Crick and James Watson deciphered the structure of DNA in 1953, they declared that they had found the secret of life. Indeed, all life on Earth is reproduced by constant cell division and copying of its genetic material. DNA is passed down from generation to generation, and the human genome is a mosaic of genetic fragments of our ancestors from different times. To understand the origins of the genetic diversity of modern humans, it is necessary to study the history of populations: where our ancestors lived, when and where they migrated, and when and how they mixed.

The history of population admixture can be uncovered by analyzing the connections between human genetic variants. Our genome contains genetic material from both father and mother; when chromosomes are passed on, they are shuffled, so each of us transmits new combinations of genetic variants, a mosaic of our parents’ genomes, to our descendants. This shuffling is called recombination.

For example, a Spanish mother and a Native American father will have a child with one set of chromosomes of Spanish origin and one of Native American origin. Their child will in turn pass on a set of chromosomes combining sections of Spanish and Native American origin to their descendants (the second set of chromosomes will be inherited from the other parent). The origin of these sections can be determined from the sequences of genetic variants typical of a particular population. In each new generation, recombination mixes sections of different origins more and more, breaking up these typical genetic sequences. Over time, they disintegrate, finally mixing completely.

Thus, because recombination breaks up these sections at a predictable rate each generation, calculating the correlation between genetic variants on different parts of chromosomes and analyzing the strength of their connections reveals how many generations ago population admixture occurred.
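
For the single-pulse version of this idea, the admixture-induced correlation between two variants separated by a genetic distance of d Morgans decays roughly as exp(-T·d) after T generations, so T can be recovered by fitting that decay. The sketch below does this with invented data and scipy’s curve_fit; it illustrates the older pairwise logic, not the new triple-variant method:

    import numpy as np
    from scipy.optimize import curve_fit

    def one_pulse_ld(d, amplitude, t_gen):
        """Admixture LD at genetic distance d (Morgans), t_gen generations
        after a single pulse of admixture: exponential decay exp(-t_gen * d)."""
        return amplitude * np.exp(-t_gen * d)

    rng = np.random.default_rng(0)
    d = np.linspace(0.001, 0.2, 100)                  # genetic distances (Morgans)
    ld = one_pulse_ld(d, 0.05, 13) + rng.normal(0, 1e-4, d.size)  # fake signal

    (amp, t_hat), _ = curve_fit(one_pulse_ld, d, ld, p0=(0.01, 5.0))
    print(f"estimated admixture age: {t_hat:.1f} generations "
          f"(~{t_hat * 27:.0f} years at 27 years per generation)")

A typical human generation time of 25 to 30 years converts such estimates into the calendar dates quoted in the study.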

Earlier methods of analyzing the genetic admixture of populations could estimate only the time of the most recent admixture event. Those algorithms were based on analyzing the connection strength between pairs of genetic variants. Researchers from the HSE International Laboratory of Statistical and Computational Genomics and their international colleagues proposed analyzing triples of variants instead. This statistical method makes it possible to model more complex scenarios of population admixture, for example, to identify two episodes of admixture and determine how many generations ago each occurred.

"Let's imagine that ships with European settlers land on the shores of America for the first time. Europeans start exploring new territories and mixing with the indigenous population of America. However, after a few generations, more ships with Europeans arrive in America. Our method allows us to see that there were two waves of resettlement, two episodes of admixture in different time periods," explains Mikhail Shishkin, co-author of the article, a research assistant at the laboratory, and an MIEM student.

As an example, the paper’s authors analyzed genetic samples from the Colombian and Mexican populations in the 1000 Genomes database. Both populations arose through the admixture of Native Americans, Spaniards, and Africans. The results showed that the history of both populations featured two waves of admixture, which occurred 13 and 8 generations (roughly 350 and 200 years) ago for Mexicans and 15 and 4 generations (roughly 400 and 100 years) ago for Colombians.

Oxford researchers develop new AI system using light to learn associatively

Researchers at Oxford University’s Department of Materials, working in collaboration with colleagues from Exeter and Münster, have developed an on-chip optical processor capable of detecting similarities in datasets up to 1,000 times faster than conventional machine learning algorithms running on electronic processors.

An illustration of Pavlov's experiment on associative learning and a computer chip. Credit: Zengguang Cheng

The new research, published in Optica, took its inspiration from Nobel Prize laureate Ivan Pavlov’s discovery of classical conditioning. In his experiments, Pavlov found that by providing another stimulus during feedings, such as the sound of a bell or metronome, his dogs began to link the two experiences and would salivate at the sound alone. Repeated associations of two unrelated events paired together could produce a learned response: a conditional reflex.

Co-first author Dr. James Tan You Sian, who did this work as part of his DPhil in the Department of Materials, University of Oxford, said: ‘Pavlovian associative learning is regarded as a basic form of learning that shapes the behavior of humans and animals – but adoption in AI systems is largely unheard of. Our research on Pavlovian learning in tandem with optical parallel processing demonstrates the exciting potential for a variety of AI tasks.’

The neural networks used in most AI systems often require a substantial number of data examples during training – teaching a model to reliably recognize a cat can require up to 10,000 cat/non-cat images – at considerable computational and processing cost.

Rather than relying on the backpropagation favored by neural networks to ‘fine-tune’ results, the Associative Monadic Learning Element (AMLE) uses a memory material that learns patterns and associates similar features in datasets, mimicking the conditional reflex observed by Pavlov in the case of a ‘match’.

The AMLE inputs are paired with the correct outputs to supervise the learning process, and the memory material can be reset using light signals. In testing, the AMLE could correctly identify cat/non-cat images after being trained with just five pairs of images.  
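
For intuition, here is a toy software analogue of an associative element. It is a sketch of the concept only, with invented parameters; it does not represent the optical AMLE hardware or its actual update rule:

    class AssociativeElement:
        """A single association weight strengthens when two stimuli co-occur."""

        def __init__(self, rate=0.25, threshold=0.5):
            self.w = 0.0              # association strength (the "memory" state)
            self.rate = rate          # learning increment per paired presentation
            self.threshold = threshold

        def train(self, unconditioned, conditioned):
            # Hebbian-style update: stimuli that occur together become linked.
            self.w = min(1.0, self.w + self.rate * unconditioned * conditioned)

        def reset(self):
            # The device is reset with light signals; here we just zero the weight.
            self.w = 0.0

        def respond(self, conditioned):
            # A conditioned response fires once the learned link is strong enough.
            return conditioned * self.w > self.threshold

    elem = AssociativeElement()
    print(elem.respond(1.0))          # False: the "bell" alone, nothing learned yet
    for _ in range(5):                # five pairings, cf. the five image pairs above
        elem.train(unconditioned=1.0, conditioned=1.0)
    print(elem.respond(1.0))          # True: the conditioned response has formed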

The considerable performance capabilities of the new optical chip over a conventional electronic chip are down to two key differences in design:

  • a unique network architecture incorporating associative learning as a building block rather than using neurons and a neural network
  • the use of ‘wavelength-division multiplexing’ to send multiple optical signals on different wavelengths on a single channel to increase computational speed.

The chip hardware uses light to send and retrieve data to maximize information density – several signals on different wavelengths are sent simultaneously for parallel processing, which increases the detection speed of recognition tasks, and each additional wavelength adds another parallel channel of computation.
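
Schematically, wavelength parallelism means N independent data streams traverse the same channel at once instead of one after another. The sketch below is a software analogy with invented sizes, not a model of the device’s optical signal encoding:

    import numpy as np

    n_wavelengths = 8
    inputs = np.random.rand(n_wavelengths, 64)     # one data stream per wavelength
    weights = np.random.rand(n_wavelengths, 64)

    # Serial electronic view: process one stream after another (N passes).
    serial = [x @ w for x, w in zip(inputs, weights)]

    # Parallel optical view: all wavelengths propagate together (one pass).
    parallel = np.einsum('ij,ij->i', inputs, weights)

    assert np.allclose(serial, parallel)           # same results, 1/N the passes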

Professor Wolfram Pernice, co-author from Münster University explained: "The device naturally captures similarities in datasets while doing so in parallel using light to increase the overall computation speed – which can far exceed the capabilities of conventional electronic chips."

An associative learning approach could complement neural networks rather than replace them, clarified co-first author Professor Zengguang Cheng, now at Fudan University.

"It is more efficient for problems that don’t need substantial analysis of highly complex features in the datasets," said Professor Cheng. "Many learning tasks are volume-based and don’t have that level of complexity – in these cases, associative learning can complete the tasks more quickly and at a lower computational cost."

"It is increasingly evident that AI will be at the center of many innovations we will witness in the coming phase of human history. This work paves the way toward realizing fast optical processors that capture data associations for particular types of AI computations, although there are still many exciting challenges ahead," said Professor Harish Bhaskaran, who led the study.

The full paper, ‘Monadic Pavlovian associative learning in a backpropagation-free photonic network,’ is available in the journal Optica.

Acceldata offers cost-effective, flexible alternative to legacy Cloudera

The solution provides a simplified option, increased ROI, and improved performance amidst the threat of forced migration

Acceldata has announced that it will offer an alternative option for legacy Hortonworks Data Platform (HDP) / Cloudera Data Hub (CDH) customers. Cloudera has announced that it plans to suspend support for HDP/CDH on Sept. 30, 2022, a move that will require legacy customers to upgrade in order to receive continued support from Cloudera.

Acceldata offers a flexible, cost-effective, and highly optimized software and support solution that gives customers options to remain on-premises or migrate to cloud data platforms. Acceldata provides an economical and simplified alternative to forced migration and removes the need for organizations to go through the expensive process of rewriting data pipelines to work with the recent Cloudera release.  

Acceldata’s software solution for long-term data platform independence delivers increased performance, reliability, and cost savings. With Acceldata, customers can improve workload performance by 30-40% and save up to 70% on Cloudera support licenses with improved turnaround times for issue resolution.

“This forced migration is unsettling for the large number of companies that still run Hadoop on-premises. Acceldata’s software and service offers a win-win for legacy Cloudera customers as they are no longer locked in with a single vendor and now have the freedom to choose which options work best for them,” said Chandra Sharma, head of customer success, Acceldata. “By eliminating the need to update data pipelines, Acceldata’s data observability platform can significantly decrease cost and increase performance from what they had previously with Cloudera.”

Acceldata offers the following options:

  • Stay on-premises with your current Hadoop release, without the need for Cloudera support 
  • Migrate to the cloud warehouse of your choice and use Acceldata’s insight into application workloads to increase adoption and align cloud warehouse performance, cost, and value 
  • Move to Apache open source confidently and be 100% vendor/distribution independent

Robi Axiata, one of Bangladesh's largest mobile network operators, has become one of the most important telecommunications providers in Asia with nearly $1 billion in annual revenue and more than 50 million subscribers. The company operates a large, complex analytics environment, which runs on a six-petabyte data warehouse for insights into real-time customer events. Previously using Cloudera, Robi Axiata turned to Acceldata when it was experiencing frequent issues with its Hadoop-based data warehouse. 

"With Cloudera, it took 3-6 weeks to perform root cause analysis for system issues, which was creating frustration and potentially causing damage to our customer relationships. But, with Acceldata, we are now completing our root cause analysis in only one minute,” said Mohammad Solaimun Rasel, VP, Platform Planning and Management Department, Robi Axiata Limited.

Chinese researchers find human activities increase likelihood of more extreme heatwaves

July 19 was the hottest day ever recorded in the United Kingdom, with temperatures surpassing 40 degrees Celsius (about 104 degrees Fahrenheit). The heatwave serves as an early preview of what climate forecasters had theorized would be typical summer weather in the U.K. by 2050. The heat continues across Europe today, as well as in the United States, where more than a third of the country is under heat warnings.

Shading represents surface air temperature anomalies; the green vector denotes the jet stream (a narrow band of very strong westerly air currents near the altitude of the tropopause). Two blue vectors indicate that the heatwave is related to anomalous circulations in the North Pacific and the Arctic.

The temperatures harken back to just over a year ago, when nearly 1,500 people died during a late June heatwave that pushed temperatures far above seasonal averages in the western United States and Canada.

Will temperatures continue to rise, leading to more frequent extreme heat events?

Yes, according to the latest analysis of the atmospheric circulation patterns and human-caused emissions that led to the 2021 heatwave in North America. The findings, published on July 22 in Advances in Atmospheric Sciences, may also explain the U.K.'s current heatwave.

The research team found that greenhouse gases are the primary reason for increased temperatures in the past and will likely continue to be the main contributing factor, with simulations showing that the probability of extreme heatwave events will increase by more than 30 percentage points in the coming years. Most of that increased probability is the result of greenhouse gases, according to their results.

“An extraordinary and unprecedented heatwave swept western North America in late June of 2021, resulting in hundreds of deaths and a massive die-off of sea creatures off the coast as well as horrific wildfires,” said lead author Chunzai Wang, a researcher in the Southern Marine Science and Engineering Guangdong Laboratory and head of the State Key Laboratory of Tropical Oceanography at the South China Sea Institute of Oceanology, Chinese Academy of Sciences (CAS).

“In this paper, we studied the physical processes of internal variability, such as atmospheric circulation patterns, and external forcing, such as anthropogenic greenhouse gases.”

Atmospheric circulation patterns describe how air flows around the planet and influences surface air temperatures; they can shift with natural warming from the Sun, internal atmospheric variability, and Earth’s rotation. These configurations are responsible for daily weather, as well as the long-term patterns comprising climate. Using observational data, the researchers identified three atmospheric circulation patterns that co-occurred during the 2021 heatwave: the North Pacific pattern, the Arctic-Pacific Canada pattern, and the North American pattern.

“The North Pacific pattern and the Arctic-Pacific Canada pattern co-occurred with the development and mature phases of the heatwave, whereas the North American pattern coincided with the decaying and eastward movement of the heatwave,” Wang said. “This suggests the heatwave originated from the North Pacific and the Arctic, while the North American pattern ushered the heatwave out.”

But atmospheric circulation patterns can co-occur, and have before, without triggering an extreme heatwave, so how much was the 2021 event influenced by human activities? Wang and the team used the internationally curated, tested, and assessed models from the World Climate Research Programme, specifically simulations from the Detection and Attribution Model Intercomparison Project (DAMIP) within the Coupled Model Intercomparison Project Phase 6 (CMIP6).
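
The core of such an attribution comparison can be illustrated with a minimal sketch: count how often simulated summer temperature maxima exceed an extreme threshold in runs with natural forcings only versus runs that also include greenhouse gases. The distributions below are invented placeholders, not DAMIP output:

    import numpy as np

    rng = np.random.default_rng(42)
    threshold = 40.0                               # extreme-heat threshold (deg C)
    natural = rng.normal(35.0, 2.0, 10_000)        # natural-forcings-only maxima
    all_forcings = rng.normal(38.0, 2.2, 10_000)   # natural + greenhouse gases

    p_nat = (natural > threshold).mean()
    p_all = (all_forcings > threshold).mean()
    print(f"exceedance probability: {p_nat:.1%} -> {p_all:.1%} "
          f"(+{(p_all - p_nat) * 100:.0f} percentage points)")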

“From the CMIP6 models, we found that it is likely that global warming associated with greenhouse gases influences these three atmospheric circulation pattern variabilities, which, in turn, led to a more extreme heatwave event,” Wang said. “If appropriate measures are not taken, the occurrence probability of extreme heatwaves will increase and further impact the ecological balance, as well as sustainable social and economic development.”

Other contributors include co-corresponding author Jiayu Zheng and two students from the University of Chinese Academy of Sciences: Wei Lin and Yuqing Wang.