Image of non-homogeneous glass with color elements concentrated in certain regions (photo: Nilanjana Shasmal/CeRTEV)

Brazilian researchers show how to unlock the strength of specialty glass with niobium oxide

Applications of specialty glass range from astronomy to medicine, as well as data and power transmission. The study combined spectroscopy with molecular dynamics (MD) and Monte Carlo (MC) simulations to show how the addition of niobium oxide affects the structure of the material.

A study conducted at the Center for Research, Education, and Innovation in Vitreous Materials (CeRTEV) in São Carlos, São Paulo state, Brazil, shows for the first time that including niobium oxide (Nb2O5) in silicate glass results in silica network polymerization, which increases bond density and connectivity, enhancing the mechanical and thermal stability of specialty glass.

The study was supported by FAPESP and reported in an article published in the journal Acta Materialia.

The first author of the article, Henrik Bradtmüller, is a postdoctoral researcher at the Federal University of São Carlos’s Center for Exact Sciences and Technology (CCET-UFSCar), with a fellowship from FAPESP. His supervisor is Edgar Dutra Zanotto, director of CeRTEV.

CeRTEV is hosted by UFSCar and is one of the Research, Innovation, and Dissemination Centers (RIDCs) funded by FAPESP.

“Our study combined experimental observations using nuclear magnetic resonance spectroscopy and Raman spectroscopy with computational modeling. Besides the results mentioned, we found that higher levels of niobium led to Nb2O5 clustering and heightened electronic polarizability, with a significant impact on the optical properties of the glass,” Bradtmüller said.

By way of background, Raman spectroscopy provides detailed information about the molecular structure of materials, while nuclear magnetic resonance (NMR) spectroscopy complements it by probing the magnetic properties of atomic nuclei, revealing the local chemical environments of specific elements.

“Our strategy based on these two observational techniques plus computational modeling can be used to study functional elements of many other types of glass, including optical materials, bioactive glass, and glassy fast-ion conductors. This will facilitate the development of innovative glass formulations adapted for various applications,” Bradtmüller said.

Alongside the everyday applications of ordinary glass in containers, windows, and so on, high-quality glass has also become almost ubiquitous in today’s world, Bradtmüller noted. It is present in the microscopes and telescopes used by scientists, for example, in the optical fibers used to carry data and power, and in the glass-ceramic orthotic devices increasingly used in medicine. “In recognition of the role played by glass in contemporary society, the United Nations declared 2022 to be the International Year of Glass,” he said.

For advanced high-tech applications, materials scientists are using machine learning software and other computational resources to design glass with customized properties, but to do so they require reliable databases and structural parameters that take into account the physicochemical complexity of glass.

This is the relevance of the study by Bradtmüller and colleagues. “Glass intermediate oxides play a strategic role in this new technological moment. They don’t form glass under standard cooling in the laboratory, but they can make a positive contribution in the presence of other oxides by helping to build oxygen bridges and giving the glass the properties of interest. Niobium oxide is a good example,” he explained.
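
One way to make "building oxygen bridges" concrete is the NBO/T ratio used in glass science: the number of non-bridging oxygens per network-forming cation, where lower values mean a more connected, more polymerized network. Below is a minimal sketch of that bookkeeping, assuming Nb5+ can be counted as a network-forming cation that takes four bridging oxygens (a simplification, since Nb often sits in octahedra) and using illustrative compositions rather than the study's actual glass series.

```python
def nbo_per_t(mol_fractions):
    """Non-bridging oxygens per network-forming cation (NBO/T).

    mol_fractions: oxide mole fractions, e.g. {"SiO2": 0.7, "Li2O": 0.3}.
    A fully connected network needs 2 oxygens per former cation T (each of
    its 4 bonds shares an oxygen); any excess oxygen is non-bridging and
    depolymerizes the network.
    """
    # (cations per formula unit, oxygens per formula unit, is network former)
    oxides = {
        "SiO2":  (1, 2, True),
        "Nb2O5": (2, 5, True),   # assumption: Nb5+ counted as a former
        "Li2O":  (2, 1, False),  # Li+ is a modifier, creating NBOs
    }
    total_o = sum(x * oxides[ox][1] for ox, x in mol_fractions.items())
    total_t = sum(x * oxides[ox][0] for ox, x in mol_fractions.items()
                  if oxides[ox][2])
    return (2 * total_o - 4 * total_t) / total_t

# Substituting Nb2O5 for Li2O in a lithium silicate (illustrative numbers):
print(nbo_per_t({"SiO2": 0.70, "Li2O": 0.30}))                 # ~0.86
print(nbo_per_t({"SiO2": 0.70, "Li2O": 0.25, "Nb2O5": 0.05}))  # ~0.75, more connected
```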

Glass containing niobium (Nb) is valued for its non-linear optical properties, with potential applications in optoelectronic devices, and for mechanical properties relevant to the fabrication of bioactive materials. “Although studies had been conducted using Nb2O5 before our own, the structural role of Nb remained obscure, owing mainly to a lack of systematic spectroscopic characterization data. We set out to fill this knowledge gap in our study,” he said.

“We discovered through spectroscopy that the addition of Nb causes ‘polymerization’ of the silica-oxygen network, increasing the connectivity of the glass’s components. This clarified the role of Nb as a ‘network former’. Another highlight of the study is our demonstration that a new NMR technique we developed in 2020 using other materials applies to glass. This technique, which is called W-RESPDOR, can be used to measure the distance between two elements – in this case, lithium and Nb, which has such a challenging nucleus that it had never been measured with similar techniques.”
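
For context, RESPDOR-type experiments recover a heteronuclear dipolar coupling constant, which depends only on physical constants and the inverse cube of the internuclear distance, so a measured coupling converts directly into a distance. A minimal sketch of that conversion is below; the 7Li and 93Nb gyromagnetic ratios are standard literature values, and the example coupling is invented for illustration.

```python
import scipy.constants as const

# Gyromagnetic ratios in rad s^-1 T^-1 (standard literature values)
GAMMA_7LI = 1.03977e8
GAMMA_93NB = 6.5674e7

def distance_from_dipolar_coupling(d_hz, gamma_i=GAMMA_7LI, gamma_s=GAMMA_93NB):
    """Internuclear distance (m) from a heteronuclear dipolar coupling |D| in Hz.

    |D| = (mu_0 / 4 pi) * gamma_i * gamma_s * hbar / (2 pi r^3)
    """
    r_cubed = (const.mu_0 / (4 * const.pi)) * gamma_i * gamma_s * const.hbar \
              / (2 * const.pi * abs(d_hz))
    return r_cubed ** (1 / 3)

# A hypothetical 425 Hz Li-Nb coupling corresponds to roughly 3 angstroms:
print(distance_from_dipolar_coupling(425.0) * 1e10, "angstrom")
```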

Computational modeling showed that lithium ions are randomly distributed in silica-based glass at the nanometric scale (5-10 nanometers), while Nb tends to form clusters at higher concentrations of Nb2O5, he explained, adding that this kind of structural arrangement had never been reported in the literature and is an original contribution of the study.
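
A standard way to quantify "randomly distributed versus clustered" in simulated structures is to compare the mean nearest-neighbor distance against its expectation under complete spatial randomness (the Clark-Evans ratio). The sketch below illustrates that kind of check on synthetic point sets; it is an illustration of the concept, not the authors' actual analysis.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma as gamma_fn

def clark_evans_3d(points, box_length):
    """Observed / expected mean nearest-neighbor distance in a periodic cube.

    ~1 for a random (Poisson) arrangement, <1 for clustering, >1 for ordering.
    """
    tree = cKDTree(points, boxsize=box_length)
    dist, _ = tree.query(points, k=2)          # k=1 is each point itself
    observed = dist[:, 1].mean()
    density = len(points) / box_length**3
    expected = gamma_fn(4 / 3) * (3 / (4 * np.pi * density)) ** (1 / 3)
    return observed / expected

rng = np.random.default_rng(0)
random_pts = rng.uniform(0, 10.0, size=(500, 3))            # "Li-like": random
centers = rng.uniform(0, 10.0, size=(25, 3))
clustered_pts = (centers[rng.integers(0, 25, 500)]
                 + rng.normal(0, 0.3, size=(500, 3))) % 10  # "Nb-like": clumped
print(clark_evans_3d(random_pts, 10.0))     # close to 1
print(clark_evans_3d(clustered_pts, 10.0))  # well below 1
```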

“In a broader perspective, the study points to an experimental and computational strategy to investigate the role played in glass by intermediate oxides with active nuclei for NMR spectroscopy,” Zanotto said.

The other authors of the article include Hellmut Eckert, Vice Director of CeRTEV and a specialist in NMR; and Anuraag Gaddam, a postdoctoral researcher specializing in computer simulations, with a scholarship from FAPESP and supervision by Eckert.

The study demonstrated that adding niobium oxide to silicate glass increases bond density and connectivity, improving the mechanical and thermal stability of specialty glass. This is a promising development that could lead to more reliable and durable specialty glass products, and further research and development may open up a range of new applications for silicate glass.

Data collected by the MOSAiC expedition to the central Arctic (shown), and analyzed by McKelvey School of Engineering researchers, revealed blowing snow as a previously unaccounted-for source of sea salt aerosols, impacting Arctic climate models. (Photo courtesy MOSAiC expedition)

Wang's lab discovers that Arctic sea salt aerosols are underestimated, a finding that improves climate modeling

Atmospheric scientists, led by Jian Wang, have discovered that wind-blown snow in the central Arctic produces abundant fine sea salt aerosols, resulting in increased seasonal surface warming.

The Arctic is a concerning outlier when it comes to global warming trends. It warms almost four times faster than the global average, and aerosols play a significant role in this warming. Scientists have long known that pollutants transported from other regions build up in the Arctic atmosphere, where they alter atmospheric chemistry, absorb sunlight, and affect local weather patterns, producing localized warming that melts ice and snow. While sea salt particles make up most of the aerosol mass concentration in the Arctic, the mechanisms that produce them and their impact on the Arctic climate have yet to be fully understood.

Atmospheric scientists led by Jian Wang, director of the Center for Aerosol Science and Engineering and a professor of energy, environmental and chemical engineering at the McKelvey School of Engineering at Washington University in St. Louis, investigated the production and impact of sea salt aerosols on Arctic warming. Their results revealed abundant fine sea salt aerosol production from blowing snow in the central Arctic, increasing particle concentration and cloud formation.

“Over the past few decades, scientists have identified ‘Arctic haze’ as the primary source of aerosols in the Arctic during winter and spring. This haze results from the long-range transport of pollutants,” said Xianda Gong, first author on the study and a former postdoctoral researcher in Wang’s lab. “However, our study reveals that local blowing snow, which produces sea salt particles, contributes a more substantial fraction to the total aerosol population in the central Arctic.”

Wang’s team analyzed data collected by the Multidisciplinary Drifting Observatory for the Study of Arctic Climate (MOSAiC). Such observations are difficult to obtain — the MOSAiC expedition entailed international collaboration and freezing an icebreaker into the central Arctic ice pack to drift with the sea ice for an entire year — but essential to understanding the full picture of atmospheric conditions in the Arctic.

“The MOSAiC expedition let us observe how aerosols and clouds evolve over a year and led to this discovery,” Wang said. “Sea salt particles in the Arctic atmosphere aren’t surprising, since there are ocean waves breaking that will generate sea salt aerosols. But we expect those particles from the ocean to be pretty large and not very abundant.

“We found sea salt particles that were much smaller and in higher concentration than expected when there was blowing snow under strong wind conditions,” Wang said.

In the central Arctic, the coldest winter nights are the clearest, when heat from Earth can escape into space unimpeded. Under a cozy blanket of clouds, though, long-wave radiation gets trapped and contributes to warming, so any process that leads to increased cloud formation and lingering cloudiness also boosts surface temperatures. Small aerosol particles, including those fine sea salt aerosols produced by blowing snow that Wang’s team discovered, turn out to be very good for cloud formation.

“These sea salt particles can act as cloud condensation nuclei, leading to cloud formation,” Gong said. “Considering the absence of sunlight in the winter and spring Arctic, these clouds can trap surface long-wave radiation, thereby significantly warming the Arctic surface.”
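
Whether a particle of a given size can act as a cloud condensation nucleus is commonly estimated with single-parameter (kappa) Koehler theory: the smaller the particle, the higher the supersaturation needed to activate it into a droplet, which is why an abundant source of fine sea salt matters for cloud formation. A minimal sketch follows, using a typical literature kappa value for sea salt rather than numbers from this study.

```python
import numpy as np

def critical_supersaturation_pct(dry_diameter_m, kappa, temp_k=273.15):
    """Critical supersaturation (%) from single-parameter kappa-Koehler theory.

    S_c = exp( sqrt(4 A^3 / (27 kappa D_d^3)) ) - 1,
    with A = 4 sigma_w M_w / (R T rho_w) (the Kelvin term coefficient).
    """
    sigma_w = 0.0756            # surface tension of water near 0 C (N/m)
    m_w, rho_w, r_gas = 0.018015, 1000.0, 8.314
    a = 4 * sigma_w * m_w / (r_gas * temp_k * rho_w)
    return 100 * (np.exp(np.sqrt(4 * a**3 / (27 * kappa * dry_diameter_m**3))) - 1)

KAPPA_SEA_SALT = 1.1  # typical literature value for sea salt
for d_nm in (30, 50, 100, 200):
    s_c = critical_supersaturation_pct(d_nm * 1e-9, KAPPA_SEA_SALT)
    print(f"{d_nm} nm dry diameter -> activates near {s_c:.2f}% supersaturation")
```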

Though scientists had not observed this phenomenon before, fine sea salt aerosols from blowing snow have always been part of the Arctic climate system. With this observational confirmation and systematic study, which revealed that sea salt particles produced from blowing snow account for about 30% of total aerosol particles, climate models can now be updated to include the effects of these fine particles.

“Model simulations that don’t include fine sea salt aerosols from blowing snow underestimate aerosol population in the Arctic,” Wang said. “Blowing snow happens regardless of human warming, but we need to include it in our models to better reproduce the current aerosol populations in the Arctic and to project future Arctic aerosol and climate conditions.”

The findings suggest that model simulations of the Arctic atmosphere must include fine sea salt aerosols from blowing snow in order to accurately represent the region's aerosol population. With this source accounted for, scientists can build more accurate models and predictions, and continue working toward a more comprehensive understanding of the Arctic climate and its effects on the global environment.


Unlocking the secrets of the sea: How Japanese scientists are working to improve tsunami warning systems

The Hunga Tonga-Hunga Ha'apai volcano in Tonga erupted on January 15, 2022, causing massive amounts of energy to be released into the atmosphere and ocean, leading to tsunamis across the Pacific Ocean. The Shocks, Solitons, and Turbulence Unit of the Okinawa Institute of Science and Technology (OIST) in Japan has conducted research into the disturbances in the atmosphere and ocean during this event and has developed a supercomputer model to enhance the current tsunami early warning systems.

Stephen Winn, a research technician in the unit and first author of the research article, stated, “It's important to know how the atmospheric wave changes in time to make accurate predictions that would be of use for warning systems.”

Unlike a regular tsunami caused by a rapid movement of the seabed, the large waves caused by the Tonga explosion were also influenced by a pressure wave hundreds of kilometers wide released into the atmosphere. The atmospheric pressure wave first moved upwards and then spread outwards, traveling at 1,141 km/h on average, about 400 km/h faster than a regular tsunami can travel in deep water. It traveled around the Earth, causing waves as far away as the Mediterranean Sea. “This was the first event of its kind recorded in detail by modern instruments,” stated Prof. Emile Touber, leader of the Shocks, Solitons and Turbulence Unit.
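
The speed gap quoted above follows from shallow-water wave theory: an ordinary tsunami travels at roughly the square root of gravitational acceleration times ocean depth, while the atmospheric pressure (Lamb) wave moves near the speed of sound in air. A minimal sketch of the comparison, assuming a typical deep-ocean depth of about 4,000 m (not a figure from the study):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_kmh(depth_m):
    """Shallow-water gravity wave speed c = sqrt(g * h), converted to km/h."""
    return math.sqrt(G * depth_m) * 3.6

lamb_wave_kmh = 1141.0                   # average speed reported for the Tonga pressure wave
tsunami_kmh = tsunami_speed_kmh(4000.0)  # assumption: ~4 km typical deep-ocean depth
print(f"tsunami in deep water: ~{tsunami_kmh:.0f} km/h")                   # ~713 km/h
print(f"pressure wave is ~{lamb_wave_kmh - tsunami_kmh:.0f} km/h faster")  # ~430 km/h
```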

As the atmospheric wave travels above the ocean, it displaces the body of water underneath, creating waves that travel faster than a regular tsunami. “Normally, a tsunami wave created in the Pacific would not reach the Mediterranean because it would have to travel around land masses to get there, but atmospheric waves are not restricted, traveling over those land masses,” Dr. Adel Sarmiento, a postdoc researcher at the unit explained. This is why the wave can reach worldwide and has a broader impact than a regular tsunami.  

The scientists used measurements from the Tonga event to validate their model and used a state-of-the-art code, dNami, co-developed by Dr. Nicolas Alferez at the Conservatoire National des Arts et Métiers in Paris, France, to rapidly simulate the Earth during the event using the supercomputer at OIST. The code allows them to create simulations at satisfactory resolution, faster than real time, so that they are useful for improving warning systems in the future.

Prof. Touber explained that they can now more accurately predict the arrival time and height of a wave at a specific location and rapidly identify areas at high risk. 

Hurricanes and typhoons can also cause disturbances in the atmosphere that interact with the sea, causing significant water level changes that will affect coastlines. “With our model, we can explore what might happen to the water flow as it approaches the coast if the sea level changes by a certain amount with certain typical storm conditions,” Prof. Touber said. “This can help decide on the kind of coastal defense systems that should be put in place for storm-related surges.” 

Scientists in Japan studied the interactions between the ocean and atmosphere triggered by the Tonga volcano eruption and developed a model with the potential to identify high-risk areas accurately and improve existing tsunami warning systems. The research is a milestone in understanding the complex interplay between the ocean and atmosphere, and it could help protect lives and property in the future.

In the above map from the Southern California Earthquake Data Center, some of the individual pixels represent thousands of earthquakes.

Discover the power of deep learning with UCSC seismologists' pioneering technology for forecasting earthquake aftershocks

Earthquake aftershock forecasting models have remained largely unchanged for more than 30 years. These models work well with limited data but struggle with the vast amount of seismology datasets that are now available. To overcome this limitation, researchers from the University of California, Santa Cruz, and the Technical University of Munich have developed a new model called Recurrent Earthquake foreCAST (RECAST). This model uses deep learning and is more flexible and scalable than the current earthquake forecasting models.

The scientists published a paper in Geophysical Research Letters, which shows that the new model outperforms the existing model, known as the Epidemic Type Aftershock Sequence (ETAS) model, for earthquake catalogs of about 10,000 events or more.
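
For orientation, ETAS is a point-process model whose conditional intensity adds a constant background rate to Omori-Utsu aftershock kernels triggered by every past event, with larger magnitudes triggering more aftershocks. A minimal sketch of the temporal version is below, with illustrative (not fitted) parameter values.

```python
import numpy as np

def etas_intensity(t, event_times, magnitudes, mu=0.2, k=0.02,
                   alpha=1.0, c=0.01, p=1.1, m_c=3.0):
    """Temporal ETAS conditional intensity (events/day) at time t (days).

    lambda(t) = mu + sum over past events i of
                k * exp(alpha * (m_i - m_c)) * (t - t_i + c)^(-p)
    Parameter values here are illustrative, not fitted to any catalog.
    """
    past = event_times < t
    dt = t - event_times[past]
    return mu + np.sum(k * np.exp(alpha * (magnitudes[past] - m_c)) * (dt + c) ** -p)

# A magnitude-6 event at t = 10 days dominates the rate, then decays (Omori law):
times, mags = np.array([2.0, 10.0]), np.array([3.5, 6.0])
for t in (10.01, 11.0, 20.0):
    print(t, round(etas_intensity(t, times, mags), 2))  # ~29.9, ~0.60, ~0.23
```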

“The ETAS model approach was designed for the observations that we had in the 80s and 90s when we were trying to build reliable forecasts based on very few observations,” said Kelian Dascher-Cousineau, the lead author of the paper who recently completed his Ph.D. at UC Santa Cruz. “It’s a very different landscape today.” Now, with more sensitive equipment and larger data storage capabilities, earthquake catalogs are much larger and more detailed.

“We’ve started to have million-earthquake catalogs, and the old model simply couldn’t handle that amount of data,” said Emily Brodsky, a professor of earth and planetary sciences at UC Santa Cruz and co-author on the paper. One of the main challenges of the study was not designing the new RECAST model itself but getting the older ETAS model to work on huge data sets to compare the two. 

“The ETAS model is kind of brittle, and it has a lot of very subtle and finicky ways in which it can fail,” said Dascher-Cousineau. “So, we spent a lot of time making sure we weren’t messing up our benchmark compared to actual model development.”

To continue applying deep learning models to aftershock forecasting, Dascher-Cousineau says the field needs a better system for benchmarking. To demonstrate the capabilities of the RECAST model, the group first used an ETAS model to simulate an earthquake catalog. After working with the synthetic data, the researchers tested the RECAST model using real data from the Southern California earthquake catalog.
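
Simulating a synthetic catalog from such an intensity is typically done with Ogata-style thinning: propose candidate times from an upper bound on the intensity and accept each with probability intensity/bound. A minimal sketch reusing the etas_intensity function from the earlier snippet, with Gutenberg-Richter magnitudes at an illustrative b-value of 1:

```python
import numpy as np

def simulate_etas(t_end_days, seed=1, **params):
    """Simulate an ETAS catalog by Ogata-style thinning.

    Between events the ETAS intensity only decays, so its value just after
    the current time is a valid upper bound until the next accepted event.
    """
    rng = np.random.default_rng(seed)
    times, mags = [], []
    t = 0.0
    while True:
        ts, ms = np.array(times), np.array(mags)
        bound = etas_intensity(t + 1e-9, ts, ms, **params)
        t += rng.exponential(1.0 / bound)        # propose the next candidate time
        if t >= t_end_days:
            break
        if rng.uniform() * bound <= etas_intensity(t, ts, ms, **params):
            times.append(t)                      # accepted: record the event
            # Gutenberg-Richter magnitudes above m_c = 3.0 with b = 1
            mags.append(3.0 + rng.exponential(1.0 / np.log(10)))
    return np.array(times), np.array(mags)

catalog_times, catalog_mags = simulate_etas(365.0)  # one simulated year
print(len(catalog_times), "events; largest magnitude", catalog_mags.max().round(1))
```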

They found that the RECAST model — which can, essentially, learn how to learn — performed slightly better than the ETAS model at forecasting aftershocks, particularly as the amount of data increased. The computational effort and time were also significantly better for larger catalogs.

This is not the first time scientists have tried using machine learning to forecast earthquakes, but until recently, the technology was not quite ready, said Dascher-Cousineau. New advances in machine learning make the RECAST model more accurate and easily adaptable to different earthquake catalogs.

The model’s flexibility could open up new possibilities for earthquake forecasting. With the ability to adapt to large amounts of new data, models that use deep learning could potentially incorporate information from multiple regions at once to make better forecasts about poorly studied areas.

“We might be able to train on New Zealand, Japan, California and have a model that's quite good for forecasting somewhere where the data might not be as abundant,” said Dascher-Cousineau.

Using deep-learning models will also eventually allow researchers to expand the type of data they use to forecast seismicity.

“We’re recording ground motion all the time,” said Brodsky. “So the next level is to use all of that information, not worry about whether we’re calling it an earthquake or not an earthquake but to use everything."

In the meantime, the researchers hope the model sparks discussions about the possibilities of the new technology.

“It has all of this potential associated with it,” said Dascher-Cousineau. “Because it is designed that way.”

The use of deep learning by UCSC seismologists to forecast earthquake aftershocks is a significant development in seismology. It improves forecasting accuracy and opens up new possibilities for understanding and preparing for the impacts of earthquakes. This research has the potential to save lives and property, and with further development, deep learning could become an invaluable tool against the destructive forces of nature.

From left to right: Alberto Sánchez-Aguilera and Liset Menéndez de la Prida, from the Laboratory of Neural Circuits, Cajal Institute, CSIC; and Manuel Valiente and Mariam Al-Masmudi Martín, from the Brain Metastasis Group, CNIO. (Photo: A. Tabernero/CNIO)

Unlock the secrets of brain tumors with machine learning: A revolutionary study by Spanish researchers

The research findings have been featured on the cover of the journal 'Cancer Cell'. According to authors from CSIC and CNIO, cognitive loss in patients with brain metastases may be caused by the interference created by cancer in neuronal circuits. When cancer spreads in the brain, it changes brain chemistry, thus disrupting communication between neurons. This is a distinct hypothesis from the one accepted so far and has significant implications for the diagnosis and treatment of brain metastasis. The authors have employed artificial intelligence to demonstrate that metastasis modifies brain activity.

Nearly half of all patients with brain metastasis experience cognitive impairment. Until now, it was thought that this was due to the physical presence of the tumor pressing on neural tissue. However, this ‘mass effect’ hypothesis is flawed because there is often no relationship between the size of the tumor and its cognitive impact. Small tumors can cause significant changes, and large tumors can produce mild effects. Why is this?

The explanation may lie in the fact that brain metastasis hacks the brain’s activity, a study featured on Cancer Cell’s cover shows for the first time.

The authors, from the Spanish National Research Council (CSIC) and the Spanish National Cancer Research Centre (CNIO), have discovered that when cancer spreads (metastasizes) in the brain, it changes the brain’s chemistry and disrupts neuronal communication—neurons communicate through electrical impulses generated and transmitted by biochemical changes in the cells and their surroundings. 

In this study, the laboratories of Manuel Valiente (CNIO) and Liset Menéndez de La Prida (Cajal Institute CSIC) have collaborated within the EU-funded NanoBRIGHT project, aimed at developing new technologies for the study of the brain, and with the participation of other funding agencies such as MICINN, AECC, ERC, NIH, and EMBO.

Demonstration with artificial intelligence

The researchers measured the electrical activity of the brains of mice with and without metastases and observed that the electrophysiological recordings of the two groups of animals differed from each other. To be sure that this difference was attributable to metastases, they turned to artificial intelligence. They trained an automatic algorithm with numerous electrophysiological recordings, and the model was indeed able to identify the presence of metastases. The system was even able to distinguish metastases from different primary tumors—skin, lung, and breast cancer.
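
In machine-learning terms, the pipeline described is supervised classification: featurize each electrophysiological recording, then train a model to separate metastasis-bearing from control recordings. The sketch below illustrates that general approach on synthetic signals, using spectral band power as features and a random forest as the classifier; it is an illustration of the idea, not the authors' actual algorithm or data.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 1000  # sampling rate in Hz (illustrative)
BANDS = {"theta": (4, 12), "gamma": (30, 90), "ripple": (100, 250)}

def band_power_features(trace):
    """Mean spectral power of one recording in each frequency band."""
    freqs, psd = welch(trace, fs=FS, nperseg=512)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Synthetic stand-in data: 40 one-second recordings per group, where the
# "tumor" group has weaker theta oscillations and noisier broadband activity.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
control = [np.sin(2 * np.pi * 8 * t) + rng.normal(0, 1.0, FS) for _ in range(40)]
tumor = [0.5 * np.sin(2 * np.pi * 8 * t) + rng.normal(0, 1.5, FS) for _ in range(40)]

X = np.array([band_power_features(r) for r in control + tumor])
y = np.array([0] * 40 + [1] * 40)
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```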

These results show that metastasis does indeed affect the brain’s electrical activity in a specific way, leaving clear and recognizable signatures.

For the authors, the study represents a “paradigm shift” in the basic understanding of the development of brain metastases and has implications for the prevention, early diagnosis, and treatment of this pathology.

On the trail of drugs against neurocognitive effects

In addition to recording changes in brain electrical activity in the presence of metastasis, the researchers have begun to explore the biochemical changes that might explain this alteration. By analyzing the genes expressed in the affected tissues, they have identified a molecule, EGR1, that may play an important role in this process. This finding opens up the possibility of designing a drug to prevent or alleviate the neurocognitive effects of brain metastasis.

As Manuel Valiente, head of the CNIO’s Brain Metastasis Group explains, “Our multidisciplinary study challenges the hitherto accepted assumption that neurological dysfunction, which is very common in patients with brain metastasis, is due solely to the mass effect of the tumor. We suggest that these symptoms are a consequence of changes in brain activity resulting from tumor-induced biochemical and molecular alterations. This is a paradigm shift that could have important implications for diagnosis and therapeutic strategies.”

Liset Menéndez de la Prida, director of the Laboratory of Neural Circuits at the Cajal Institute (CSIC), says: “Using machine learning, we have been able to integrate all the data to create a model that allows us to know whether or not there is metastasis in a brain, just by looking at its electrical activity. This computational approach may even be able to predict subtypes of brain metastases at an early stage. It is a completely pioneering work that opens up an unexplored path.”

Both authors emphasize the multidisciplinary nature of this complex study that combines neuroscience, oncology, and computational analysis, each using a wide range of different techniques.

Cognitive study of patients and development of non-invasive techniques

The change in focus brought about by this result means that researchers now want to analyze the cognitive status of patients with brain metastasis much more systematically.

For Valiente, this is one of the most important next steps. The key will be the National Brain Metastasis Network (RENACER), initiated and coordinated by CNIO, which has already generated the largest collection of living brain metastasis samples in the world: with prior consent from patients, tissue samples collected during surgical interventions are made available to the international scientific community through the CNIO Biobank. The network will now introduce protocols for the neurocognitive assessment of participating patients.

For her part, Liset Menéndez de La Prida will work on integrating the recording of brain activity with the analysis of the molecules involved, “in order to develop new diagnostic probes for brain tumors,” she says. This task is in line with the European NanoBRIGHT project, which aims to develop non-invasive techniques for studying the brain and treating its pathologies, and in which CSIC and CNIO are participating.

Another goal is to find drugs that protect the brain from cancer-induced disruptions in neuronal circuits, using the strategies described above. “We will look for molecules involved in metastasis-induced changes in neuronal communication, and evaluate them as possible therapeutic targets,” explains Valiente.

In addition to the artificial intelligence developed by the CSIC team, they will use the METPlatform technology designed by CNIO to evaluate the potential therapeutic activity of hundreds of compounds simultaneously on brain tissue samples affected by metastasis.

The results of this study highlight the potential of machine learning to transform the way we understand and treat brain metastases. By identifying how tumors disrupt communication between neurons, the researchers have paved the way for treatments that target the root causes of these symptoms, and for future work that continues to push the boundaries of what is feasible.