ADVA launches sustainable supplier program to tackle CO2 emissions

ADVA has launched its sustainable supplier program as part of its ongoing commitment to radically reduce greenhouse gas emissions. The initiative extends ADVA’s holistic sustainability strategy upstream in its supply chain and is a key step towards the company’s production processes becoming completely carbon neutral. Launched in cooperation with ADVA’s finance platform partner Traxpay, the program involves ADVA offering financial incentives to its suppliers who meet strict criteria for minimizing environmental impact. The scheme also strengthens ADVA’s supply chain at a time of unprecedented logistical challenges, including material shortages and the global semiconductor crisis.

“The sustainable supplier program is another milestone in expanding our environmental, social, and governance (ESG) activities and ensuring environmentally friendly supply chain management. Everyone needs to play their part to fight climate change and so we’re rewarding our suppliers who share our dedication to taking urgent action now,” said Klaus Grobe, director, global sustainability, ADVA. “Our commitment to setting and meeting strict emissions targets has made ADVA one of the world’s leading systems vendors when it comes to sustainability. Now we’re also incentivizing our suppliers to prioritize the future and meet the highest standards in the industry for carbon reduction.”

ADVA’s sustainable supplier program is part of its commitments based on the most ambitious goal of the COP26 Glasgow Climate Pact: limiting the global temperature increase to 1.5°C above pre-industrial levels. The company was one of the first telecommunication technology suppliers to have targets approved by the Science Based Targets initiative (SBTi), and the criteria for its new supplier program are in line with ADVA’s own sustainability goals. These are in turn checked by customers, including regular assessments with the TIA Assessor tool and EcoVadis ratings. To evaluate suppliers’ sustainability activities, ADVA will leverage IntegrityNext software. ADVA will also work with its partner NORD/LB, which will ensure financial security by providing additional liquidity for the program.

“With our new initiative, we’re empowering our suppliers to make a significant difference. Our sustainable supplier program motivates them to reduce the environmental impact of their business and helps them achieve sustainable success. It will also make a decisive contribution to maintaining robust supply chains, especially in the current era of global material shortages,” commented Steven Williams, director, treasury and investor relations, ADVA. “Since 2017, ADVA has held a Gold TIA rating, and last year we achieved our first EcoVadis Platinum rating. Now, we’re widening the scope of our efforts. Alongside our partners Traxpay and NORD/LB, we’re rewarding sustainable suppliers for their ESG activities and encouraging other companies to help tackle this most urgent challenge.”

New ice-sheet modeling provides critical insights into ice mass loss in Antarctica

After the natural warming that followed the last Ice Age, there were repeated periods when masses of icebergs broke off from Antarctica into the Southern Ocean. A new data-model study led by the University of Bonn (Germany) now shows that it took only a decade to initiate this tipping point in the climate system, and that ice mass loss then continued for many centuries. Accompanying modeling studies suggest that today's accelerating Antarctic ice mass loss also represents such a tipping point, which could lead to irreversible and long-lasting ice retreat and global sea-level rise.

Image: Iceberg in Antarctica © Uni Bonn / Michael Weber

To understand what the consequences of current and future human-induced climate warming may be, it helps to take a look at the past: what did sea-level changes look like during periods of natural climate warming? In a recent study, an international research team led by Dr. Michael Weber from the Institute of Geosciences at the University of Bonn investigated this question. In doing so, they focused on the Antarctic Ice Sheet as the largest remaining ice sheet on Earth.

There, they searched for evidence of icebergs that broke off the Antarctic continent, floated in the surrounding ocean, and melted down in the major gateway to lower latitudes called “Iceberg Alley”. In the process, the icebergs released encapsulated debris that accumulated on the ocean floor. The team took sediment cores from the deep ocean in 3.5 km water depth from the area, dated the natural climate archive, and counted the ice-rafted debris.

The scientists identified eight phases with high amounts of debris, which they interpret as retreat phases of the Antarctic Ice Sheet after the Last Glacial Maximum, about 19,000 to 9,000 years ago, when the climate warmed and Antarctica repeatedly shed masses of icebergs into the ocean. The result of the new data-model study: each such phase destabilized the ice sheet within a decade and contributed to global sea-level rise for centuries to a millennium. The subsequent re-stabilization was equally rapid, also taking place within a decade.

The research team found three other independent pieces of evidence for such post-glacial tipping points: Model experiments showing the melting of the entire Antarctic ice sheet, a West Antarctic ice core documenting ice-sheet elevation draw-down, and drill cores revealing a step-wise ice-sheet retreat across the Ross Sea shelf.

Today's ice mass loss could be the start of a long-lasting period

The results are also relevant for ice retreat observed today: “Our findings are consistent with a growing body of evidence suggesting that the acceleration of Antarctic ice-mass loss in recent decades may mark the beginning of a self-sustaining and irreversible period of ice sheet retreat and substantial global sea-level rise,” says study leader Dr. Michael Weber from the University of Bonn.

Combining the sediment record with supercomputer models of ice sheet behavior, the team showed that each episode of increased iceberg calving reflected an increased loss of ice from the interior of the ice sheet, not just changes in the already-floating ice shelves. “We found that iceberg calving events on multi-year time scales were synchronous with discharge of grounded ice from the Antarctic Ice Sheet,” said Prof. Nick Golledge from Victoria University of Wellington (New Zealand), who led the ice-sheet supercomputer modeling.

Dr. Zoë Thomas, a co-author of the study from the University of New South Wales in Sydney, Australia, then applied statistical methods to the model outputs to see if early warning signs could be detected for tipping points in the ice sheet system. Her analyses confirmed that tipping points did indeed exist. “If it just takes one decade to tip a system like this, that’s actually quite scary because if the Antarctic Ice Sheet behaves in the future like it did in the past, we must be experiencing the tipping right now”, Thomas said.
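Early-warning analyses of this kind typically look for "critical slowing down": as a system approaches a tipping point, its fluctuations become more persistent, so the lag-1 autocorrelation of a monitored time series drifts upward toward 1. The sketch below illustrates the idea on a synthetic series; it is a generic textbook indicator, not the study's actual statistical pipeline, and all names and parameters are invented for illustration:

```python
import numpy as np

def rolling_lag1_autocorr(x, window):
    """Rolling lag-1 autocorrelation of a series.

    Values rising toward 1 are a classic early-warning indicator of
    "critical slowing down" before a tipping point.
    """
    out = []
    for i in range(len(x) - window + 1):
        seg = np.asarray(x[i:i + window], dtype=float)
        seg = seg - seg.mean()  # remove the local mean before correlating
        out.append(float(np.corrcoef(seg[:-1], seg[1:])[0, 1]))
    return np.array(out)

# Toy series: an AR(1) process whose persistence slowly increases,
# mimicking a system losing resilience as it nears a tipping point.
rng = np.random.default_rng(1)
n = 1000
phi = np.linspace(0.2, 0.95, n)  # slowly increasing "memory"
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()

indicator = rolling_lag1_autocorr(x, window=200)
print(indicator[0], indicator[-1])  # the indicator rises along the series
```

In a real analysis the input would be a detrended observable of the ice-sheet system rather than a synthetic AR(1) process, and the rise in the indicator would be tested for statistical significance.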

According to Weber, when we might see the eventual stabilization of the ice sheet is unknown, because it will depend significantly on how much future climate warming occurs.

Texas A&M professor finds late season storms have greater potential for intensifying than early season storms

Texas A&M oceanographer and assistant professor Dr. Henry Potter has gathered evidence suggesting that tropical storms in the late hurricane season have a better chance of intensifying than early season storms. In his new research, Potter explains that differences in upper ocean temperatures between the two times of year are key to cyclone strength and longevity.

“My interest was trying to understand variability in upper ocean heat throughout hurricane season,” Potter said.

Tropical cyclones get their energy by drawing heat from the warm upper waters of the ocean. Throughout the season, from June to November, temperatures vary both at and below the surface, which can influence how strong a storm becomes. Sea surface temperatures can be tracked by satellite, but subsurface temperatures cannot, making intensity predictions more difficult.

The most intense storms tend to occur during the peak months of August and September. However, two nearly identical storms that occur in June and November could behave very differently because of these differences in upper ocean temperatures.

In the early summer months, the upper ocean has only a thin layer of warm water, while in the later months of the season there is a much deeper layer of warm water for storms to draw energy from. As a storm draws heat from the upper layer, it also draws up cooler waters from below, cooling the surface and thus reducing the heat energy available for intensification. Later in the season, the cooler waters lie further down, making it harder for the storm to cool the upper waters and increasing the possibility of storm intensification.

For this study, Potter collected ten years of data from the Argo program, part of the Global Ocean Observing System. The Argo program consists of about 4,000 drifting floats deployed globally, each measuring water temperature from the surface down to 2,000 m every 10 days. The number of operational floats changes daily, and about 20-30 floats are operating in the Gulf of Mexico at any given time. Potter used the temperature profiles to calculate the tropical cyclone heat potential.

“The tropical cyclone heat potential is a metric used by the hurricane community that helps improve hurricane forecasts when used in addition to sea surface temperature,” Potter said. “It is important to know what the temperature profile is below the surface, so we have a better idea of how much the ocean is likely to cool due to the hurricane mixing it.”
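A widely used definition of tropical cyclone heat potential integrates the ocean's heat content in excess of 26°C from the surface down to the depth of the 26°C isotherm. The sketch below illustrates that standard calculation on an idealized profile; the seawater constants and the example profile are assumptions for illustration, not values from Potter's study:

```python
import numpy as np

# Typical seawater constants (assumed here, not taken from the study)
RHO = 1026.0   # density, kg/m^3
CP = 3990.0    # specific heat, J/(kg K)
T_REF = 26.0   # reference isotherm for TCHP, deg C

def tchp(depths_m, temps_c):
    """Tropical cyclone heat potential (kJ/cm^2) from a temperature profile.

    Integrates the heat content in excess of 26 degC from the surface down
    to the 26 degC isotherm, following the standard definition.
    """
    depths = np.asarray(depths_m, dtype=float)
    excess = np.clip(np.asarray(temps_c, dtype=float) - T_REF, 0.0, None)
    # Trapezoidal integration over depth gives J/m^2
    joules_per_m2 = RHO * CP * float(
        np.sum((excess[:-1] + excess[1:]) / 2.0 * np.diff(depths)))
    return joules_per_m2 / 1e7  # 1 kJ/cm^2 = 1e7 J/m^2

# Idealized late-season profile with a deep warm layer (hypothetical values)
depths = [0, 20, 40, 60, 80, 100]             # m
temps = [29.5, 29.0, 28.0, 27.0, 26.0, 24.0]  # deg C
print(f"TCHP = {tchp(depths, temps):.1f} kJ/cm^2")
```

An early-season profile, with the same surface temperature but a 26°C isotherm much closer to the surface, would yield a far smaller heat potential, which is exactly the seasonal contrast the study highlights.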

“This is very timely work. The role of upper ocean heat content in tropical cyclone intensification is an important scientific and societal subject,” said Dr. Anthony Knap, oceanography professor and director of Texas A&M’s Geochemical & Environmental Research Group (GERG). “Although storm track forecasting has improved significantly in the past 30 years, long-term intensity forecasts have lagged.”

“I have had a long-term interest in this problem. GERG has a series of gliders which are deployed to help fill the gaps,” Knap said. “However, we need more support to keep these gliders in the water pre-hurricane season in the Gulf of Mexico.”

“Hurricanes do a lot of damage and upend a lot of people’s lives in very serious ways,” Potter said. “One of the best things we can do, as a community of hurricane scientists, is to produce reliable forecasts so that people heed evacuation warnings.”

Stuttgart physicists develop a new supercomputer simulation for describing the attachment of a liquid to a surface

Liquids containing ions or polar molecules are ubiquitous in many applications needed for green technologies such as energy storage, electrochemistry, or catalysis. When such liquids are brought to an interface such as an electrode – or even confined in a porous material – they exhibit unexpected behavior that goes beyond the effects already known. Recent experiments have shown that the properties of the material employed, which can be insulating or metallic, strongly influence the thermodynamic and dynamic behavior of these fluids.

To shed more light on these effects, physicists at the University of Stuttgart, Université Grenoble Alpes, and Sorbonne Université Paris have developed a novel supercomputer simulation strategy using a virtual fluid. It allows the electrostatic interactions within any material to be taken into account while remaining computationally efficient enough to study the properties of fluids at such interfaces. The new method has now made it possible for the first time to study the wetting transition at the nanoscale, which depends on whether the ionic liquid encounters a material with insulating or metallic properties. This approach provides a new theoretical framework for predicting the unusual behavior of charged liquids, especially in contact with nanoporous metallic structures, and has direct applications in the fields of energy storage and the environment.

Image: Schematic representation of an imperfect metal on which ions and their smeared-out mirror charges are shown. Photo: University of Stuttgart / Alexander Schlaich

Despite their key role in physics, chemistry, and biology, the behavior of ionic or dipolar liquids near surfaces – such as those of a porous material – remains puzzling in many respects. One of the greatest challenges in the theoretical description of such systems is the complexity of the electrostatic interactions. For example, an ion near a perfect metal produces an inverse counter-charge, which corresponds to its negative mirror image. In contrast, no such image charges are induced in a perfect insulator because there are no freely moving electrons. However, any real, i.e., non-idealized, material has properties that lie between these two idealized limits. Accordingly, the metallic or insulating nature of the material is expected to have a significant influence on the properties of the adjacent fluid. Established theoretical approaches reach their limits here, since they assume either perfectly metallic or perfectly insulating materials. To date, there is a gap in the description when it comes to explaining the observed surface properties of real materials, in which the mirror charges are smeared out.
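The two limiting cases can be made concrete with a textbook result: the classical image-charge interaction energy of a point charge at distance d from a dielectric half-space carries a prefactor (ε_r − 1)/(ε_r + 1) that vanishes for a perfect insulator (ε_r = 1) and approaches 1 for a perfect metal (ε_r → ∞). The sketch below evaluates this standard formula; it illustrates the interpolation between the limits discussed in the text and is not the Stuttgart group's simulation method:

```python
import math

EPS0 = 8.8541878128e-12     # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19  # elementary charge, C

def image_energy(q, d, eps_r):
    """Classical image-charge interaction energy (J, negative = attractive)
    of a point charge q at distance d from a dielectric half-space with
    relative permittivity eps_r.

    The prefactor (eps_r - 1)/(eps_r + 1) interpolates between a perfect
    insulator (eps_r = 1, no image charge) and a perfect metal
    (eps_r -> infinity, full image charge).
    """
    factor = (eps_r - 1.0) / (eps_r + 1.0)
    return -factor * q**2 / (16.0 * math.pi * EPS0 * d)

d = 0.5e-9  # an ion 0.5 nm from the surface (hypothetical distance)
for eps in (1.0, 10.0, 1e9):  # insulator, intermediate material, near-perfect metal
    print(f"eps_r = {eps:g}: E = {image_energy(E_CHARGE, d, eps):.3e} J")
```

Real electrode materials fall between the ε_r = 1 and ε_r → ∞ rows of this table, which is precisely the regime the virtual-fluid method is designed to capture.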

In their recent investigation, Dr. Alexander Schlaich from the University of Stuttgart and colleagues present a new atomic-scale simulation method that allows the adsorption of a liquid to a surface to be described while explicitly considering the electron distribution in the metallic material. While common methods consider surfaces made of either an insulating material or a perfect metal, the team has developed a method that mimics the electrostatic screening caused by any material between these two extremes. The essential point of this approach is to describe the Coulombic interactions in the metallic material by a "virtual" fluid composed of light, fast charged particles, which create electrostatic screening by reorganizing in the presence of the real fluid. This strategy is particularly easy to implement in any standard atomistic simulation environment and can be easily transferred. In particular, the approach allows the capacitive behavior of realistic systems, as used in energy storage applications, to be calculated.

As part of the SimTech cluster of excellence at the University of Stuttgart, Alexander Schlaich is using such simulations of porous, conductive electrode materials to optimize the efficiency of the next generation of supercapacitors, which can deliver enormous power density. The wetting behavior of aqueous salt solutions in realistic porous materials is also the focus of his contribution to the Stuttgart Collaborative Research Center 1313 "Interface-driven multi-field processes in porous media – flow, transport, and deformation," which also investigates precipitation and evaporation processes related to soil salinization. The developed methodology is thus relevant for a wide range of systems, as well as for further research at the University of Stuttgart.

CMCC Foundation explores the ML potential for climate change risk assessment

Large amounts of data, and new methods and technologies with which to analyze them: machine learning, a branch of artificial intelligence, is put at the service of climate studies in new research by the CMCC Foundation and Ca’ Foscari University of Venice.

Global warming is exacerbating weather and climate extreme events. The interaction between different forms of hazards triggered by climate change will cause future cross-sectoral impacts affecting a variety of natural and human systems.

Research can improve the understanding of these interactions and dynamics and support decision-makers in managing current and future climate change risks, thanks in part to an improved ability to predict expected risks and quantify their impacts.

To this end, in recent years the scientific community has started testing new methodological approaches, technologies, and tools, including machine learning, which can help exploit the potential of the large volumes and variety of environmental monitoring data available today (big data).

What are the results of the exponential increase in the application of machine learning methods for the assessment of climate-induced risks?

In the study “Exploring machine learning potential for climate change risk assessment“, a team of scientists from the CMCC Foundation and Ca’ Foscari University of Venice conducted an in-depth review of more than 1,200 articles on the subject, published in the last 20 years, highlighting the potential and limitations of machine learning in this field.

“Machine learning is a branch of artificial intelligence,” explains Federica Zennaro, a researcher at the CMCC Foundation and Ca’ Foscari University of Venice and the lead author of the study. “By simulating the processes of the human brain, certain mathematical algorithms can learn the relationships within a set of input data in order to predict the required output. In our research, we identified floods and landslides as the events most often analyzed with machine learning models, probably because they are the most relevant and frequent around the world.”

Moreover, the study reveals that machine learning has two major potentials that make it particularly interesting when applied to this field of study.

The first is that these algorithms learn from data: the more data there is, the better they learn. Thanks to its ability to analyze and process large amounts of data, machine learning allows researchers to disentangle the complex relationships underlying the functioning of socio-ecological systems, exploiting the big data collected from various sources, including high-frequency environmental sensors, social media, satellite data and imagery, and drones.

The second is that they can combine different types of data, enabling an assessment of the extent of a risk that takes into account all its dimensions. These include not only the triggering hazard (for example, an increase in rainfall) but also the vulnerability and exposure of the socio-economic system at stake, which are crucial factors in an evaluation of overall impacts.

“For example, consider a model that is trained with detailed data on flood events over the past 20 years, including their location and information on the affected context (urban or natural). This model can project, in a scenario characterized by future climate conditions, what the probability of an event happening at a certain point will be, and calculate its risk of causing harmful impacts to society and the environment,” Zennaro explains. “Machine learning represents the future of risk assessment, but its great potential is not yet widely exploited. Our research shows that there are still few studies that use these models to develop long-term future risk scenarios (up to 2100). The vast majority of studies focus on the short term, probably influenced by the reduced availability of extended time series data capable of supporting adequate model training for long-term projections.”
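As a toy illustration of the kind of model Zennaro describes, the sketch below fits a logistic regression by plain gradient descent to synthetic "flood event" records with two hypothetical features, a rainfall anomaly and an exposure index. Every name, coefficient, and data point here is invented for illustration and has no connection to the study's actual models or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 500 "events" with two hypothetical features;
# labels follow a known logistic rule plus sampling noise.
n = 500
rainfall = rng.normal(0.0, 1.0, n)   # rainfall anomaly (standardized)
exposure = rng.normal(0.0, 1.0, n)   # exposure index (standardized)
X = np.column_stack([np.ones(n), rainfall, exposure])  # intercept + features
logits_true = -0.5 + 2.0 * rainfall + 1.0 * exposure
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits_true))).astype(float)

# Logistic regression fitted by plain gradient descent (convex problem)
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * (X.T @ (p - y)) / n

def flood_probability(rain_anomaly, exposure_index):
    """Predicted probability of a damaging flood under given conditions."""
    z = w[0] + w[1] * rain_anomaly + w[2] * exposure_index
    return 1.0 / (1.0 + np.exp(-z))

# Heavier rainfall and higher exposure should raise the predicted risk
print(flood_probability(-1.0, 0.0), flood_probability(2.0, 1.0))
```

A real risk assessment would replace the synthetic records with observed event catalogs and would typically use richer model families, but the structure is the same: learn the hazard-exposure-impact relationship from past events, then evaluate it under projected future conditions.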

The next step, explains co-author Elisa Furlan, a researcher at the CMCC Foundation and Ca’ Foscari University of Venice, is to develop machine learning models that are increasingly efficient at studying and untangling the complex spatiotemporal interrelationships among different climatic, environmental, and socioeconomic variables, thereby improving understanding of the behavior of complex systems. “With the growing abundance of data and the increasing complexity of machine learning models, researchers will have the possibility (and duty) to improve the understanding of climate-related risks, with the main aim of providing accurate and sound multi-risk scenarios able to drive robust adaptation planning and disaster risk reduction and management.”