German researchers use satellite data, AI to determine the land-use intensity of meadows, pastures

Extensively used grassland hosts a high degree of biodiversity, performs an important climate-protection function as a carbon sink, and is also used for fodder and food production. However, these ecosystem services are jeopardized when productivity on these lands is maximized and their use is therefore intensified. Until now, data on the condition of meadows and pastures in Germany have been unavailable for larger areas. In the journal Remote Sensing of Environment, researchers at the Helmholtz Centre for Environmental Research (UFZ) have now described how satellite data and machine learning methods enable the assessment of land-use intensity.

Maps a-d depict the grassland management regime and the corresponding land-use intensity derived from satellite data for a 10 km × 10 km area in the district of Oberallgäu (Bavaria) in 2018. (a) Grazing classes (0-3; low to high grazing intensity). (b) Frequency of mowing (0-4). (c) Fertilization (yes/no). (d) Land-use intensity index (LUI): the values are grouped into five classes for Germany, with colors ranging from green (extensive use) to magenta (intensive use). Photo: UFZ

The Sentinel-2 space mission began with the launch of the Earth observation satellite Sentinel-2A in June 2015, and Sentinel-2B was launched in March 2017. Since then, these two satellites have been orbiting at an altitude of nearly 800 kilometers and, as part of the European Space Agency's (ESA) Copernicus program, providing data for, e.g., climate protection and land monitoring. Every three to five days, they record images in the visible and infrared range of the electromagnetic spectrum, which, with a very high resolution of up to 10 meters, provide a strong foundation for detecting features such as changes in vegetation. An interdisciplinary team of researchers from the Helmholtz Centre for Environmental Research (UFZ) has used this freely accessible data to study the land-use intensity of German grasslands for the years 2017 and 2018. According to the Federal Statistical Office, these grasslands cover an area of roughly 4.7 million hectares and hence nearly 30 percent of all agricultural land. "We need more information on the land-use intensity of grasslands to better understand the stability and functioning of our ecosystems. The more intensively grassland is used, the greater the influence on primary production, nitrogen deposition, and resilience to climate change," says lead author Dr. Maximilian Lange. He is a scientist in the UFZ Department of Remote Sensing, which is embedded in the "Remote Sensing Centre for Earth System Research" jointly funded by the UFZ and the University of Leipzig.

A prerequisite for the long-term preservation of grassland is regular management, e.g. cutting or grazing. If left unused, the areas undergo shrub encroachment. But the intensity of grassland management is critical to the areas' ability to provide ecosystem services. However, no Germany-wide data are publicly available on how farmers manage their grasslands. The UFZ scientist has now used the satellite data, at a resolution of 20 meters, to infer mowing frequency, grazing intensity of cattle, horses, sheep, and goats, and fertilization across Germany. "The magnitude of these three types of management is critical to the intensity of use," says Lange. He defined mowing frequency classes from 0 (not mowed) to 5 (mowed five times per year) and calculated a grazing intensity from 0 (not grazed) to 3 (heavily grazed) from a combination of livestock numbers, species, and age. For fertilization, he distinguished between fertilized and not fertilized. He combined these three categories into an index that indicates the management intensity of a grassland area, ranging from "extensive" to "intensive."
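The article does not spell out how the three management classes are combined into the index; as a purely illustrative sketch (the equal weighting and normalization below are assumptions, not the published LUI definition), the combination could look like this:

```python
def land_use_intensity(mowing, grazing, fertilized,
                       mowing_max=5, grazing_max=3):
    """Hypothetical land-use intensity index in [0, 1].

    mowing:     mowing frequency class, 0 (not mowed) .. mowing_max
    grazing:    grazing intensity class, 0 (not grazed) .. grazing_max
    fertilized: True if the parcel is fertilized
    """
    # Normalize each management type to [0, 1] and average them,
    # so each contributes equally to the combined index.
    return (mowing / mowing_max
            + grazing / grazing_max
            + (1.0 if fertilized else 0.0)) / 3.0

# Example: a parcel mowed twice a year, lightly grazed, and fertilized
print(round(land_use_intensity(2, 1, True), 3))
```

An unmanaged, unfertilized parcel scores 0 ("extensive"); one mowed five times, heavily grazed, and fertilized scores 1 ("intensive").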

He used artificial intelligence (AI) to derive information on the three usage parameters from the multi-dimensional data the researchers obtained from the satellite images. "AI can very efficiently derive information from data that are too complex for humans to comprehend. Reference data can be used to train machine learning algorithms to identify patterns in the satellite data that we can then evaluate and apply to draw conclusions for large areas," he says. Lange obtained the reference data from the field data of three Biodiversity Exploratories sponsored by the German Research Foundation (DFG) in Hainich, Schorfheide, and the Schwäbische Alb. Long-term experiments have been ongoing there since 2006 on grassland with different levels of land-use intensity. These experiments investigate topics such as how land use affects biodiversity and how changes in species composition affect ecosystem processes.

Lange used two algorithms to evaluate how accurately machine learning recognizes actual grassland use from the satellite data: Random Forest, a standard remote sensing method for classifying land cover, and convolutional neural networks (CNNs), a deep learning method primarily used in image processing. The result: "Both methods do a good job of representing reality, and the CNN method is slightly better," he says. With the CNN method, the UFZ researcher matched the reference data from the DFG Biodiversity Exploratories with accuracies ranging from 66 to 85 percent for the example year 2018 (grazing intensity 66 percent, mowing regime 68 percent, fertilization 85 percent). The Random Forest-based results were slightly lower for all three parameters. This is a high classification accuracy compared with similar ecological remote sensing studies, but it could be improved further if more data on grassland use were available. "The more data that can be used to train a deep learning method and the more accurate these data are, the more precise the results will be," says Lange. In a further step, he tested the results' plausibility in four example regions in Germany. Two of these regions (Oberallgäu and Dithmarschen) are known for their intensive grassland use, while one near the Rhön Biosphere Reserve sees only moderate use and the other, a nature reserve in Saxony-Anhalt, is used only extensively. This comparison, too, yielded a good match between the remote sensing-based results and the actual land use.
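As a minimal illustration of the Random Forest baseline mentioned above, the sketch below classifies mowing frequency classes from pixel-wise feature vectors. The synthetic data, feature layout, and hyperparameters are all invented for the example and are not the study's actual Sentinel-2 setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy stand-in for a per-pixel time series: 20 acquisition dates x 2 bands
# (e.g. red and near-infrared), flattened into a 40-dimensional vector.
n_pixels, n_features = 2000, 40
X = rng.normal(size=(n_pixels, n_features))

# Toy mowing-frequency labels (classes 0-4), made predictable by
# shifting the features of each class by a constant offset.
y = rng.integers(0, 5, size=n_pixels)
X += y[:, None] * 0.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

On real data, the features would be reflectance time series per pixel and the labels would come from the Exploratories' field records; the workflow (train on reference parcels, predict nationwide) is the same.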

Overall, the UFZ team found that grassland was used less intensively in Germany in 2018 than in 2017. "This is primarily due to the drought in 2018 and the associated loss of grassland productivity," says Dr. Daniel Doktor, last author of the publication and head of the UFZ Land Cover & Dynamics Working Group. For example, the calculations show that 64 percent of the grassland was not mown in 2018, compared with only 36 percent in 2017. "The results also show the management differences across Germany. Management is often very intensive in regions such as the Allgäu or Schleswig-Holstein, while it is far more extensive in Brandenburg or parts of Saxony," he says. But this evaluation is only the beginning: more accurate management data from further regions of Germany are needed so that the machine learning algorithms can draw even more precise conclusions.

UCF researchers demo 40-channel optical communication link

The silicon-based device could help meet the ever-growing need to move more data faster

Researchers have demonstrated a silicon-based optical communication link that combines two multiplexing technologies to create 40 optical data channels that can simultaneously move data. The new chip-scale optical interconnect can transmit about 400 gigabits of data per second, the equivalent of about 100,000 streaming movies. This could improve data-intensive internet applications, from video streaming services to high-capacity transactions for the stock market.

The researchers designed and optimized a mode-division multiplexer that transforms each of the 10 wavelengths into four new beams, each with a different shape. This fourfold increase in data capacity creates 40 channels. Credit: Kiyoul Yang, Stanford University

“As demands to move more information across the internet continue to grow, we need new technologies to push data rates further,” said Peter Delfyett, who led the University of Central Florida College of Optics and Photonics (CREOL) research team. “Because optical interconnects can move more data than their electronic counterparts, our work could enable better and faster data processing in the data centers that form the backbone of the internet.”

A multi-institutional group of researchers describes the new optical communication link in the Optica Publishing Group journal Optics Letters. It achieves 40 channels by combining a frequency comb light source based on a new photonic crystal resonator developed by the National Institute of Standards and Technology (NIST) with an optimized mode-division multiplexer designed by the researchers at Stanford University. Each channel can be used to carry information much like different stereo channels, or frequencies, transmit different music stations.

“We show that these new frequency combs can be used in fully integrated optical interconnects,” said Chinmay Shirpurkar, co-first author of the paper. “All the photonic components were made from silicon-based material, which demonstrates the potential for making optical information handling devices from low-cost, easy-to-manufacture optical interconnects.”

In addition to improving internet data transmission, the new technology could also be used to make faster optical computers that could provide the high levels of supercomputing power needed for artificial intelligence, machine learning, large-scale emulation, and other applications.

Using multiple light dimensions

The new work involved research teams led by Firooz Aflatouni of the University of Pennsylvania, Scott B. Papp from NIST, Jelena Vuckovic from Stanford University, and Delfyett from CREOL. It is part of the DARPA Photonics in the Package for Extreme Scalability (PIPES) program, which aims to use light to vastly improve the digital connectivity of packaged integrated circuits using micro comb-based light sources.

The researchers created the optical link using tantalum pentoxide (Ta2O5) waveguides on a silicon substrate, fabricated into a ring with a nanopatterned oscillation on the inner wall. The resulting photonic crystal micro-ring resonator turns a laser input into ten different wavelengths. The researchers also designed and optimized a mode-division multiplexer that transforms each wavelength into four new beams, each with a different shape. Adding this spatial dimension enables a fourfold increase in data capacity, creating 40 channels.

Researchers demonstrated a silicon-based optical communication link that combines two multiplexing technologies to create 40 optical data channels. The ring-shaped photonic crystal resonator (left) features a nanopattern inside (right) that splits a selected resonant mode for comb generation. Images taken with scanning electron microscopy. Credit: Su-Peng Yu, NIST
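The channel count is simply the product of the two multiplexing dimensions; a small sketch (the per-channel data rate is an assumed figure for illustration, not a number from the paper):

```python
wavelengths = 10   # comb lines from the photonic crystal resonator
modes = 4          # spatial beam shapes from the mode-division multiplexer
channels = wavelengths * modes   # wavelength- x mode-division multiplexing
print(channels, "independent data channels")

# If each channel carried, say, 10 Gb/s (an assumed figure), the
# aggregate link rate would be:
per_channel_gbps = 10
print(channels * per_channel_gbps, "Gb/s aggregate")
```

This multiplicative scaling is the appeal of stacking multiplexing dimensions: adding a fourth spatial mode costs one more multiplexer port but multiplies the capacity of every wavelength.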

Once the data is encoded onto each beam shape and each beam color, the light is recombined back into a single beam and transmitted to its destination. At the final destination, the wavelengths and beam shapes are separated so that each channel can be received and detected independently, without interference from the other transmitted channels.

“An advantage of our link is that the photonic crystal resonator enables easier soliton generation and a flatter comb spectrum than those demonstrated with conventional ring resonators,” said co-first author Jizhao Zang from NIST. “These features are beneficial for optical data links.”

Better performance with inverse design

The researchers used a computational nanophotonic design approach called photonic inverse-design to optimize the mode division multiplexer. This method provides a more efficient way to explore a full range of possible designs while offering smaller footprints, better efficiencies, and new functionalities.

“The photonic inverse-design approach makes our link highly customizable to meet the needs of specific applications,” said co-first author Kiyoul Yang from Stanford University.

Tests of the new device matched simulations well and showed that the channels exhibited low crosstalk of less than −20 dB. With less than −10 dBm of received optical power, the link performed error-free data transmission on 34 of the 40 channels using a PRBS31 pattern, a standard used to stress-test high-speed circuits.
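These figures are logarithmic; a short sketch of the standard dB and dBm conversions shows what they mean in linear terms:

```python
def db_to_ratio(db):
    """Convert a decibel value to a linear power ratio."""
    return 10 ** (db / 10)

def dbm_to_mw(dbm):
    """Convert dBm to absolute power in milliwatts (0 dBm = 1 mW)."""
    return 10 ** (dbm / 10)

# Crosstalk below -20 dB: interfering power is under 1% of the signal.
print("crosstalk ratio:", db_to_ratio(-20))
# Received power below -10 dBm: under 0.1 mW reaches the receiver.
print("received power (mW):", dbm_to_mw(-10))
```

In other words, each channel is detected cleanly even though the unwanted leakage from the other 39 channels is only suppressed to the percent level, and the receiver needs only a fraction of a milliwatt of light.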

The researchers are now working to further improve the device by incorporating photonic crystal micro-ring resonators that produce more wavelengths or by using more complex beam shapes. Commercializing these devices would require the full integration of a transmitter and receiver chip with high bandwidth, low power consumption, and a small footprint. This could enable the next generation of optical interconnects for use in data-center networks.

NASA engineers built Webb to endure micrometeoroid impacts

Micrometeoroid strikes are an unavoidable aspect of operating any spacecraft, and spacecraft routinely sustain many impacts throughout long and productive science missions in space. Between May 23 and 25, NASA’s James Webb Space Telescope sustained an impact on one of its primary mirror segments. After initial assessments, the team found that the telescope is still performing at a level that exceeds all mission requirements, despite a marginally detectable effect in the data.

Thorough analysis and measurements are ongoing. Impacts will continue to occur throughout the entirety of Webb’s lifetime in space; such events were anticipated when building and testing the mirror on the ground. After a successful launch, deployment, and telescope alignment, Webb’s beginning-of-life performance is still well above expectations, and the observatory is fully capable of performing the science it was designed to achieve.

Webb’s mirror was engineered to withstand bombardment by dust-sized particles flying at extreme velocities, the micrometeoroid environment at its orbit around Sun-Earth L2. While the telescope was being built, engineers used a mixture of simulations and actual test impacts on mirror samples to get a clearer idea of how to fortify the observatory for operation in orbit. This most recent impact was larger than was modeled and beyond what the team could have tested on the ground.

Artist's rendition of NASA's James Webb Space Telescope

“We always knew that Webb would have to weather the space environment, which includes harsh ultraviolet light and charged particles from the Sun, cosmic rays from exotic sources in the galaxy, and occasional strikes by micrometeoroids within our solar system,” said Paul Geithner, technical deputy project manager at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “We designed and built Webb with performance margin – optical, thermal, electrical, mechanical – to ensure it can perform its ambitious science mission even after many years in space.” For example, due to careful work by the launch site teams, Webb’s optics were kept cleaner than required while on the ground; their pristine cleanliness improves the overall reflectivity and throughput, thereby improving total sensitivity. This and other performance margins make Webb’s science capabilities robust to potential degradations over time.

Furthermore, Webb’s capability to sense and adjust mirror positions enables partial correction for the effects of impacts. By adjusting the position of the affected segment, engineers can cancel out a portion of the distortion. This minimizes the effect of any impact, although not all of the degradation can be canceled out this way. Engineers have already performed the first such adjustment for the recently affected segment C3, and additional planned mirror adjustments will continue to fine-tune this correction. These steps will be repeated as needed in response to future events as part of the monitoring and maintenance of the telescope throughout the mission.

To protect Webb in orbit, flight teams can use protective maneuvers that intentionally turn the optics away from known meteor showers before they occur. This most recent hit was not a result of a meteor shower and is currently considered an unavoidable chance event. As a result of this impact, a specialized team of engineers has been formed to look at ways to mitigate the effects of further micrometeoroid hits of this scale. Over time, the team will collect invaluable data and work with micrometeoroid prediction experts at NASA’s Marshall Space Flight Center to better predict how performance may change, bearing in mind that the telescope’s initial performance is better than expected. Webb’s tremendous size and sensitivity also make it a highly sensitive detector of micrometeoroids; over time, Webb will help improve knowledge of the solar system dust-particle environment at L2, for this and future missions.

“With Webb’s mirrors exposed to space, we expected that occasional micrometeoroid impacts would gracefully degrade telescope performance over time,” said Lee Feinberg, Webb optical telescope element manager at NASA Goddard. “Since launch, we have had four smaller measurable micrometeoroid strikes that were consistent with expectations and this one more recently that is larger than our degradation predictions assumed. We will use this flight data to update our analysis of performance over time and also develop operational approaches to assure we maximize the imaging performance of Webb to the best extent possible for many years to come.”