University of Copenhagen professor develops algorithm that reveals the mysterious foraging habits of narwhals

An algorithm can predict when narwhals hunt - something that was once nearly impossible to observe. Mathematicians and computer scientists at the University of Copenhagen, together with marine biologists in Greenland, have made progress in gathering knowledge about this enigmatic Arctic whale at a time when climate change is putting it under increasing pressure.

The small whale, known for its distinctively spiraled tusk, is under mounting pressure from warming waters and the resulting increase in Arctic shipping traffic. To better protect narwhals, we need to learn more about their foraging behavior - and how it may change as a result of human disturbance and global warming. Biologists know almost nothing about this. Because narwhals live in isolated Arctic regions and hunt at depths of up to 1,000 meters, it is very difficult - sometimes impossible - to gain any insight whatsoever.

Researchers tagging a narwhal in East Greenland. CREDIT: Carsten Egevang.

Ironically, artificial intelligence may be the answer to the mystery of their natural behaviors. An interdisciplinary collaboration between mathematicians, computer scientists, and marine biologists from the University of Copenhagen and the Greenland Institute of Natural Resources demonstrates that algorithms can be used to map the foraging behavior of this enigmatic whale.

"We have shown that our algorithm can actually predict that when narwhals emit certain sounds, they are hunting prey. This opens up entirely new insights into the life of narwhals," explains Susanne Ditlevsen, a professor at UCPH's Department of Mathematical Sciences who has helped marine biologists in Greenland with the processing of data for several years.

"It is crucial to gain more insight into where and when narwhals hunt for food as sea ice recedes. If they are disturbed by shipping traffic, it matters whether this is in the middle of an important foraging area. Finding out, however, is incredibly difficult. Here, artificial intelligence seems to be able to make a huge difference and to a great extent, provide us with the knowledge that could not otherwise have been obtained," says cetacean researcher Mads Peter Heide-Jørgensen, a professor at the Greenland Institute of Natural Resources and adjunct professor at the University of Copenhagen. He adds:

"In a situation where narwhals are in deep water, in the middle of Baffin Bay during December, we currently have no way of finding out where or when they are foraging. Here, artificial intelligence seems to be the way forward."

The algorithm maps clicks and buzzes

Until now, the best way to learn about the hunting patterns of narwhals has been to collect acoustic data using measuring instruments attached to their bodies. Like bats, narwhals orient themselves using echolocation: by emitting clicking sounds and listening for the echoes, they explore their surroundings. As they begin to hunt, the intervals between clicks shorten until the clicks merge into a buzz.

While the buzzing sounds are therefore interesting to researchers, it is impossible to collect acoustic data in many places. Furthermore, recording these sounds is highly data-intensive and time-consuming to analyze manually.

As a result, the researchers set out to investigate whether, using artificial intelligence, they could find a pattern linking the way the whales move to the buzzes they emit. In the future, this would make it possible to rely solely on measurements of the animals' movements from an accelerometer - a simple technology familiar to us from our smartphones.

"The major challenge was that these whales have very complex movement patterns, which can be tough to analyze. This becomes possible only with the use of deep learning, which could learn to recognize both the various swimming patterns of whales as well as their buzzing sounds. The algorithm then discovered connections between the two," explains Assistant Professor Raghavendra Selvan of the Department of Computer Science.
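As a rough illustration of the general idea (not the team's actual model), the link between movement data and buzzing can be sketched as a supervised classifier over short accelerometer windows. Everything below - the feature choices, the simple logistic classifier standing in for the deep network, and the synthetic "calm" versus "jerky" data - is a hypothetical stand-in:

```python
import numpy as np

def extract_features(window):
    """window: (n_samples, 3) accelerometer readings -> small feature vector."""
    return np.concatenate([
        window.mean(axis=0),                            # mean acceleration per axis
        window.std(axis=0),                             # movement intensity per axis
        np.abs(np.diff(window, axis=0)).mean(axis=0),   # "jerkiness" per axis
    ])

def train_logistic(X, y, lr=0.1, steps=500):
    """Minimal gradient-descent logistic regression (stand-in for deep learning)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of "buzzing"
        grad = p - y
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def predict(w, b, X):
    return (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int)

# Synthetic demo data: hypothetical "hunting" windows are jerkier than "travelling" ones.
rng = np.random.default_rng(0)
calm  = rng.normal(0, 0.2, (50, 100, 3))   # smooth swimming, no buzz
jerky = rng.normal(0, 1.0, (50, 100, 3))   # erratic pursuit movements, buzz
X = np.array([extract_features(w) for w in np.concatenate([calm, jerky])])
y = np.array([0] * 50 + [1] * 50)

w, b = train_logistic(X, y)
accuracy = (predict(w, b, X) == y).mean()
```

In this toy setup the two movement regimes are easily separable; the real difficulty the researchers describe is that wild narwhal movement is far more complex, which is why deep learning was needed.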

The researchers trained the algorithm using large quantities of data collected from five narwhals in Scoresby Sound fjord in East Greenland.

Now, the researchers hope to add to the algorithm by characterizing different types of buzzing sounds to identify the precise buzzing sounds that lead to a catch. This can be achieved by collecting data in which biologists give whales a temperature pill that detects temperature drops in their stomachs as they consume cold fish or squid.

Purdue engineers propose an energy-efficient method for using 100 percent outdoor air in buildings

By now, it’s well known that circulating outdoor air in buildings is safer than recirculating indoor air. That point was driven home by the pandemic. Problem is, it’s just not cost-effective.

That may soon change. Purdue University engineers have proposed a system that combines new membrane technology with the latest HVAC systems to make 100% outdoor air systems more energy-efficient and economically feasible – especially in warm, humid climates. They say their system could save up to 66% in energy costs for large buildings that choose to use the safer outdoor air.

Using 100% outdoor air in an HVAC system is safer, but much more costly and energy intensive. Purdue University engineers (left to right) David Warsinger, Andrew Fix, and James Braun are working on an energy-efficient system that uses membranes to dehumidify outdoor air before it enters the HVAC system. (Purdue University photo/Jared Pike)

Previous research at Purdue has shown that HVAC systems (heating, ventilating, and air conditioning) are a key factor in spreading airborne diseases like COVID in indoor environments like office buildings, restaurants, and airplanes.

“Most people don’t realize the complexity of a modern HVAC system,” said James E. Braun, the Herrick Professor of Engineering and director of the Center for High-Performance Buildings at Purdue. “There’s a specific sweet spot for humidity in an indoor environment — between 40% and 60%. Any drier than that, and people aren’t comfortable; any more humid, and you’re at risk for mold and other problems.”

So, simply opening windows is not a solution.

“If you introduce outdoor air, the humidity levels of a building can fluctuate wildly. It’s an incredible challenge to maintain the right balance between temperature, humidity, human comfort, and overall cost.”

In a typical HVAC system, Braun says, almost 40% of the energy is used to dehumidify the air. That makes the heating or cooling of outdoor air even more energy-intensive and costly.

To solve this problem, Braun teamed up with David Warsinger, assistant professor of mechanical engineering, who specializes in using membranes for water filtration and desalination. They have proposed a system called the Active Membrane Energy Exchanger, which integrates specialized membranes into the HVAC system to reduce the energy required to dehumidify the outside air. Large buildings like hospitals could reduce their energy costs by up to 66% with such a system, compared to current fully outdoor air systems.

Their research has been published in Applied Energy.

“The membrane is the key,” said Andrew Fix, a Purdue doctoral student in mechanical engineering and lead author of the paper. We use membranes that are vapor selective, meaning they only allow water vapor to pass through when a pressure difference is applied but block air. By passing the air over these membranes, we can pull water vapor out of the air, reducing the load on the motors and compressors that run the refrigeration cooling cycle.”

To gauge the system’s effectiveness, the team used supercomputer models of hospitals in different climate conditions, created by Pacific Northwest National Laboratory. Hospitals are ideal test beds because they are large indoor environments that often require a higher percentage of outdoor air in their HVAC systems for safety purposes. The models showed an overall reduction in energy usage for all locations using the Active Membrane system. The hottest and most humid locations – Tampa, Houston, and New Orleans – showed the greatest energy savings.

“The more hot and humid it gets, the better our system works,” Fix said. “This is a key finding, because as the climate continues to warm around the world, locations that want to use 100% outdoor air will now be able to economically afford it.”

The researchers are working toward building a physical prototype to validate their computer models. But there’s now more at stake than simply saving energy.

“I think COVID was a wake-up call for all of us,” Fix said. “Heating and cooling our buildings is not just a matter of temperature and humidity, but it can actually be a matter of life and death. Hopefully, this work will help to make all of our indoor spaces safer.”

Germany's Helmholtz AI seeks a faster pathway to synthetic data

Helmholtz Association funds project for data acquisition using neural networks

In addition to experimentally generated data, fundamental research in physics also works with synthetically generated data. Acquiring such data with currently available simulation methods is, however, time-consuming and ties up immense supercomputer capacity. A new project by the Deutsches Elektronen-Synchrotron (DESY, Hamburg), the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), and the Center for Advanced Systems Understanding (CASUS, Görlitz) at the HZDR is testing an approach with which data on the behavior of physical systems can be generated more quickly using neural networks. The "SynRap" project was selected for funding through a competitive process. The fifteen winning teams are to receive a total of 6.2 million euros over the next few years to carry out their projects through the Helmholtz Association's Artificial Intelligence Cooperation Unit, Helmholtz AI.

An event recorded with a particle detector, from which ten particle streams (known as "jets", represented as orange cones), a certain elementary particle (a muon, red line), and additional elementary particles (yellow lines) have emerged. The visualization is based on data measured with the particle detector. In addition to experimental data, simulated or synthetic data are also commonly used in high energy physics. For the correct generation of synthetic data sets, all physical boundary conditions must be taken into account, which makes this task computationally intensive. The use of neural networks could considerably accelerate the production of synthetic data sets. With large amounts of synthetic data generated without great effort, scientists could in the future test more hypotheses on the standard model of elementary particle physics and beyond. The illustration contains only a few of the detector's elements, such as the inner layers (blue) and the muon chambers (gray). The muon chambers are anchored in the steel yoke (red), which guides the strong magnetic fields necessary for precise measurement of the particles.

Synthetic data are data generated by computer algorithms. They are used, for example, to test software or to pass on anonymized personal data. The training of machine learning algorithms is another important field of application. Here, synthetic data are needed primarily for training deep neural networks, a specific type of computer algorithm from the domain of artificial intelligence. Dr. Isabell Melzer-Pellmann, group leader at DESY, explains, "These algorithms must be trained with particularly large data sets in order to produce accurate results when analyzing experimental data." Currently, these training data are created with complex numerical simulation methods from the field of quantum mechanics. These methods are, however, computationally intensive and consume a great deal of supercomputer time.

A faster alternative will now be studied within the project "SynRap - Machine-learning based synthetic data generation for rapid physics modeling." The aim of Melzer-Pellmann and her colleagues Dr. Dirk Krücker from DESY, Dr. Attila Cangi from CASUS, and Dr. Nico Hoffmann from the Institute of Radiation Physics at HZDR is to accelerate the production of large amounts of synthetic data by a factor of one thousand. To do so, the team plans to compile a toolbox of machine learning algorithms suitable for this purpose. These algorithms stem from a particular class of neural networks known as surrogate neural networks, which, rather than being trained on synthetic data like the deep networks described above, learn to stand in for the expensive simulations that generate it.
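As a toy illustration of the surrogate idea (not the SynRap code), a small neural network can be trained to reproduce the output of an "expensive" simulation; once trained, evaluating the network is far cheaper than rerunning the simulation. The stand-in simulator, network size, and training setup below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_simulation(x):
    """Stand-in for a costly physics simulation (here just a cheap function)."""
    return np.sin(3 * x) * np.exp(-x**2)

# Run the "simulator" once to build a modest training set.
X = rng.uniform(-2, 2, (256, 1))
Y = expensive_simulation(X)

# One-hidden-layer surrogate network, trained by hand-written backpropagation.
W1 = rng.normal(0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)               # hidden activations
    pred = H @ W2 + b2                     # surrogate prediction
    err = pred - Y
    # Gradients of mean-squared error with respect to each parameter.
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)         # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# After training, the surrogate approximates the simulator on its input range.
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
```

The real payoff, as the project description notes, comes when each true simulation run takes hours of supercomputer time: the surrogate amortizes that cost over many fast evaluations.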

The quality of the data sets created in this way will be assessed by applications from the research areas of high energy density physics and high energy physics. High energy density physics focuses on phenomena in the interior of planets and stars as well as on applied research, such as the processing of materials with powerful lasers. High energy physics tackles fundamental questions concerning the nature of our universe: What is matter made of? What laws govern the interactions of the components of matter?

One toolbox - many areas of application

"The unique feature of our project is that our toolbox of various neural networks should ultimately find use in many research areas," explains Cangi. This is why the needs of other natural sciences will also be taken into consideration when developing the software tools. "The interdisciplinary nature of CASUS enables us to orient our work on typical user scenarios from the environmental sciences, systems biology, and others," adds Cangi.

The Helmholtz Artificial Intelligence Cooperation Unit (Helmholtz AI) strengthens the application and development of applied artificial intelligence and machine learning. Within the Helmholtz AI competition, a panel of experts selected in particular those research projects that promise a high gain in insight. Such projects, however, are also considered especially risky: there is a real chance that insurmountable problems will arise and the stated project objectives will not be achieved. The Helmholtz Association is awarding a total of 6.2 million euros in the current funding cycle. During the first Helmholtz AI call a year ago, nineteen projects were awarded a total of 7.2 million euros.