(Image: Pxhere.com)

Dutch scientists use AI for predicting calving problems before insemination

A small percentage of cows experience problems when calving, and breeders would like to know which cows are at risk. Using the vast dataset of the Dutch cattle breeding company CRV, computer scientists at the University of Groningen used artificial intelligence to develop a predictive model that could, in theory, halve the number of calving problems.

Cattle breeding is data science. Breeding firms provide semen from bulls and register the success of their offspring. Data on the milk yield of the cows and many other characteristics are collected and stored in a vast database, together with the genetic data from all the animals. This allows the companies to attribute an ‘estimated breeding value’ to the animals and find matches for optimal breeding.

Risk

One aspect of breeding is the birthing of calves. In about 3.3 percent of all cases, some kind of complication occurs during calving, which is referred to as dystocia. "This could range from the calf needing to be pulled to needing veterinary intervention," explains Ahmad Alsahaf. "There are models to predict the risk of dystocia, but these work with data only available after insemination. We wanted to produce a model that could predict the risk before insemination."

Alsahaf now works as a postdoctoral researcher at the Department of Biomedical Sciences of Cells & Systems of the University Medical Center Groningen, but he developed the predictive model for dystocia during his Ph.D. project in the Intelligent Systems research group at the Bernoulli Institute for Mathematics, Computer Science, and Artificial Intelligence at the University of Groningen in The Netherlands.

Challenges

"We were asked to create this model for the cattle breeding company CRV and they gave us a large dataset comprising information on cows and bulls," says Alsahaf. "We first used a machine-learning system to analyze the data and create a provisional model. Then, we checked if the most important risk factors made sense. They did and, therefore, we proceeded to build a full model."

There were two main challenges: the first was to clean up and compile the available data. The second was that only 3.3 percent of pregnant cows experience dystocia. "This meant that there was a huge imbalance in our dataset," explains Alsahaf. To solve this, he created a large number of subsets with balanced data and aggregated those to train the predictive model. "Subsequently, we tested this model on a subset of the data that was not used for training and studied the results." It turned out that the model performed significantly better than chance.
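The balanced-subset approach Alsahaf describes is, in spirit, an undersampling ensemble: repeatedly undersample the majority class to build balanced subsets, train one model per subset, and aggregate their votes. A minimal sketch in plain Python, with a toy dataset at the article's 3.3 percent event rate and a deliberately trivial threshold "model" standing in for the actual learner (the real study used far richer features):

```python
import random

random.seed(0)

# Toy imbalanced dataset: ~3.3% positives (dystocia), like the CRV data.
# Each record: (risk_score, label); positives tend to have higher scores.
data = [(random.gauss(2.0, 1.0), 1) for _ in range(33)] + \
       [(random.gauss(0.0, 1.0), 0) for _ in range(967)]

positives = [d for d in data if d[1] == 1]
negatives = [d for d in data if d[1] == 0]

def train_stump(subset):
    """A trivial 'model': threshold halfway between the class means."""
    pos = [x for x, y in subset if y == 1]
    neg = [x for x, y in subset if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

# Build many balanced subsets by undersampling the majority class,
# train one weak model per subset, then aggregate by majority vote.
thresholds = []
for _ in range(25):
    subset = positives + random.sample(negatives, len(positives))
    thresholds.append(train_stump(subset))

def predict(x):
    votes = sum(1 for t in thresholds if x > t)
    return 1 if votes > len(thresholds) / 2 else 0
```

Because each subset reuses all positives but a fresh random draw of negatives, the ensemble sees most of the majority-class data across subsets without any single model being swamped by it.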

"A colleague of ours calculated that, under ideal circumstances, our model could roughly halve the risk of dystocia. But this requires an ideal combination of bull and cow, which is not always possible." Nevertheless, the model can help farmers and the breeding company to assess the risk of a particular mating before insemination. "This is important since, so far, all other models require information gathered after insemination, which means you are not really preventing complications."

Credit: Carlos Padilla

ALMA successfully restarts observations after cyberattack

Forty-eight days after suspending observations due to a cyberattack, the Atacama Large Millimeter/submillimeter Array (ALMA) is observing the sky again. The computing staff has worked diligently to rebuild the affected JAO computer system servers and services. This is a crucial milestone in the recovery process. 

On 29 October, ALMA suffered a cyberattack. The computing staff took immediate countermeasures to avoid loss and damage to scientific data and IT infrastructure. The attack affected various critical operational servers and computers. 

“The challenge was to securely restore all the communication and computer systems as quickly as possible. We established an aggressive plan that required coordination with the ALMA partnership worldwide,” explains Jorge Ibsen, Head of the ALMA Computing Department. “Thanks to the active engagement of everyone in the partnership worldwide, especially the Computing, Engineering, and Science Operations staff, and the cybersecurity experts from ESO, NAOJ, and NRAO, we managed to be observing as planned.” 

In the coming weeks, the focus will be on recovering testing infrastructure and systems like the ALMA website and other services, which will allow the recovery of all the functionalities existing before the cyberattack. 

ALMA Director Sean Dougherty says: “It is fantastic to be back doing science observations once again! It has been an enormous challenge to rebuild our systems to return to observing securely. Thanks to everyone at the JAO and across the ALMA partnership for attaining this impressive milestone.” 

Chinese researchers perform hydrodynamic simulations to explore the progenitor system of type Ia supernovae

Ph.D. candidate CUI Yingzhen and Prof. MENG Xiangcun from the Yunnan Observatories of the Chinese Academy of Sciences (CAS) performed hydrodynamic simulations on the common-envelope wind model of type Ia supernovae (SNe Ia) and revealed the mass loss mechanism and the main observational features of white dwarf binaries in the common-envelope wind phase. 

The study was published in Astronomy & Astrophysics. 

SNe Ia are among the most energetic events in the Universe. They are used as cosmological distance indicators, which led to the discovery of the accelerating expansion of the Universe. 

One of the most popular progenitor models of SNe Ia is the single-degenerate model, in which a carbon-oxygen white dwarf accretes material from a non-degenerate companion star to increase its mass, and eventually undergoes a thermonuclear explosion. The problem with this model is that when the mass transfer rate exceeds a certain critical value, the accreted envelope of the white dwarf expands and eventually forms a common envelope around the binary system, which may prevent the occurrence of SNe Ia. 

The common-envelope wind model is a modified single-degenerate model that can in principle address the above-mentioned problem by suggesting a strong mass loss at the surface of the common envelope. However, it is not clear how the mass loss at the surface of the common envelope arises and what the observational characteristics of such systems are. 

The researchers carried out detailed hydrodynamic simulations of the common-envelope wind model and found that such systems are always dynamically unstable and consequently produce dramatic mass loss, resulting in an envelope mass of only a few thousandths of a solar mass. 

By analyzing the internal structure, they found that this instability was driven by ionization-recombination processes of hydrogen and helium in the envelope, the same mechanism that excites pulsations in classical Cepheids. In the Hertzsprung-Russell diagram, the center of the evolutionary trajectory of the common-envelope wind model was also located within the classical Cepheid instability strip, implying that such systems may appear as periodic variable stars. 

This result can provide theoretical guidance for the subsequent observational search for the progenitor system of SNe Ia. 

Brown researchers bypass the need for massive data sets by combining machine learning with active learning techniques

When it comes to predicting disasters brought on by extreme events (think earthquakes, pandemics, or “rogue waves” that could destroy coastal structures), computational modeling faces an almost insurmountable challenge: Statistically speaking, these events are so rare that there is just not enough data on them to use predictive models to accurately forecast when they’ll happen next.

But a team of researchers from Brown University and Massachusetts Institute of Technology says it doesn’t have to be that way.

In a new study, the scientists describe how they combined statistical algorithms — which need fewer data to make accurate, efficient predictions — with a powerful machine learning technique developed at Brown and trained it to predict scenarios, probabilities, and sometimes even the timeline of rare events despite the lack of a historical record on them.

Doing so, the research team found that this new framework can provide a way to circumvent the need for massive amounts of data that are traditionally needed for these kinds of computations, instead essentially boiling down the grand challenge of predicting rare events to a matter of quality over quantity.

“You have to realize that these are stochastic events,” said George Karniadakis, a professor of applied mathematics and engineering at Brown and a study author. “An outburst of a pandemic like COVID-19, environmental disaster in the Gulf of Mexico, an earthquake, huge wildfires in California, a 30-meter wave that capsizes a ship — these are rare events and because they are rare, we don't have a lot of historical data. We don't have enough samples from the past to predict them further into the future. The question that we tackle in the paper is: What is the best possible data that we can use to minimize the number of data points we need?”

The researchers found the answer in a sequential sampling technique called active learning. These types of statistical algorithms are not only able to analyze data input into them, but more importantly, they can learn from the information to label new relevant data points that are equally or even more important to the outcome that’s being calculated. At the most basic level, they allow more to be done with less.
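A common form of the sequential sampling described here is uncertainty sampling: label the single point the current model is least sure about, update the model, and repeat. A toy sketch, assuming a hypothetical one-dimensional "rare event" threshold and a deliberately simple boundary estimator (not the estimators used in the paper):

```python
import random

random.seed(1)

# Toy "rare event": the label is 1 only in an extreme tail, x > 0.95.
def oracle(x):
    return 1 if x > 0.95 else 0

pool = [random.random() for _ in range(2000)]   # unlabeled candidates
labeled = [(0.0, 0), (1.0, 1)]                  # two cheap seed labels

def boundary(labeled):
    """Current threshold estimate: midpoint between the largest
    known non-event and the smallest known event."""
    lo = max(x for x, y in labeled if y == 0)
    hi = min(x for x, y in labeled if y == 1)
    return (lo + hi) / 2

# Active learning loop: instead of labeling random samples, always query
# the pool point closest to the current decision boundary -- the point
# whose label the model is least certain about.
for _ in range(15):
    b = boundary(labeled)
    query = min(pool, key=lambda x: abs(x - b))
    pool.remove(query)
    labeled.append((query, oracle(query)))
```

After a handful of targeted queries the estimate homes in on the true threshold near 0.95, whereas random sampling would spend most of its label budget far from the rare-event region.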

That’s critical to the machine learning model the researchers used in the study. Called DeepONet, the model is a type of artificial neural network, which uses interconnected nodes in successive layers that roughly mimic the connections made by neurons in the human brain. DeepONet is known as a deep neural operator. It’s more advanced and powerful than typical artificial neural networks because it’s actually two neural networks in one, processing data in two parallel networks. This allows it to analyze giant sets of data and scenarios at breakneck speed and spit out equally massive sets of probabilities once it learns what it’s looking for.
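The "two networks in one" structure can be sketched as follows: a branch network encodes the input function sampled at fixed sensor points, a trunk network encodes the query location, and the operator output is the dot product of the two encodings. The layer sizes and random (untrained) weights below are illustrative assumptions, not the model from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight multilayer perceptron (forward pass only)."""
    ws = [rng.standard_normal((m, n)) / np.sqrt(m)
          for m, n in zip(sizes, sizes[1:])]
    def forward(x):
        for w in ws[:-1]:
            x = np.tanh(x @ w)
        return x @ ws[-1]
    return forward

# DeepONet = two parallel networks:
#   branch net encodes the input function u sampled at m fixed sensors,
#   trunk net encodes the query location y;
# the operator output G(u)(y) is the dot product of their encodings.
m, p = 50, 32                    # sensor count, shared latent width
branch = mlp([m, 64, p])
trunk = mlp([1, 64, p])

def deeponet(u_sensors, y):
    return float(branch(u_sensors) @ trunk(np.atleast_1d(y)))

# Example: evaluate the (untrained) operator on a sine input at y = 0.5
u = np.sin(np.linspace(0, np.pi, m))
out = deeponet(u, 0.5)
```

Because the branch and trunk encodings only meet at the final dot product, the same trained branch output can be reused to evaluate the operator at many query locations cheaply.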

The bottleneck with this powerful tool, especially as it relates to rare events, is that deep neural operators need tons of data to be trained to make calculations that are effective and accurate.

In the paper, the research team shows that, combined with active learning techniques, the DeepONet model can be trained to identify the parameters or precursors that lead up to the disastrous event someone is analyzing, even when there are not many data points.

“The thrust is not to take every possible data and put it into the system, but to proactively look for events that will signify the rare events,” Karniadakis said. “We may not have many examples of the real event, but we may have those precursors. Through mathematics, we identify them, which together with real events will help us to train this data-hungry operator.”

In the paper, the researchers apply the approach to pinpointing parameters and different ranges of probabilities for dangerous spikes during a pandemic, finding and predicting rogue waves, and estimating when a ship will crack in half due to stress. For example, with rogue waves — ones that are greater than twice the size of surrounding waves — the researchers found they could discover and quantify when rogue waves will form by looking at probable wave conditions that nonlinearly interact over time, leading to waves sometimes three times their original size.

The researchers found their new method outperformed more traditional modeling efforts, and they believe it presents a framework that can efficiently discover and predict all kinds of rare events.

In the paper, the research team outlines how scientists should design future experiments so that they can minimize costs and increase the forecasting accuracy. Karniadakis, for example, is already working with environmental scientists to use the novel method to forecast climate events, such as hurricanes.

The study was led by Ethan Pickering and Themistoklis Sapsis from MIT. DeepONet was introduced in 2019 by Karniadakis and other Brown researchers, who are currently seeking a patent for the technology. The study was supported with funding from the Defense Advanced Research Projects Agency, the Air Force Research Laboratory, and the Office of Naval Research.

A depiction of the Novatron nuclear fusion reactor. (Image: Novatron Fusion Group AB)

Sweden increases investment in fusion energy

KTH Royal Institute of Technology, a public university in Stockholm, Sweden, has announced that it will make a joint investment in fusion power with Novatron Fusion Group AB and EIT InnoEnergy. The purpose of the work is to evaluate new technology intended to stabilize fusion plasma: matter that exceeds 100 million degrees Celsius.

Handling fusion plasma poses a key challenge before fusion power can be realized as a stable and sustainable energy solution. Neither steel nor any other material can withstand the plasma’s extreme heat.

In France, the ITER fusion reactor stabilizes plasma through a process involving powerful magnets and advanced controls. The KTH-Novatron-EIT InnoEnergy collaboration, however, will evaluate a solution in which the plasma is kept stable naturally. More fusion scientists and a large number of trained fusion energy engineers will also be required.

“We believe that this international collaboration is an important prerequisite for the university’s success,” says Lisa Ericsson, head of KTH Innovation. Ericsson leads one of the KTH teams involved in the collaboration around the new technology, which KTH alumnus Jan Jäderberg is developing in the newly formed company, Novatron Fusion Group.

Peter Roos, CEO of Novatron Fusion Group, says: “We are proud to work in partnership with KTH Royal Institute of Technology and EIT InnoEnergy on fusion power as the sustainable energy source for the future. It highlights the scientific importance of the Novatron concept, and it will provide us with further access to state-of-the-art technology, leading research in plasma physics, and supercomputer simulation capabilities.”

The collaboration’s aim is to enable fusion on a large commercial scale so that it can be developed into a sustainable energy solution for a future in which the need for electricity is expected to continue growing.

“It is important to point out that the technology faces experimental verification,” says Stefan Östlund, vice president for Global Relations at KTH and professor of Electric Power Engineering.

Östlund says that if the technology meets expectations, it could produce fusion energy more simply and cheaply than other solutions currently being tested. KTH would then be one of the players involved in making this possible through expanded fusion research and a significant increase in education in fusion technology.

Ericsson says that investment is important from the perspective of both KTH and Sweden. She says that Commonwealth Fusion Systems, a spin-off company at the Massachusetts Institute of Technology (MIT), has contributed to interest in MIT around fusion energy, which has led to the formation of a number of new fusion-related companies and research projects.

“This collaboration strengthens KTH's role both in fusion research and innovation development,” Ericsson says. “If this works, it will lead to the emergence of a completely new global industry starting at KTH.”

Östlund says he is satisfied with the collaboration between KTH and EIT InnoEnergy. For example, he mentions the innovation work via KTH Innovation and education in the form of successful master's programs.

“It has proven to be very rewarding for both parties. The example of Novatron and fusion technology is another collaboration that can develop into something highly valuable,” Östlund says.

Diego Pavia, CEO of EIT InnoEnergy, says: “Transforming a technical breakthrough into a viable and commercial solution requires indeed a range of skills, resources, and expertise that can only be achieved by bringing different partners together. We are pleased to announce this collaboration.”