UK researchers develop AI algorithm that detects brain abnormalities and could help cure epilepsy

An artificial intelligence (AI) algorithm that can detect subtle brain abnormalities which cause epileptic seizures has been developed by researchers at University College London (UCL) in the United Kingdom.

The Multicentre Epilepsy Lesion Detection project (MELD) used over 1,000 patient MRI scans from 22 global epilepsy centers to develop the algorithm, which provides reports of where abnormalities are in cases of drug-resistant focal cortical dysplasia (FCD) – a leading cause of epilepsy.

FCDs are areas of the brain that have developed abnormally and often cause drug-resistant epilepsy. FCD is typically treated with surgery; however, identifying the lesions on an MRI is an ongoing challenge for clinicians, as MRI scans of FCDs can look normal.

To develop the algorithm, the team quantified cortical features from the MRI scans, such as how thick or how folded the cortex (the brain's surface) was, at around 300,000 locations across the brain.

Researchers then trained the algorithm on examples labeled by expert radiologists as either healthy or having FCD, based on their patterns and features.
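
The MELD pipeline itself is not reproduced in the article, but the idea of classifying brain-surface locations from per-vertex features can be sketched in a toy example. Everything here is illustrative: the feature values, class means, and the choice of a simple logistic-regression classifier are assumptions, not the project's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the ~300,000 cortical locations: two made-up features
# per location, cortical thickness (mm) and a folding/curvature measure.
n = 2000
healthy = rng.normal([2.5, 0.15], 0.1, size=(n, 2))   # healthy cortex
lesional = rng.normal([3.4, 0.30], 0.1, size=(n, 2))  # FCD: thicker, more folded
X = np.vstack([healthy, lesional])
y = np.concatenate([np.zeros(n), np.ones(n)])          # 0 = healthy, 1 = FCD

# Standardize features, then fit logistic regression by gradient descent.
X = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted lesion probability
    w -= 0.5 * (X.T @ (p - y)) / len(y)     # gradient step on weights
    b -= 0.5 * (p - y).mean()               # gradient step on bias

acc = (((X @ w + b) > 0) == y).mean()
print(f"training accuracy: {acc:.3f}")
```

A per-vertex probability map like `p` is also what makes such a model interpretable: clinicians can see which surface locations drove the prediction, in the spirit of Ripart's comment below about showing doctors how the algorithm reached its decision.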

The findings, published in Brain, showed that overall the algorithm was able to detect the FCD in 67% of cases in the cohort (538 participants).

Previously, 178 of the participants had been considered MRI-negative, meaning that radiologists had been unable to find the abnormality; the MELD algorithm nevertheless identified the FCD in 63% of these cases.

This is particularly important because if doctors can find the abnormality on the brain scan, surgery to remove it can provide a cure.

Co-first author, Mathilde Ripart (UCL Great Ormond Street Institute of Child Health) said: “We put an emphasis on creating an AI algorithm that was interpretable and could help doctors make decisions. Showing doctors how the MELD algorithm made its predictions was an essential part of that process.”

Co-senior author, Dr. Konrad Wagstyl (UCL Queen Square Institute of Neurology) added: "This algorithm could help to find more of these hidden lesions in children and adults with epilepsy, and enable more patients with epilepsy to be considered for brain surgery that could cure epilepsy and improve their cognitive development. Roughly 440 children per year could benefit from epilepsy surgery in England."

Around 1% of the world’s population has the serious neurological condition epilepsy, which is characterized by frequent seizures.

In the UK, some 600,000 people are affected. While drug treatments are available for the majority of people with epilepsy, 20-30% do not respond to them.

FCD is the most common cause of epilepsy in children who have had surgery to control the condition, and the third most common cause in adults.

Additionally, in patients with epilepsy whose brain abnormality cannot be found on MRI scans, FCD is the most common cause.

Co-first author, Dr. Hannah Spitzer (Helmholtz Munich) said: “Our algorithm automatically learns to detect lesions from thousands of MRI scans of patients. It can reliably detect lesions of different types, shapes and sizes, and even many of those lesions that were previously missed by radiologists.”

Co-senior author, Dr. Sophie Adler (UCL Great Ormond Street Institute of Child Health) added: “We hope that this technology will help to identify epilepsy-causing abnormalities that are currently being missed. Ultimately it could enable more people with epilepsy to have potentially curative brain surgery.”

This study used the largest MRI cohort of FCDs to date, enabling the algorithm to detect all types of FCD.

The MELD FCD classifier tool can be run for any patient over the age of 3 years with a suspected FCD and an MRI scan.

The MELD project is supported by the Rosetrees Trust.

Rensselaer wins $250k grant to research ML algorithms, improve complex predictions

Dr. Yangyang Xu, assistant professor of mathematical sciences at Rensselaer Polytechnic Institute, has received a $250,000 grant from the National Science Foundation (NSF) to research challenges associated with distributed big data in machine learning.

Machine learning algorithms allow computers to make decisions, predictions, and recommendations based on input training data without being explicitly told what information to look for in the data. This technique has been broadly used ever since data mining was envisioned, but its potential has not been fully realized yet. For instance, marketers use machine learning to provide shoppers with product recommendations, photo apps use it for facial recognition, and mapping and traffic apps use it to estimate commute times, but identifying highly complex relationships requires much more data and computing power.

Deep learning is machine learning on a larger scale, involving the input of massive amounts of data and the formulation of increasingly complex predictions. With vast amounts of data, the use of multiple networked computers is necessary: a distributed system. However, computational and mathematical challenges arise. Xu and his team, which includes undergraduate and graduate students, will use the NSF grant to address some of these challenges. 

Simply put, Xu’s team will develop groundbreaking algorithms that allow multiple computers to work efficiently together as one. They will also focus on maintaining the security of distributed personal information, and on methods to improve the speed and accuracy of deep learning. Decentralized algorithms will also be developed for solving optimization problems with constraints on the behavior of the intelligent agents involved.

“The main goal is to design optimization algorithms that have fast convergence and low communication cost for solving large-scale distributed machine learning problems,” said Xu. “A few stochastic gradient-type methods will be designed for solving a few classes of problems by exploiting their structures. These algorithms will incorporate several features including acceleration technique to have fast convergence, compression technique to have efficient communication, and asynchronous computing to have high parallelization speed up.”
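
One of the techniques Xu mentions, compression to reduce communication cost, can be illustrated with a toy simulation. The sketch below is an assumption-laden stand-in, not the project's actual method: several simulated workers each hold a data shard for a least-squares problem and, at every step, transmit only their k largest-magnitude gradient entries, keeping the untransmitted remainder as local error feedback (a standard trick in the gradient-compression literature).

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_workers, k = 20, 4, 5            # model size, workers, entries sent per step
w_true = rng.normal(size=d)           # ground-truth model to recover

# Each worker holds a private shard (A_i, b_i) with b_i = A_i @ w_true.
shards = []
for _ in range(n_workers):
    A = rng.normal(size=(50, d))
    shards.append((A, A @ w_true))

def topk(g, k):
    """Compress a gradient: keep only its k largest-magnitude entries."""
    out = np.zeros_like(g)
    idx = np.argsort(np.abs(g))[-k:]
    out[idx] = g[idx]
    return out

w = np.zeros(d)
residuals = [np.zeros(d) for _ in range(n_workers)]  # error-feedback memory
for _ in range(1500):
    msgs = []
    for i, (A, b) in enumerate(shards):
        grad = A.T @ (A @ w - b) / len(b)    # local least-squares gradient
        corrected = grad + residuals[i]      # add back what was not sent before
        msg = topk(corrected, k)             # compressed "message" to the server
        residuals[i] = corrected - msg       # remember the discarded part
        msgs.append(msg)
    w -= 0.05 * np.mean(msgs, axis=0)        # server averages and updates

rel_err = np.linalg.norm(w - w_true) / np.linalg.norm(w_true)
print(f"relative error: {rel_err:.4f}")
```

Here each worker sends 5 of 20 gradient entries per round, a 4x reduction in communication, while the error-feedback memory keeps the compressed iteration converging to the true model.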

“Dr. Xu’s research will not only advance the scope and applicability of large-scale machine learning technology, but it will offer exceptional opportunities for Rensselaer’s graduate and undergraduate students,” said Curt M. Breneman, Dean of the School of Science. “At Rensselaer, undergraduate students are offered hands-on, project-based research opportunities early on in collaboration with seasoned graduate students and faculty, and this experience makes all the difference in terms of their big-picture thinking and future employability. Through this grant, Dr. Xu’s students will be able to contribute to a widely used, cutting-edge technology.”

Brazilian researchers show superconductivity at higher temperature than usual

Certain materials at very low temperatures conduct electric current without resistance or losses. This property, known as superconductivity, was discovered in 1911 by Dutch physicist Heike Kamerlingh Onnes (1853-1926), who won the 1913 Nobel Prize in Physics for his research in the field.

Even though more than a century has passed since its discovery, research on superconductivity is still intense, owing both to the amount of information it provides about fundamental aspects of material reality and to its practical applications, in energy conversion, telecommunications, and medical diagnostic imaging, for example. 

One research line relates to the so-called “superconducting transition temperature” (Tc), below which a material becomes a superconductor. The importance of this topic is easy to understand, given the significance of obtaining superconductivity at ever-higher temperatures, i.e., as close as possible to room temperature.

A study with this focus by Brazilian researchers has recently been published as a cover feature in the journal Nanoscale. The article begins with a reference to the “great interest” in the topic owing to "possible applications in next-generation electronic devices”.

“In a previous study, our research group investigated the role played by pressure as a variable capable of modifying the transition temperature of a certain material. In the case of two-dimensional materials, an analogous process is obtained by the application of strains. That’s what our latest study is about,” said Edison Zacarias da Silva, a professor at the State University of Campinas’s Gleb Wataghin Institute of Physics (IFGW-UNICAMP) in São Paulo State, and principal investigator for the study.

Silva is a senior researcher for a Thematic Project supported by FAPESP. The study used a new supercomputer called Ada Lovelace at the National Center for High-Performance Processing (CENAPAD-SP), hosted by UNICAMP. The Center for Development of Functional Materials (CDMF) also collaborated. CDMF is one of FAPESP’s Research, Innovation, and Dissemination Centers (RIDCs).

In the study, the researchers used supercomputer simulations to investigate the superconducting behavior of a dimolybdenum nitride (Mo2N) monolayer at different temperatures and with the application of varying strains. The mathematical tool used to resolve the electronic structure of the material was density functional theory (DFT), a simplified model derived from quantum mechanics. 

In DFT, used in solid-state physics and theoretical chemistry to resolve many-body systems, the properties of systems with many electrons are determined using functionals (functions of functions) — in this case, the spatial distribution of electron density.
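
As a general sketch (the article does not give the paper's computational details), DFT replaces the interacting many-electron problem with the self-consistent Kohn-Sham equations: single-particle orbitals move in an effective potential that itself depends on the electron density they generate:

```latex
\left[-\frac{\hbar^{2}}{2m}\nabla^{2} + v_{\mathrm{eff}}[n](\mathbf{r})\right]
\varphi_{i}(\mathbf{r}) = \varepsilon_{i}\,\varphi_{i}(\mathbf{r}),
\qquad
n(\mathbf{r}) = \sum_{i\,\in\,\mathrm{occ.}} \left|\varphi_{i}(\mathbf{r})\right|^{2}
```

The equations are iterated until the density \(n(\mathbf{r})\) that enters \(v_{\mathrm{eff}}\) matches the density produced by the orbitals, which is what makes the scheme "self-consistent."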

“Analysis of electron-phonon coupling enables us to detect the formation of Cooper pairs, which characterize a superconducting state,” Silva said. 

A phonon is a mechanical excitation that propagates through the crystal lattice of a solid. In classical physics, it can be described as an elastic wave, but considering that the phenomenon occurs on the atomic scale, it is necessary to use quantum physics, in which case a phonon should be thought of as a quantum of energy that travels through the lattice. 

Electron-phonon interaction generates an effective attraction between electrons, which leads to electron pairing, or the formation of Cooper pairs. Discovered by Leon Cooper, winner of the 1972 Nobel Prize in Physics, Cooper pairs flow together through the material without energy dissipation, resulting in superconductivity.
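
In first-principles studies of conventional superconductors, the transition temperature is commonly estimated from the computed electron-phonon coupling via the Allen-Dynes modification of the McMillan formula; whether this particular study used it is not stated in the article, so the expression below is given only as the standard reference point:

```latex
T_{c} \approx \frac{\omega_{\log}}{1.2}\,
\exp\!\left[\frac{-1.04\,(1+\lambda)}{\lambda - \mu^{*}\,(1 + 0.62\,\lambda)}\right]
```

Here \(\lambda\) is the electron-phonon coupling constant, \(\omega_{\log}\) the logarithmic average phonon frequency, and \(\mu^{*}\) the Coulomb pseudopotential (often taken near 0.1). Stronger coupling and stiffer phonons both push \(T_c\) up, which is why strain, by reshaping the phonon spectrum, can shift the transition temperature.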

“We found that dimolybdenum nitride has a striking property, which is that it’s an electride and at the same time displays superconductivity at relatively high temperatures. Owing to their ionic nature, electrides have pockets of electrons confined in voids in the crystal, whereas superconductors, depending on the temperature, offer no resistance to the flow of electrons. Although these two properties appear to be opposites, they could coexist in the same material. That’s exactly what we showed in our study,” said Zenner Pereira, a professor at the Federal Rural University of the Semi-Arid (Ufersa) in Rio Grande do Norte State, and the first author of the Nanoscale article.

An important finding of the study was the strong correlation between the electronic properties of the material and the strain applied. “Our simulations also showed that the Mo2N monolayer became superconductive at the highest temperature of any material in its class at ambient pressure. The transition temperature ranged from 19.3 kelvin to 24.8 kelvin, depending on the strain,” Silva said.

Besides Silva and Pereira, Giovani Faccin, a professor at the Federal University of Greater Dourados (UFGD) in Mato Grosso do Sul, also participated in the study.