Mayo Clinic uses AI to reduce miss rate of precancerous polyps in colorectal cancer screening

Artificial intelligence halved the rate at which precancerous polyps were missed in colorectal cancer screening, reports an international team of researchers led by Mayo Clinic. The study is published in Gastroenterology.

Most colon polyps are harmless, but some develop over time into colon or rectal cancer, which can be fatal if found in its later stages. Colorectal cancer is the second most deadly cancer in the world, with an estimated 1.9 million cases and 916,000 deaths worldwide in 2020, according to the World Health Organization. A colonoscopy is an exam used to detect changes or abnormalities in the large intestine (colon) and rectum.

Between February 2020 and May 2021, 230 study participants each underwent two back-to-back colonoscopies on the same day at eight hospitals and community clinics in the U.S., U.K., and Italy. One colonoscopy used AI; the other, a standard colonoscopy, did not.

The rate at which precancerous colorectal polyps are missed has been estimated to be 25%. In this study, the miss rate was 15.5% in the group that had the AI colonoscopy first and 32.4% in the group that had the standard colonoscopy first. The AI colonoscopy also detected more polyps that were smaller, flatter, and located in the proximal and distal colon.

 "Colorectal cancer is almost entirely preventable with proper screening," says senior author Michael B. Wallace, M.D., division chair of gastroenterology and hepatology at Sheikh Shakhbout Medical City in Abu Dhabi, United Arab Emirates, and the Fred C. Andersen Professor of Medicine at Mayo Clinic in Jacksonville, Fla. "Using artificial intelligence to detect colon polyps and potentially save lives is welcome and promising news for patients and their families."

In addition, the false-negative rate was 6.8% in the group that had the AI colonoscopy first, compared with 29.6% in the group that had the standard colonoscopy first. A false-negative result indicates that a person does not have a particular condition when in fact they do.
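
For readers unfamiliar with tandem-study statistics, the sketch below shows how a miss rate of this kind is computed; the raw counts are invented placeholders, since the article reports only the resulting percentages.

```python
# Hypothetical tandem-colonoscopy counts; the study reports only the
# final percentages, so these raw numbers are illustrative only.
found_first_pass = 50   # adenomas detected on the first exam
found_second_pass = 24  # additional adenomas found on the second exam

# Miss rate: lesions missed by the first exam, as a fraction of all
# lesions found across both exams.
miss_rate = found_second_pass / (found_first_pass + found_second_pass)
print(f"Adenoma miss rate: {miss_rate:.1%}")  # 32.4% for these counts
```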

Thompson shows how changes in vegetation shaped global temperatures over last 10,000 years

Follow the pollen. Records from past plant life tell the real story of global temperatures, according to research from a climate scientist at Washington University in St. Louis, Missouri.

Warmer temperatures brought plants — and then came even warmer temperatures, according to new model simulations published April 15 in Science Advances.

Alexander Thompson, a postdoctoral research associate in earth and planetary sciences in Arts & Sciences, updated simulations from an important climate model to reflect the role of changing vegetation as a key driver of global temperatures over the last 10,000 years.

Thompson had long been troubled by a problem with models of Earth’s atmospheric temperatures since the last ice age. Too many of these simulations showed temperatures warming consistently over time.

But climate proxy records tell a different story. Many of those sources indicate a marked peak in global temperatures that occurred between 6,000 and 9,000 years ago.

Thompson had a hunch that the models could be overlooking the role of changes in vegetation in favor of impacts from atmospheric carbon dioxide concentrations or ice cover.

“Pollen records suggest a large expansion of vegetation during that time,” Thompson said.

“But previous models only show a limited amount of vegetation growth,” he said. “So, even though some of these other simulations have included dynamic vegetation, it wasn’t nearly enough of a vegetation shift to account for what the pollen records suggest.”

In reality, the changes to vegetative cover were significant.

Early in the Holocene, the current geological epoch, the Sahara Desert in Africa grew greener than today — it was more of a grassland. Other Northern Hemisphere vegetation including the coniferous and deciduous forests in the mid-latitudes and the Arctic also thrived.

Thompson took evidence from pollen records and designed a set of experiments with a climate model known as the Community Earth System Model (CESM), one of the best-regarded models in a wide-ranging class of such models. He ran simulations to account for a range of changes in vegetation that had not been previously considered.

“Expanded vegetation during the Holocene warmed the globe by as much as 1.5 degrees Fahrenheit,” Thompson said. “Our new simulations align closely with paleoclimate proxies. So this is exciting that we can point to Northern Hemisphere vegetation as one potential factor that allows us to resolve the controversial Holocene temperature conundrum.”
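
To see why greener land surfaces warm the globe at all, a zero-dimensional energy balance is enough. The sketch below is not CESM; the albedo values are illustrative assumptions, chosen so the result lands near the roughly 1.5-degree-Fahrenheit scale Thompson reports.

```python
# Zero-dimensional energy-balance sketch (not CESM): greener, darker
# land lowers planetary albedo, raising the equilibrium temperature.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2

def equilibrium_temperature(albedo):
    """Effective emission temperature for a given planetary albedo."""
    return (S0 * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

# Assumed planetary-albedo drop from expanded Holocene vegetation.
dT = equilibrium_temperature(0.291) - equilibrium_temperature(0.300)
print(f"Warming: {dT:.2f} K = {dT * 9 / 5:.2f} °F")  # ~0.8 K ~ 1.5 °F
```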

Understanding the scale and timing of temperature change throughout the Holocene is important because it is a period of recent history, geologically speaking. The rise of human agriculture and civilization occurred during this time, so many scientists and historians from different disciplines are interested in understanding how early and mid-Holocene climates differed from the present day.

Thompson conducted this research as a graduate student at the University of Michigan. He is continuing his work in the laboratory of climate scientist Bronwen Konecky at Washington University.

“Overall, our study emphasizes that accounting for vegetation change is critical,” Thompson said. “Projections for future climate change are more likely to produce more trustworthy predictions if they include changes in vegetation.”

Birmingham University’s BlueBEAR predicts pollution from cooking emissions

Organic aerosols – such as those released in cooking – may stay in the atmosphere for several days, because of nanostructures formed by fatty acids as they are released into the air.

By identifying the processes which control how these aerosols are transformed in the atmosphere, scientists will be able to better understand and predict their impact on the environment and the climate.

Experts at the Universities of Birmingham and Bath have used instruments at the Diamond Light Source and the Central Laser Facility, both based at the Harwell campus in Oxfordshire, to probe the behavior of thin films of oleic acid – an unsaturated fatty acid commonly released during cooking.

In the study, published in Atmospheric Chemistry and Physics, they were able to analyze the particular molecular properties that control how rapidly aerosol emissions can be broken down in the atmosphere.

Then, using a theoretical model combined with experimental data, the team was able to predict the length of time that aerosols generated from cooking may hang around in the environment.
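
A minimal way to express "how long an aerosol hangs around" is a first-order decay lifetime, shown in the sketch below; the rate constants are invented placeholders, with the slower one standing in for a film whose self-organized nanostructure inhibits oxidation, as the study describes.

```python
# First-order decay sketch: e-folding lifetime of oleic acid in an
# aerosol film. Rate constants are invented placeholders; the study
# derives real kinetics from synchrotron and laser experiments.
rate_constants_per_hour = {
    "disordered liquid film": 0.5,  # assumed: oxidizes within hours
    "nanostructured film": 0.02,    # assumed: ordered crust slows decay
}

for film, k in rate_constants_per_hour.items():
    lifetime_days = 1.0 / k / 24.0  # e-folding time, in days
    print(f"{film}: lifetime ~ {lifetime_days:.1f} days")
```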

These types of aerosols have long been associated with poor air quality in urban areas, but their impact on human-made climate change is hard to gauge. That’s because of the diverse range of molecules found within aerosols, and their varying interactions with the environment.

By identifying the nanostructure of molecules emitted during cooking that slows down the break-up of organic aerosols, it becomes possible to model how they are transported and dispersed into the atmosphere.

Lead author Dr. Christian Pfrang, of the University of Birmingham’s School of Geography, Earth and Environmental Sciences, said: “Cooking aerosols account for up to 10 percent of particulate matter (PM) emissions in the UK. Finding accurate ways to predict their behavior will give us much more precise ways to also assess their contribution to climate change.”

Co-author Dr. Adam Squires, of the University of Bath, said: “We’re increasingly finding out how molecules like these fatty acids from cooking can organize themselves into bilayers and other regular shapes and stacks within aerosol droplets that float in the air, and how this completely changes how fast they degrade, how long they persist in the atmosphere, and how they affect pollution and weather.” 

DAWN discovers a dusty compact object bridging galaxies, quasars at cosmic dawn

An international team of astrophysicists led by the Niels Bohr Institute, University of Copenhagen, and the Technical University of Denmark has identified a distant object with properties that lie between those of a galaxy and those of a so-called quasar. Discovered in archival data from the NASA/ESA Hubble Space Telescope and other space- and ground-based observatories, the object can be seen as the ancestor of a supermassive black hole, born relatively soon after the Big Bang. Simulations had indicated that such objects would exist, but this is the first actual finding. The object, a crucial link between young star-forming galaxies and the earliest supermassive black holes, is the first of its kind to be discovered so early in the Universe's history and had been lurking unnoticed in one of the best-studied areas of the night sky.

[Image caption: The object, referred to as GNz7q, is shown at the centre of an image of the Hubble GOODS-North field. Credit: NASA, ESA, G. Illingworth (University of California, Santa Cruz), P. Oesch (University of California, Santa Cruz; Yale University), R. Bouwens and I. Labbé (Leiden University), and the Science Team, S. Fujimoto et al. (Cosmic Dawn Center [DAWN] and University of Copenhagen)]

“The discovered object connects two rare populations of celestial objects, namely dusty starbursts and luminous quasars, and thereby provides a new avenue toward understanding the rapid growth of supermassive black holes in the early universe,” says Seiji Fujimoto, a postdoctoral fellow based at the Niels Bohr Institute, University of Copenhagen.

The discovery can be attributed to the Hubble Space Telescope, operated jointly by ESA and NASA. With its location in space – undisturbed by weather changes, pollution and the like – the telescope can gaze further into the depths of the universe than is possible from the ground. And in astronomy, looking further means observing phenomena that took place at earlier cosmic periods, since their light and other radiation have traveled longer to reach us.

The newly found object – named GNz7q by the team – was born 750 million years after the Big Bang, which is generally accepted as the beginning of the universe as we know it. Since the Big Bang occurred about 13.8 billion years ago, GNz7q originates in an epoch known as the "Cosmic Dawn".
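
The 750-million-year figure follows from the object's redshift via standard cosmology. As a sanity check, the snippet below assumes a redshift of z ≈ 7.19 (suggested by the "z7" in the object's name; the article itself does not quote a redshift) and uses astropy's Planck 2018 cosmology.

```python
# Sanity check on the quoted age; the redshift z ~ 7.19 is an
# assumption (hinted at by the "z7" in GNz7q), not given in the text.
from astropy.cosmology import Planck18 as cosmo

z = 7.19
age = cosmo.age(z)  # age of the universe at that redshift
print(f"Universe age at z={z}: {age.to('Myr'):.0f}")  # ~740 Myr
```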

The mystery of supermassive black holes

The discovery is linked to a specific type of quasar. Quasars, also known as quasi-stellar objects, are extremely luminous objects. Images from Hubble and other advanced telescopes have revealed that quasars occur in the centers of galaxies. The host galaxy for GNz7q is intensely star-forming, forming stars at a rate 1,600 times faster than our galaxy, the Milky Way. The stars, in turn, create and heat cosmic dust, making it glow in infrared to the extent that GNz7q’s host is more luminous in dust emission than any other known object at this period of the Cosmic Dawn.

In recent years, it has become clear that luminous quasars are powered by supermassive black holes, with masses ranging from millions to tens of billions of solar masses, surrounded by vast amounts of gas. As the gas falls towards the black hole, it heats up due to friction, which produces an enormous luminous output.
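
That luminous output can be estimated with the standard radiative-efficiency relation L = η·ṁ·c². The numbers below are textbook-style assumptions, not values from the study.

```python
# Back-of-envelope accretion luminosity, L = eta * mdot * c^2.
# eta ~ 0.1 is a standard thin-disk assumption; one solar mass per
# year is an illustrative accretion rate, not from the study.
C = 3.0e8              # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
YEAR = 3.156e7         # seconds per year

eta = 0.1              # radiative efficiency (assumed)
mdot = M_SUN / YEAR    # accretion rate, kg/s
L = eta * mdot * C**2
print(f"Luminosity: {L:.1e} W")  # ~6e38 W, far outshining the Milky Way
```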

“Understanding how supermassive black holes form and grow in the early universe has become a major mystery. Theorists have predicted that these black holes undergo an early phase of rapid growth: a dust-reddened compact object emerges from a heavily dust-obscured starburst galaxy, then transitions to an unobscured luminous compact object by expelling the surrounding gas and dust,” explains Associate Professor Gabriel Brammer, Niels Bohr Institute, continuing:

“Although luminous quasars had already been found even at the earliest epochs of the universe, the transition phase of the rapid growth of both the black hole and its star-bursting host had not been found at similar epochs. Moreover, the observed properties are in excellent agreement with the theoretical simulations and suggest that GNz7q is the first example of the transitioning, rapid growth phase of black holes in a dusty starburst core, an ancestor of the later supermassive black hole.”

Both Seiji Fujimoto and Gabriel Brammer are part of the Cosmic Dawn Center (DAWN), a collaboration between Niels Bohr Institute and DTU Space.

Hiding in plain sight

Curiously, GNz7q was found at the center of an intensely studied sky field known as the Hubble GOODS North field.

“This shows how big discoveries can often be hidden right in front of you,” Gabriel Brammer comments.

Finding GNz7q hiding in plain sight was only possible thanks to the uniquely detailed, multi-wavelength datasets available for GOODS North. Without the richness of data, the object would have been easy to overlook, as it lacks the distinguishing features of quasars in the early universe.

“It’s unlikely that discovering GNz7q within the relatively small GOODS-N survey was just ‘dumb luck’, but rather that the prevalence of such sources may be significantly higher than previously thought,” Brammer adds.

The team now hopes to systematically search for similar objects using dedicated high-resolution surveys and to take advantage of the NASA/ESA/CSA James Webb Space Telescope.

“Fully characterizing these objects and probing their evolution and underlying physics in much greater detail will become possible with the James Webb Telescope. Once in regular operation, Webb will have the power to decisively determine how common these rapidly growing black holes truly are,” Seiji Fujimoto concludes.

Johns Hopkins researchers build AI that predicts if, when someone will have cardiac arrest

First-of-its-kind survival predictor detects patterns in heart MRIs invisible to the naked eye

A new artificial-intelligence-based approach can predict, significantly more accurately than a doctor, if and when a patient could die of cardiac arrest. The technology, built on raw images of patients’ diseased hearts and patient backgrounds, stands to revolutionize clinical decision-making and increase survival from sudden and lethal cardiac arrhythmias, one of medicine’s deadliest and most puzzling conditions.

[Image caption: A first-of-its-kind algorithm, using raw MRI images, can predict if and when a patient will have a lethal episode of heart arrhythmia. It detected high risk in the heart circled in red.]

“Sudden cardiac death caused by arrhythmia accounts for as many as 20 percent of all deaths worldwide and we know little about why it’s happening or how to tell who’s at risk,” said senior author Natalia Trayanova, the Murray B. Sachs Professor of Biomedical Engineering and Medicine. “There are patients who may be at low risk of sudden cardiac death getting defibrillators that they might not need and then there are high-risk patients that aren’t getting the treatment they need and could die in the prime of their life. What our algorithm can do is determine who is at risk for cardiac death and when it will occur, allowing doctors to decide exactly what needs to be done.”

The team is the first to use neural networks to build a personalized survival assessment for each patient with heart disease. These risk measures predict, with high accuracy, a patient’s chance of sudden cardiac death over 10 years and when it is most likely to happen.

The deep learning technology is called Survival Study of Cardiac Arrhythmia Risk (SSCAR). The name alludes to cardiac scarring caused by heart disease that often results in lethal arrhythmias and is the key to the algorithm’s predictions.

The team used contrast-enhanced cardiac images that visualize scar distribution from hundreds of real patients at Johns Hopkins Hospital with cardiac scarring to train an algorithm to detect patterns and relationships not visible to the naked eye. Current clinical cardiac image analysis extracts only simple scar features like volume and mass, severely underutilizing what’s demonstrated in this work to be critical data.

“The images carry critical information that doctors haven’t been able to access,” said first author Dan Popescu, a former Johns Hopkins doctoral student. “This scarring can be distributed in different ways and it says something about a patient’s chance for survival. There is information hidden in it.”

The team trained a second neural network to learn from 10 years of standard clinical patient data, comprising 22 factors such as patients’ age, weight, race, and prescription drug use.
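
As a rough illustration of how an imaging branch and a clinical-covariate branch can be fused into a time-resolved survival estimate, here is a minimal PyTorch sketch; the architecture is an assumption for illustration and is not the published SSCAR model.

```python
# Minimal two-branch survival sketch (illustrative, not SSCAR): an MRI
# encoder and a clinical-covariate encoder feed a head that outputs a
# per-year hazard over a 10-year horizon.
import torch
import torch.nn as nn

class TwoBranchSurvival(nn.Module):
    def __init__(self, n_covariates=22, horizon_years=10):
        super().__init__()
        self.image_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, 32), nn.ReLU(),
        )
        self.clinical_branch = nn.Sequential(
            nn.Linear(n_covariates, 32), nn.ReLU(),
        )
        self.head = nn.Linear(64, horizon_years)

    def forward(self, mri, covariates):
        z = torch.cat([self.image_branch(mri),
                       self.clinical_branch(covariates)], dim=1)
        hazard = torch.sigmoid(self.head(z))       # P(event in year t)
        survival = torch.cumprod(1.0 - hazard, 1)  # P(event-free to t)
        return hazard, survival

model = TwoBranchSurvival()
h, s = model(torch.randn(2, 1, 64, 64), torch.randn(2, 22))
print(s.shape)  # torch.Size([2, 10]): a 10-year survival curve each
```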

The algorithm’s predictions were not only significantly more accurate than doctors’ on every measure; they were also validated in tests with an independent patient cohort from 60 health centers across the United States, with different cardiac histories and different imaging data, suggesting the platform could be adopted anywhere.

“This has the potential to significantly shape clinical decision-making regarding arrhythmia risk and represents an essential step towards bringing patient trajectory prognostication into the age of artificial intelligence,” said Trayanova, co-director of the Alliance for Cardiovascular Diagnostic and Treatment Innovation. “It epitomizes the trend of merging artificial intelligence, engineering, and medicine as the future of healthcare.”

The team is now working to build algorithms to detect other cardiac diseases. According to Trayanova, the deep-learning concept could be developed for other fields of medicine that rely on a visual diagnosis.

The team from Johns Hopkins also included: Bloomberg Distinguished Professor of Data-Intensive Computation Mauro Maggioni; Julie Shade; Changxin Lai; Konstantinos Aronis; and Katherine Wu. Other authors include M. Vinayaga Moorthy and Nancy Cook of Brigham and Women’s Hospital; Daniel Lee of Northwestern University; Alan Kadish of Touro College and University System; and David Ouyang and Christine Albert of Cedars-Sinai Medical Center.