Dartmouth computer scientist receives $1.5 million to build new AI approaches to lung cancer

A $1.5 million National Cancer Institute grant to computer scientist Saeed Hassanpour of Dartmouth's Norris Cotton Cancer Center will be used to build new artificial intelligence (AI) technologies for precision cancer care in lung cancer management.

Lung cancer is the second most common type of cancer and the leading cause of cancer death in men and women, with non-small cell lung cancer (NSCLC) accounting for up to 90% of cases. Somatic mutations heavily impact the sensitivity of NSCLC patients to various drug treatments and are critical for choosing the most effective targeted therapies for this cancer. Most NSCLC patients develop resistance to their targeted therapies during the first year of treatment. The reason for resistance is still unknown.

"Currently, there is no computational method to link information from medical records to somatic mutations and targeted therapy responses," says Saeed Hassanpour, Ph.D., a computer scientist at Dartmouth's Norris Cotton Cancer Center. Hassanpour has received a 4-year $1.5M grant from the National Cancer Institute to build and validate machine learning approaches that can reveal relationships between clinical and pathologic findings, patient genetic profiles and drug resistance. Linking these data could mean better, personalized treatment strategies for NSCLC patients. Saeed Hassanpour with his research lab and collaborators at Dartmouth's Norris Cotton Cancer Center will use a $1.5M National Cancer Institute grant to build new AI technologies for precision cancer care in lung cancer management.{module In-article}

"Our overall objective is to use pathology reports of NSCLC tumors and available data from electronic medical records to build computational models for identifying NSCLC patients with clinically-actionable somatic mutations and predicting their responses to targeted therapies," says Hassanpour. "We think that pathological findings of NSCLC cells and tissues, in combination with relevant information in medical records, such as medical and family history, demographics and smoking status, will be reliable indicators to achieve this objective."

Hassanpour's team will test their hypothesis by building and validating novel information-extraction and machine learning approaches that leverage textual information in pathology reports and medical records to identify statistically significant connections between clinical findings, genetic profiles and treatment response. The results will then be used to identify NSCLC patients with clinically-actionable mutations and predict their resistance to targeted therapies.
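The article does not describe an implementation, but the information-extraction step of such a pipeline can be illustrated with a minimal sketch: pulling structured features (smoking status, mutation mentions, histology) out of free-text report snippets with regular expressions. The report text, feature names, and patterns below are all hypothetical; a real system would use trained NLP models rather than hand-written rules.

```python
import re

# Hypothetical patterns for features often noted in pathology reports
# and EMR text; illustrative only, not the team's actual method.
PATTERNS = {
    "smoking_status": re.compile(r"\b(never|former|current)\s+smoker\b", re.I),
    "mutation": re.compile(r"\b(EGFR|KRAS|ALK|BRAF)\b"),
    "histology": re.compile(r"\b(adenocarcinoma|squamous cell carcinoma)\b", re.I),
}

def extract_features(report_text):
    """Return a dict mapping feature name to the matched text (or None)."""
    features = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(report_text)
        features[name] = match.group(0) if match else None
    return features

report = ("Lung adenocarcinoma in a former smoker; "
          "molecular testing revealed an EGFR exon 19 deletion.")
print(extract_features(report))
```

Features extracted this way could then feed a statistical model alongside structured EMR fields such as demographics and family history.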

Revealing these relationships with the use of a well-designed bioinformatics approach will allow Hassanpour's clinical collaborators to better understand how NSCLC tumors develop and respond to treatment. "We expect our machine learning methods to identify NSCLC patients with clinically-actionable mutations based on tumor pathology reports and EMR data, and to provide an accurate, fast, and inexpensive pre-selection method that can be utilized before performing time-intensive and expensive DNA sequencing to find the same mutations," explains Hassanpour. "As a result, this prediction method is expected to prioritize DNA screening of NSCLC patients who are the most likely to have clinically-actionable mutations, thus reducing screening turnaround time and increasing the accuracy of treatment administration."
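The pre-selection idea can be sketched as a simple ranking step: patients whose predicted probability of carrying an actionable mutation exceeds a threshold are queued for DNA sequencing, highest score first. The probabilities and threshold below are invented for illustration and are not from the project.

```python
# Hypothetical predicted probabilities of a clinically-actionable mutation,
# e.g. output of a model trained on pathology reports and EMR data.
predictions = {
    "patient_A": 0.91,
    "patient_B": 0.12,
    "patient_C": 0.67,
    "patient_D": 0.45,
}

def prioritize_for_sequencing(predictions, threshold=0.5):
    """Return patient IDs above threshold, highest predicted probability first."""
    selected = [p for p, score in predictions.items() if score >= threshold]
    return sorted(selected, key=lambda p: predictions[p], reverse=True)

print(prioritize_for_sequencing(predictions))
```

In this toy example only patients A and C would be sent for sequencing, which is the sense in which such a method could reduce screening turnaround time.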

In addition, the team's pre-selection method will improve the finding and tracking of NSCLC patients with clinically-actionable mutations for translational research, help with recruitment of NSCLC patients for clinical trials, assist care providers in selecting the best treatment strategies, improve survival outcomes for NSCLC patients, and extend quality of life. "In precision cancer care, even identifying the high likelihood of resistance to targeted therapy has important implications on the choice of the 'best' treatment strategy for NSCLC patients and their responsiveness," says Hassanpour. "Our application will have a meaningful, positive impact on public health and the promotion of precision medicine."

Collaborators on this project include Laura Tafe, MD, Associate Professor of Pathology, Department of Pathology and Laboratory Medicine at Dartmouth-Hitchcock; Gregory Tsongalis, PhD, Professor of Pathology and Director of Molecular Pathology and Clinical Genomics and Advanced Technologies at Dartmouth-Hitchcock, and Co-Director of the Translational Research Program and Pathology Shared Resource at Dartmouth's Norris Cotton Cancer Center; Konstantin Dragnev, MD, Professor of Medicine and Associate Director for Clinical Research at Dartmouth's Norris Cotton Cancer Center; and external collaborators from the University of Vermont Medical Center and Baylor College of Medicine.

Purdue wins $2.3 million grant to advance ethanol fuel research

Grant supports mathematical modeling of biomass movement in large fermentation facilities

Imagine trying to quickly turn a damp piece of plywood into a liquid and squeezing it through several small openings. That's essentially the challenge facing biorefineries trying to turn corn waste and related materials into ethanol fuel.

A big challenge in biomass processing, such as turning waste into ethanol fuel, is the difficulty in moving the biomass to, within and through the equipment needed to physically and chemically treat the biomass as part of the fuel production process.

CAPTION: Discrete element method (DEM) supercomputer models have been developed to describe biomass transport through a compression feed screw. This image shows a mathematical simulation of biomass particles in the feed screw at steady state.

Researchers at Purdue University have received $2.3 million in funding from the Department of Energy's Bioenergy Technologies Office for their work to create supercomputer models that simplify the design and construction of biorefineries, helping them perform reliably, sustainably, safely and economically.

"We have basically used fundamental theories, particle properties, and measured bulk characteristics to develop and verify computational tools for biorefineries that are taking a material like corn stalks, sugar cane bagasse, or sawdust and making them flow like a liquid," said Michael Ladisch, Distinguished Professor of Agricultural and Biological Engineering at Purdue, who leads the research team and has been studying the topic of cellulose conversion and pretreatment for more than 25 years.

Ladisch said the team has created predictive analytical models that rigorously represent the flow behavior of biomass materials, defining the conditions for robust operation and minimal downtime from plugging as the materials move within and between reactors.
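The mechanics behind such DEM models can be illustrated with a minimal one-dimensional spring-dashpot sketch: a constant "feed" force presses a chain of particles against a wall until the chain reaches a compressed steady state, at which point the wall carries the full feed force. All parameter values here are illustrative and are not taken from the Purdue models.

```python
# Minimal 1-D discrete-element sketch: spring-dashpot contacts, a wall at
# x = 0, and a constant feed force compressing the particle chain.
N = 5             # number of particles
radius = 0.01     # particle radius, m
k = 1e4           # contact stiffness, N/m
c = 5.0           # contact damping, N*s/m
m = 0.001         # particle mass, kg
push = 0.5        # constant feed force on the last particle, N
dt = 1e-5         # time step, s

# Particles start separated, the first one just touching the wall.
x = [(1.0 + 2.5 * i) * radius for i in range(N)]
v = [0.0] * N

for _ in range(20000):
    f = [0.0] * N
    f[-1] -= push                          # feed force pushes toward the wall
    overlap = radius - x[0]                # wall contact on the first particle
    if overlap > 0:
        f[0] += k * overlap - c * v[0]
    for i in range(N - 1):                 # particle-particle contacts
        overlap = 2 * radius - (x[i + 1] - x[i])
        if overlap > 0:
            fc = k * overlap + c * (v[i] - v[i + 1])
            f[i] -= fc
            f[i + 1] += fc
    for i in range(N):                     # semi-implicit Euler integration
        v[i] += f[i] / m * dt
        x[i] += v[i] * dt

# At steady state the wall reaction balances the feed force.
wall_force = k * max(0.0, radius - x[0])
print(f"wall force ~ {wall_force:.2f} N")
```

Production DEM codes track millions of irregular particles in three dimensions, which is why supercomputers are needed, but the contact physics is of this spring-dashpot form.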

Other members of the Purdue team include Carl Wassgren, a professor of mechanical engineering; Arezoo Ardekani, an associate professor of mechanical engineering; Pankaj Sharma, managing director of the Integrative Data Science Initiative; Eduardo Ximenes, a senior research scientist at Purdue's Laboratory of Renewable Resources Engineering; Kendra Erk, an associate professor of material engineering; Nathan Mosier, a professor of agricultural and biological engineering; and Kingsly Ambrose and Abigail Engelberth, both associate professors of agricultural and biological engineering.

The latest innovation and the predictive models build on other technologies created by the Purdue team and patented through the Purdue Research Foundation Office of Technology Commercialization. For more information on licensing the Purdue innovations, contact D.H.R. Sarma at the Office of Technology Commercialization at dhrsarma@prf.org.

The analytical models address machinery and operating conditions developed at several locations, including Purdue. The latest grant supports techniques previously developed and patented at Purdue, based on technology that avoids chemical agents in the production of ethanol fuel: a combination of water and enzymes is used instead to break down the waste materials into forms usable in the production of biofuels.

"We are a small piece of the bioenergy puzzle, but we believe our work is part of the larger progress to reduce carbon emissions and help farmers," Ladisch said.

Swiss researchers use a cavity that leads to strong interaction between light and matter

Researchers have succeeded in creating an efficient quantum-mechanical light-matter interface using a microscopic cavity. Within this cavity, a single photon is emitted and absorbed up to 10 times by an artificial atom. This opens up new prospects for quantum technology, report physicists at the University of Basel and Ruhr-University Bochum in the journal Nature.

Quantum physics describes photons as light particles. Achieving an interaction between a single photon and a single atom is a huge challenge due to the tiny size of the atom. However, sending the photon past the atom several times by means of mirrors significantly increases the probability of interaction.

In order to generate photons, the researchers use artificial atoms, known as quantum dots. These semiconductor structures consist of an accumulation of tens of thousands of atoms, but behave much like a single atom: when they are optically excited, their energy state changes and they emit a photon. "However, they have the technological advantage that they can be embedded in a semiconductor chip," says Dr. Daniel Najer, who conducted the experiment at the Department of Physics at the University of Basel.

CAPTION: A microscopic cavity of two highly reflective mirrors allows an enclosed artificial atom (known as a quantum dot) to interact with a single photon. A photon is emitted and reabsorbed up to 10 times by the quantum dot before it is lost. The quantum dot is electrically controlled within a semiconductor chip.

System of quantum dot and microcavity

Normally, these light particles fly off in all directions, like light from a light bulb. For their experiment, however, the researchers positioned the quantum dot in a cavity with reflective walls. The curved mirrors reflect the emitted photon back and forth up to 10,000 times, causing an interaction between light and matter.

Measurements show that a single photon is emitted and absorbed up to 10 times by the quantum dot. At the quantum level, the photon is absorbed, raising the artificial atom to a higher energy state, and the atom then emits a new photon. And this happens very quickly, which is highly desirable for quantum technological applications: one cycle lasts just 200 picoseconds.
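The quoted 200-picosecond cycle lets one estimate the light-matter coupling rate. In the standard Jaynes-Cummings description of such vacuum Rabi oscillations, one full emission-reabsorption cycle takes T = π/g, where g is the coupling rate; the back-of-the-envelope arithmetic below is our own estimate, not a figure from the paper beyond the quoted period.

```python
import math

T = 200e-12                      # one emission-reabsorption cycle, s (quoted)

# Jaynes-Cummings vacuum Rabi cycle: T = pi / g, so g = pi / T.
g = math.pi / T                  # coupling rate, rad/s
g_over_2pi = g / (2 * math.pi)   # coupling rate in Hz

print(f"coupling g/2pi ~ {g_over_2pi / 1e9:.1f} GHz")
```

A coupling in the gigahertz range is what makes the oscillations fast enough to be attractive for quantum technology, since each light-matter exchange completes long before the photon leaks out of the cavity.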

The conversion of an energy quantum from a quantum dot to a photon and back again is theoretically well supported, but "nobody has ever observed these oscillations so clearly before," says Professor Richard J. Warburton from the Department of Physics at the University of Basel.

Serial interaction of light and matter

The successful experiment is particularly significant because there are no direct photon-photon interactions in nature. However, controlled interaction is required for use in quantum information processing.

By transforming light into matter according to the laws of quantum physics, an interaction between individual photons becomes indirectly possible - namely, via the detour of an entanglement between a photon and a single electron spin trapped in the quantum dot. If several such photons are involved, quantum gates can be created through entangled photons. This is a vital step in the generation of photonic qubits, which can store information by means of the quantum state of light particles and transmit it over long distances.

International collaboration

The experiment takes place in the optical frequency range and places high technical demands on the size of the cavity, which must be adapted to the wavelength, and on the reflectivity of the mirrors, so that the photon remains in the cavity for as long as possible.