Artificial intelligence could be 'game changer' in detecting, managing Alzheimer's disease

The study introduces machine learning as a new tactic in assessing cognitive brain health and patient care

Worldwide, about 44 million people are living with Alzheimer's disease (AD) or a related form of dementia. Although 82 percent of seniors in the United States say it's important to have their thinking or memory checked, only 16 percent say they receive regular cognitive assessments.

Many traditional memory assessment tools are widely available to health professionals, though deficiencies in screening and detection accuracy and reliability remain prevalent. Even for MemTrax, an increasingly popular and very simple online memory test based on image recognition, the clinical efficacy of this approach as a memory function screening tool has not been sufficiently demonstrated or validated. In practice, there are numerous integrated and complex factors to consider in interpreting memory evaluation test results, which presents a real challenge for clinicians. Together, these factors stand as a collective barrier to suitably addressing the growing and widespread prevalence of AD and its impact on those affected by the disease.

Could artificial intelligence be the solution for testing and managing this complex human health condition? A team of researchers at Florida Atlantic University's College of Engineering and Computer Science, SIVOTEC Analytics, HAPPYneuron, MemTrax, and Stanford University School of Medicine think so, and they put their theory to the test.

The researchers employed a novel application of supervised machine learning and predictive modeling to demonstrate and validate the cross-sectional utility of MemTrax as a clinical decision support screening tool for assessing cognitive impairment.

Results of the study, published in the Journal of Alzheimer's Disease, introduce supervised machine learning as a modern approach and new value-added complementary tool in cognitive brain health assessment and related patient care and management.

Findings demonstrate the potential clinical utility of MemTrax, administered as part of the online Continuous Recognition Tasks (M-CRT) test, in screening for variations in cognitive brain health. Notably, a comparison of MemTrax with the recognized and widely used Montreal Cognitive Assessment's estimation of mild cognitive impairment underscores the power and potential of this new online tool in evaluating short-term memory and supporting cognitive screening and assessment across a variety of clinical conditions and impairments, including dementia.

"Machine learning has an inherent capacity to reveal meaningful patterns and insights from a large, complex inter-dependent array of clinical determinants and the ability to continue to 'learn' from ongoing utility of practical predictive models," said Taghi Khoshgoftaar, Ph.D., co-author and Motorola Professor in FAU's Department of Computer and Electrical Engineering and Computer Science. "Seamless use and real-time interpretation will enhance case management and patient care through innovative technology and practical and readily usable integrated clinical applications that could be developed into a hand-held device and app."

For the study, the researchers used an existing dataset (n = 18,395) from HAPPYneuron. They examined answers to general health screening questions (addressing memory, sleep quality, medications, and medical conditions affecting thinking), demographic information, and test results from a sample of adults who took the MemTrax (M-CRT) test for episodic-memory screening. MemTrax performance and participant characteristics served as independent attributes: true positives/negatives, percent responses/correct, response time, age, sex, and recent alcohol consumption. For predictive modeling, the researchers used demographic information and test scores to predict binary classifications of the health-related questions (yes/no) and general health status (healthy/unhealthy) based on the screening questions.
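To make the modeling approach concrete, here is a minimal sketch of the kind of supervised binary classification the researchers describe, written with scikit-learn. The file name, column names, and the choice of classifier are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch: MemTrax performance plus demographics predicting a
# binary health-status label, as described in the article. The CSV file,
# column names, and classifier choice are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical columns mirroring the attributes named in the article.
df = pd.read_csv("memtrax_screening.csv")
features = ["age", "sex", "recent_alcohol", "percent_correct",
            "mean_response_time", "true_positives", "true_negatives"]
X = df[features]
y = df["healthy"]  # binary label derived from the screening questions

# Cross-validated performance of a standard off-the-shelf classifier.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Mean ROC AUC across folds: {scores.mean():.3f}")
```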

"Findings from our study provide an important step in advancing the approach for clinically managing a very complex condition like Alzheimer's disease," said Michael F. Bergeron, Ph.D., senior author and senior vice president of development and applications, SIVOTEC Analytics. "By analyzing a wide array of attributes across multiple domains of the human system and functional behaviors of brain health, informed and strategically directed advanced data mining, supervised machine learning, and robust analytics can be integral, and in fact necessary, for health care providers to detect and anticipate further progression in this disease and myriad other aspects of cognitive impairment."

AD is the sixth leading cause of death in the United States, affecting 5.8 million Americans. According to the Alzheimer's Association, this number is projected to rise to 14 million by 2050. In 2019, AD and other dementias will cost the nation $290 billion. By 2050, these costs could rise as high as $1.1 trillion.

"With its widespread prevalence and escalating incidence and public health burden, it is imperative to ensure that the tools clinicians use for testing and managing Alzheimer's disease and other related cognitive conditions are optimal," said Stella Batalama, Ph.D., dean of FAU's College of Engineering and Computer Science. "Results from this important study provide new insights and discovery that has set the stage for future impactful and significant research."

Tokyo Tech-led study shows how icy outer solar system satellites may have formed

Scientists use sophisticated supercomputer simulations and observations of trans-Neptunian objects to understand the formation of the solar system

Using sophisticated supercomputer simulations and observations, a team led by researchers from the Earth-Life Science Institute (ELSI) at Tokyo Institute of Technology has shown how the so-called trans-Neptunian Objects (or TNOs) may have formed. TNOs, which include the dwarf planet Pluto, are a group of icy and rocky small bodies--smaller than planets but larger than comets--that orbit the Solar System beyond the planet Neptune. TNOs likely formed at the same time as the Solar System, and understanding their origin could provide important clues as to how the entire Solar System originated.

[Figure: masses of the satellites range from 1/10 to 1/1000 of the corresponding TNOs; the Earth-Moon system is shown for comparison.]

Like many solar system bodies, including the Earth, TNOs often have their own satellites, which likely formed early on from collisions among the building blocks of the Solar System. Understanding the origin of TNOs along with their satellites may help understand the origin and early evolution of the entire Solar System. The properties of TNOs and their satellites--for example, their orbital properties, composition and rotation rates--provide a number of clues for understanding their formation. These properties may reflect their formation and collisional history, which in turn may be related to how the orbits of the giant planets Jupiter, Saturn, Neptune, and Uranus changed over time since the Solar System formed.

The New Horizons spacecraft flew by Pluto, the most famous TNO, in 2015. Since then, Pluto and its satellite Charon have attracted a lot of attention from planetary scientists, and many new small satellites around other large TNOs have been found. In fact, all known TNOs larger than 1000 km in diameter are now known to have satellite systems. Interestingly, the estimated mass ratios of these satellites to their host systems range from 1/10 to 1/1000, encompassing the Moon-to-Earth mass ratio (~1/80). This may be significant because Earth's Moon and Charon are both thought to have formed from giant impacts.

[Figure: top panels, snapshots of a satellite-forming giant impact with an impact velocity of about 1 km/s and an impact angle of 75 degrees; bottom panel, schematic view of the circularization of the satellite's orbit due to tidal interaction after satellite formation.]
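As a quick arithmetic check of that comparison, the snippet below computes the Moon-to-Earth mass ratio from standard mass values and confirms it falls inside the quoted 1/10 to 1/1000 range.

```python
# Verify that the Moon-to-Earth mass ratio (~1/80) sits inside the
# 1/1000 to 1/10 range estimated for TNO satellite systems.
# Masses are standard values in kilograms.
M_EARTH = 5.972e24
M_MOON = 7.342e22

ratio = M_MOON / M_EARTH
print(f"Moon/Earth mass ratio: 1/{1 / ratio:.0f}")            # ~1/81
print(f"Within the TNO range: {1 / 1000 <= ratio <= 1 / 10}")  # True
```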

To study the formation and evolution of TNO satellite systems, the research team performed more than 400 giant impact simulations and tidal evolution calculations. "This is really hard work," says the study's senior author, Professor Hidenori Genda from the Earth-Life Science Institute (ELSI) at Tokyo Institute of Technology. Other Tokyo Tech team members included Sota Arakawa and Ryuki Hyodo.

The Tokyo Tech study found that the size and orbit of the satellite systems of large TNOs are best explained if they formed from impacts of molten progenitors. They also found that sufficiently large TNOs can retain internal heat and remain molten for a few million years, especially if their internal heat source is short-lived radioactive isotopes such as aluminum-26, which has also been implicated in the internal heating of the parent bodies of meteorites. Since these progenitors would need a high short-lived radionuclide content in order to be molten, these results suggest that TNO-satellite systems formed before the outward migration of the outer planets, including Neptune, or within the first ~700 million years of Solar System history.
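To see why aluminum-26 counts as "short-lived," the back-of-the-envelope calculation below applies its standard half-life of roughly 0.72 million years; the sampled time points are arbitrary. Within a few million years, essentially all of the heat source is gone, which is why a molten progenitor implies early formation.

```python
# Exponential decay of aluminum-26, half-life ~0.717 million years.
# Shows how little of the radiogenic heat source survives past a few Myr.
import math

HALF_LIFE_MYR = 0.717  # standard Al-26 half-life in millions of years

for t_myr in [0.0, 1.0, 3.0, 5.0, 10.0]:
    fraction = math.exp(-math.log(2) * t_myr / HALF_LIFE_MYR)
    print(f"t = {t_myr:4.1f} Myr: {fraction:.4%} of the initial Al-26 remains")
```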

Previous planet formation theories had suggested that the growth of TNOs took much longer than the lifetime of short-lived radionuclides, and thus that TNOs could not have been molten when they formed. The authors found, however, that rapid TNO formation is consistent with recent planet formation studies, which suggest TNOs formed via accretion of small solids onto pre-existing bodies. On the other hand, other analyses suggest comets formed well after most short-lived radionuclides had decayed, so the authors note that there is still much work to be done to produce a unified model for the origin of the Solar System's planetary bodies.

[Figure: relationship between the initial eccentricity of the formed satellites and the final eccentricity after 4.5 billion years of tidal evolution, for three cases. When the bodies remain rigid the whole time (right panel) or behave as a fluid for only the first 1,000 years (middle panel), most eccentricities are not damped, which is inconsistent with observations. When they behave as a fluid for more than the first ~1 million years, the resulting eccentricities are consistent with observations.]
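As an illustration of the tidal argument, here is a toy model that treats eccentricity damping as exponential, e(t) = e0 * exp(-t / tau). The initial eccentricity and the damping timescales for the rigid and fluid cases are illustrative assumptions, not values from the paper; they simply show why slow damping leaves a post-impact eccentricity nearly intact over the age of the Solar System while fast damping erases it.

```python
# Toy model of tidal eccentricity damping over 4.5 billion years.
# e(t) = e0 * exp(-t / tau); both tau values below are assumptions
# chosen only to contrast slow (rigid-body) and fast (fluid-body) damping.
import math

E0 = 0.5          # assumed initial post-impact eccentricity
AGE_MYR = 4500.0  # ~4.5 billion years of tidal evolution, in Myr

cases = {
    "rigid body (slow damping)": 1.0e6,  # tau in Myr
    "fluid body (fast damping)": 100.0,
}
for label, tau_myr in cases.items():
    e_final = E0 * math.exp(-AGE_MYR / tau_myr)
    print(f"{label}: eccentricity after 4.5 Gyr ~ {e_final:.2e}")
```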

UConn researchers tap into the way cells communicate

A new technology discovered by UConn School of Dental Medicine researchers records cellular communication in real time - providing a closer look into the dynamics of cell secretion and a greater understanding of how cells repair tissue.

In a study published today in the Proceedings of the National Academy of Sciences, Kshitiz Gupta, an assistant professor (who goes by just his first name), and Yashir Suhail, a postdoctoral fellow, in the Dental School's Department of Biomedical Engineering, unveiled a breakthrough technology platform.

Now for the first time, scientists can record cells communicating in real time, opening the floodgates for new developments in cell therapy and other areas within cell biology.

Cells - like humans - are in constant communication with each other. Whereas humans exchange words, cells deliver and receive messages through secreting proteins and changing their behavior accordingly. When we listen to humans speak to each other, we can understand how words are placed into sentences and how the conversation moves back and forth. When it comes to recording communication between cells, however, the key characteristics of the conversation have been largely unknown until now.

Communication between cells is necessary to maintain most functions in the body and can also help the body respond properly to an external cue - such as an ailment or injury. Current technology allows only broad snapshots of these protein secretions. "This is akin to detecting what words were spoken in a sentence, but not really knowing their placement, the inflection, and tone of the message," says Kshitiz. Prior to the current findings, he adds, understanding of the language of communication between cells has been very limited and did not capture the complexity of the messaging involved.

Using a combination of microfluidics and supercomputer modeling, researchers created a platform to record cell messages in depth, uncovering the precise ways in which the words and messages are arranged in these intercellular conversations.
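As a rough illustration of the snapshot-versus-recording distinction, the sketch below tracks synthetic per-protein secretion profiles over time and reports when each one peaks. The data and protein names are invented; the actual platform's microfluidic measurements and computational models are far richer.

```python
# Synthetic illustration: a single snapshot tells you which proteins are
# present, but a time series also reveals their temporal ordering.
import numpy as np

rng = np.random.default_rng(0)
times_h = np.linspace(0, 48, 49)  # hourly samples over 48 hours

# Invented secretion profiles: each protein peaks at a different time.
profiles = {
    "protein_A": np.exp(-((times_h - 6) ** 2) / 20),
    "protein_B": np.exp(-((times_h - 24) ** 2) / 50),
}
for name, signal in profiles.items():
    noisy = signal + rng.normal(0, 0.02, signal.shape)
    print(f"{name} peaks at ~{times_h[np.argmax(noisy)]:.0f} h")
```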

In the study, which was funded by the American Heart Association and the National Cancer Institute, Kshitiz and his team looked at stem cells from bone marrow that can be used to treat myocardial infarction, commonly known as a heart attack. Using the platform, the researchers recorded the proteins that were secreted by these stem cells, and how these secretions changed with time.

The information was used to create a protein cocktail that led to a second discovery - the possibility of aiding an injury without the use of stem cells. Since the researchers recorded in depth the conversations between the stem cells, they were able to copy the stem cells' exact behavior.

Stem cells, the researchers observed, are flexible enough to change their behavior depending on the injury present. These cells act as "Good Samaritans," the researchers discovered, only when they see injured tissue.

This information enabled a "cell-less" therapy: by copying what stem cells do when they see a tissue injury, the researchers created a new protein cocktail that aided in repairing cardiac tissue. Cell-less therapy could potentially reduce many of the complications associated with stem cell transplantation in the future.

"The findings solve a fundamental problem afflicting systems biology: measuring how cells communicate with each other," says Suhail. "The platform technology will open new lines of inquiry into research, by providing a unique way to detect how cells talk to each other at a deeper level than what is possible today."