University of Arizona scientists outline 10 simple rules for the computational modeling of behavioral data

The guidelines are designed to help researchers avoid many potential pitfalls in the computational modeling of cognitive and neuroscience data

New guidelines for scientists who use computational modeling to analyze behavioral data have been published today in the open-access journal eLife.

The goal of computational modeling in the behavioral sciences is to use precise mathematical models to make better sense of data concerning behaviors. These data often come in the form of choices, but can also include reaction times, eye movements and other behaviors that are easy to observe, and even neural data. The mathematical models consist of equations that link the variables behind the data, such as stimuli and past experiences, to behavior in the immediate future. In this way, computational models provide a kind of hypothesis about how behavior is generated.

"Using computers to simulate and study behavior has revolutionized psychology and neuroscience research," explains co-author Robert Wilson, Assistant Professor in Cognition/Neural Systems and Director of the Neuroscience of Reinforcement Learning Lab at the University of Arizona, US. "Fitting computational models to experimental data allows us to achieve a number of objectives, which can include probing the algorithms underlying behavior and better understanding the effects of drugs, illnesses, and interventions."

There are four key uses of computational modeling across the scientific literature, according to Wilson and his co-author Anne Collins, Principal Investigator at the Computational Cognitive Neuroscience (CCN) Lab, part of the Department of Psychology and the Helen Wills Neuroscience Institute at the University of California, Berkeley, US. Each of these practices has its own strengths and weaknesses, and each can be mishandled in ways that lead to incorrect or misleading conclusions, highlighting the need for modeling to be carried out responsibly.

To address this need, Wilson and Collins offer their 10 simple rules, designed for both beginners and seasoned researchers, to ensure that computational modeling is used with care and yields meaningful insights into what a model says about the mind.

Their rules encompass a number of principles that include: designing effective experiments with computational modeling in mind; generating, simulating, comparing and validating models; extracting variables from models to compare with physiological data; reporting on the analyses; and, finally, advice on the next steps once the reporting is completed.

While these guidelines cover the simplest modeling techniques accessible to beginners, they are also applicable more generally. For clarity, the authors focus on a single narrow domain - reinforcement learning models applied to choice data - although the same techniques apply more widely to other observable behaviors.
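To make the authors' chosen domain concrete, the sketch below simulates choices from one of the simplest reinforcement learning models of this kind: a Rescorla-Wagner learner with softmax choice in a two-option bandit task. This is a generic textbook model, not code from the eLife paper; the function name and the parameter values for the learning rate and inverse temperature are illustrative choices.

```python
import random
import math

def simulate_rw_softmax(rewards, alpha=0.3, beta=5.0, seed=0):
    """Simulate choices from a Rescorla-Wagner learner with softmax choice.

    rewards: list of (p_reward_option0, p_reward_option1) per trial.
    alpha: learning rate; beta: inverse temperature (hypothetical values).
    Returns the list of chosen options (0 or 1) across trials.
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]                      # value estimates for the two options
    choices = []
    for p0, p1 in rewards:
        # softmax probability of choosing option 1 given current values
        p_choose_1 = 1.0 / (1.0 + math.exp(-beta * (q[1] - q[0])))
        c = 1 if rng.random() < p_choose_1 else 0
        # draw a binary reward from the chosen option's payoff probability
        r = 1.0 if rng.random() < (p1 if c == 1 else p0) else 0.0
        # Rescorla-Wagner update: move the estimate toward the outcome
        q[c] += alpha * (r - q[c])
        choices.append(c)
    return choices

# Example: option 1 pays off 80% of the time, option 0 only 20%
trials = [(0.2, 0.8)] * 200
choices = simulate_rw_softmax(trials)
print(sum(choices) / len(choices))   # the learner should come to prefer option 1
```

Fitting such a model to real choice data would then mean searching for the alpha and beta that make the observed choices most likely, which is exactly the kind of step the 10 rules are designed to keep honest.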

"Our work highlights how to avoid common pitfalls and misinterpretations that can arise with computational modeling," Collins explains. "We learned many of these lessons the hard way, by actually making these mistakes for ourselves over a combined 20-plus years in the field.

"By following these guidelines, we hope other scientists will avoid some of the errors that slowed down our own research," she adds. "We would also hope to start seeing improvements in the quality of computational modeling in the behavioral sciences."

University of São Paulo researcher presents supercomputer simulations for understanding the transport of aerosols at FAPESP Week France

A study produced in Brazil and presented during FAPESP Week France aims to elucidate the behavior of aerosols, which have an important influence on climate, agriculture, and human health.

One of the speakers at FAPESP Week France, held in Lyon and Paris until November 27th, Livia Freire, a researcher at the Institute of Mathematical and Computing Sciences of the University of São Paulo (ICMC-USP), has been developing supercomputer simulations to broaden knowledge of the transport of aerosols.

"Our interest lies in knowing how these particles are transported by atmospheric flows, which are very complicated movements to simulate and understand because they are turbulent. We are developing numerical models that simulate flows in the atmosphere and how they transport particles. The aim is to obtain simple equations that researchers in other areas can use to understand particle concentrations in the atmosphere," she said. 

The tiny solids or liquids suspended in the air are called particulate matter. This material is only micrometers (thousandths of millimeters) in size. Particulate matter in the air, whether in the form of dust, fog, or smoke, is called aerosol. The sources of aerosols can be natural or anthropogenic, such as pollution. The smoke that leaves truck exhausts, factory chimneys, or fires in fields or forests is easily visible. Similarly, sand storms can be observed from kilometers away. However, the compounds in these two types of formations usually go unnoticed, and a large amount of the particles that circulate everywhere are unseen.

These suspended microscopic particles play an important role in climate and rainfall and can affect human health. Because of this, they are the object of studies by scientists in various countries, and Brazil is at the front line of this research, with projects such as GoAmazon, supported by FAPESP, the São Paulo Research Foundation (read more at agencia.fapesp.br/29665).

"I'm developing a study funded by FAPESP whose aim is to understand the behavior of the particles that circulate in the atmosphere. These are various types of tiny particles, which we cannot see but which can have a significant impact on our lives, on health, on agriculture, on climate," she said.

The researcher explained that to predict the behavior of particles, such as their concentration at a particular time and location, it is necessary to understand the turbulent flows present in the region that corresponds to the first hundreds of meters of the atmosphere, known as the atmospheric boundary layer. "This is a region that concentrates all the energy, gas, and particle exchanges between the atmosphere and the elements that compose the planet's surface," she said.

"The problem of turbulent flows is very complex since it involves various scales, ranging from the scale of the atmosphere itself to other very small ones, such as the turbulent vortices that transport particles. To simulate that on a computer, we need to represent all these scales, which means a significant increase in computing costs. It is a big challenge to represent all the different components in the atmosphere in a [supercomputer] system with a viable cost," said Freire.

Large-Eddy Simulation

The researcher mentioned that the atmospheric boundary layer has turbulent flows whose most faithful computational representation is obtained using a technique called Large-Eddy Simulation (LES).

"Due to its complex nature, the study of turbulence is based on the use of numerical simulations combined with experimental data analysis. For atmospheric flows, the use of the LES technique provides important indicators regarding the unique behavior of turbulence, and significant progress has been made in developing models for material and energy transport in the atmosphere in simplified conditions," she said.

"For example, the average concentration of fine particles emitted from a flat, bare soil region can be represented by a simple flux-profile relationship, a result obtained using LES," she said.
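A flux-profile relationship of the kind Freire describes links a surface emission flux to the mean concentration at any height. The sketch below implements the standard neutral-stability, logarithmic form from surface-layer similarity theory as an illustration; it is a textbook relation, not necessarily the exact result from Freire's LES study, and all numerical values are hypothetical.

```python
import math

KAPPA = 0.4  # von Kármán constant

def mean_concentration(z, c_ref, z_ref, flux, u_star):
    """Neutral surface-layer flux-profile relationship for a scalar:

        c(z) = c_ref - flux / (KAPPA * u_star) * ln(z / z_ref)

    z, z_ref: heights above the surface (m); c_ref: concentration at z_ref;
    flux: surface emission flux; u_star: friction velocity (m/s).
    A positive (upward) flux means concentration decreases with height.
    All input values here are hypothetical.
    """
    return c_ref - flux / (KAPPA * u_star) * math.log(z / z_ref)

# Hypothetical dust emission: concentration at 2 m, given a value at 0.1 m
c_2m = mean_concentration(2.0, c_ref=50.0, z_ref=0.1, flux=0.8, u_star=0.3)
print(c_2m)
```

Relations like this are exactly the "simple equations that researchers in other areas can use": once LES has confirmed the profile shape, predicting concentrations no longer requires a supercomputer.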

According to Freire, advances in computational power offer an opportunity to investigate more complex problems, such as particle transport in the presence of forests and cities.

"We are using LES, an advanced numerical tool, to develop new models that enable us to explain the transport of particles in the atmosphere. This can increase our understanding and our ability to predict their role in the environment," she said.

In the research with LES, Freire has also been working with professors Leandro Franco de Souza, from ICMC-USP, and Amauri Pereira de Oliveira from USP's Institute of Astronomy, Geophysics, and Atmospheric Sciences, and with David Richter, from the University of Notre Dame.


University of Bonn and Dutch researchers use state-of-the-art algorithms to identify aggressive breast cancer

The study by the University of Bonn shows to what extent cancer research can benefit from the results of mouse models

Aggressive forms of breast cancer often manipulate the immune response in their favor. This manipulation is revealed in humans by the same immunological "signature" as in mice. This is shown by a study carried out by scientists from the University of Bonn together with Dutch colleagues. Their method makes it possible to obtain an indication of the prognosis of the disease using patients' tumor tissue. The results are published in the journal Cell Reports.

When a tumor starts to grow in the body, it usually does not go unnoticed by the immune system: Macrophages, a certain form of the body's own defense troops, migrate to the cancer cells. They are supposed to flow around the diseased cells, digest them and thus eliminate them. But sometimes tumor cells manage to escape their adversaries. Not just that: They even use the macrophages for their own purposes and grow even faster as a result.

[Image caption: Macrophages (brown), the scavengers of the immune system, migrate into the diseased tissue (cancer cells: blue) without destroying it. Credit: © Karin E. de Visser/the Netherlands Cancer Institute]

To do this, they reprogram the immune cells: They ensure that certain genes in the macrophages are switched off and others switched on. This changes the genetic "signature" of the macrophages. "This changed signature, in turn, reveals whether the tumor has a good or bad prognosis," explains Dr. Thomas Ulas from the LIMES Institute (the acronym stands for "Life and Medical Sciences") at the University of Bonn.

Gene activity also depends on the tissue

In order to identify the changes caused by the tumor, it is necessary to know which genes are normally active in the macrophages. However, this varies considerably, depending on the organ in which the scavenger cells perform their service. Experts also speak of "tissue painting": The tissue makes its mark on the immune cells.

In addition, tumor-induced changes are not always identical but differ from one patient to another. "Depending on which mutation is responsible for breast cancer, other functions are switched on or off in the macrophages," stresses Ulas. It is therefore very difficult to study these complex correlations directly using patients' tissue samples.

To overcome this obstacle, the scientists cooperated with a working group from the Netherlands. Tumor biologist Prof. Dr. Karin de Visser has been working for many years on mouse lines affected by certain, strictly defined types of breast cancer. "We have now searched these animals for the signature of the scavenger cells in the tumors," says Ulas. To this end, the bioinformatics expert and his colleagues isolated macrophages from mice affected by breast cancer and compared them with those from healthy breast tissue. They were able to identify the genetic differences between the scavenger cells using state-of-the-art supercomputer algorithms.
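The comparison described above boils down to two steps: derive a gene "signature" from the difference between tumor-associated and healthy macrophages, then check how consistently another sample's expression changes agree with it. The sketch below illustrates that idea in miniature; it is not the study's actual pipeline, and the gene names, expression values, and threshold are all hypothetical.

```python
import math

def log2_fold_changes(tumor, healthy):
    """Per-gene log2 fold change between tumor and healthy expression.

    A pseudocount of 1 avoids division by zero for unexpressed genes.
    """
    return {g: math.log2((tumor[g] + 1) / (healthy[g] + 1)) for g in tumor}

def signature(fold_changes, threshold=1.0):
    """Signature = direction (+1 up, -1 down) of every gene whose
    expression shifts at least 2-fold (|log2 FC| >= 1) in the tumor."""
    return {g: (1 if fc > 0 else -1)
            for g, fc in fold_changes.items() if abs(fc) >= threshold}

def signature_score(sample_fc, sig):
    """Directional agreement between a sample's fold changes and a
    signature: +1.0 = full agreement, -1.0 = full disagreement."""
    hits = [(1 if sample_fc[g] > 0 else -1) == direction
            for g, direction in sig.items() if g in sample_fc]
    return (2 * sum(hits) / len(hits) - 1) if hits else 0.0

# Toy expression values for mouse macrophages (hypothetical genes, counts)
mouse_tumor   = {"Arg1": 40.0, "Il10": 24.0, "Tnf": 3.0,  "Actb": 100.0}
mouse_healthy = {"Arg1": 5.0,  "Il10": 3.0,  "Tnf": 12.0, "Actb": 98.0}

sig = signature(log2_fold_changes(mouse_tumor, mouse_healthy))
# Hypothetical log2 fold changes from a human patient sample
human_fc = {"Arg1": 2.1, "Il10": 1.7, "Tnf": -1.4}
print(sig)
print(signature_score(human_fc, sig))
```

In this toy case the housekeeping gene stays out of the signature while the three shifted genes define it, and the human sample agrees with the mouse-derived signature in every direction, mirroring the paper's finding that the mouse signature transfers to patients with the same tumor type.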

Mouse results transferable to humans

They also found almost identical signatures in the scavenger cells of many breast cancer patients. "In this case, it was possible to transfer the mouse results directly to humans," explains Prof. Dr. Joachim Schultze, head of the Genomics and Immunoregulation team at the LIMES Institute. "However, the prerequisite was that the patients suffered from the same form of breast cancer as the animals." The results also demonstrate how important it is to develop specific mouse models depending on the type of cancer.

The results can be used not just to predict tumor aggressiveness: After all, the signature also provides information on the cancer cells' survival strategies. This may eventually lead to the development of new countermeasures. Ulas: "However, it will certainly take many years for new treatment options to emerge, if any."