CAPTION SMEAR II is a station for measuring environmental data in Hyytiälä, Finland. CREDIT Juho Aalto

We also need to share our data. So says one of the world's most prominent geoscientists, Markku Kulmala, professor of physics at the University of Helsinki, Finland, and head of the Aerosol and Haze Laboratory at the Beijing University of Chemical Technology, China.

Environmental challenges such as climate change, water and food security, and urban air pollution are all interlinked, yet each is studied separately. This situation is no longer sustainable. To tackle it, Professor Markku Kulmala calls for continuous, comprehensive monitoring of interactions between the planet's surface and atmosphere in an article published in Nature on January 4, 2018.

In his article "Build a global Earth observatory", he draws on his long experience of collecting environmental data. He has built not just one station but arguably the most impressive one: SMEAR II (Station for Measuring Ecosystem-Atmosphere Relationships), in the boreal forests of Finland, which shows how a rounded set of environmental measurements can be obtained.

Scaled up globally, the answer is a global Earth observatory consisting of 1,000 or more well-equipped ground stations around the world that track environments and key ecosystems fully and continuously. Data from these stations would be linked to data from satellite-based remote sensing, laboratory experiments and computer models.

"Incomplete coverage from ground stations is the main limit to observations of Earth's conditions. Satellites can continuously monitor some compounds, such as CO2, ozone and aerosols, almost planet-wide. But they cannot resolve processes or fluxes, or trace the hundreds more compounds of interest. Satellite data must be 'ground-truthed'," Professor Kulmala says.

This global observatory of 1,000 super stations needs to be established soon, within 10-15 years.

"The costs would be around €10 million (US$11.8 million) to €20 million per station, which can be compared to the building costs of the Large Hadron Collider at CERN, Geneva, Switzerland, or that of US President Donald Trump's proposed Mexican wall."

Nevertheless, there is no question that a shift in how environmental data are collected and disseminated is needed.

"There is a scientific interest in this data as well," Professor Markku Kulmala says. "Researchers could find new mechanisms and feedback loops in this coherent data set."

Nucleic acid sequencing methods, which determine the order of nucleotides in DNA fragments, are progressing rapidly. These processes yield large quantities of sequence data—some of which is dynamic—that help researchers understand how and why organisms function as they do. Sequencing also benefits epidemiological work, such as the identification, diagnosis, and treatment of genetic and/or contagious diseases. Advanced sequencing technologies reveal valuable information about the time evolution of pathogen sequences. Because researchers can estimate how a mutation behaves under the pressure of natural selection, they can predict the impact of each mutation—in terms of survival and propagation—on the fitness of the pathogen in question. These predictions lend insight into infectious disease epidemiology, pathogen evolution, and population dynamics.

In a paper published earlier this month in the SIAM Journal on Applied Mathematics, Ryosuke Omori and Jianhong Wu develop an inductive algorithm to study site-specific nucleotide frequencies using a multi-strain susceptible-infective-removed (SIR) model. A SIR model is a simple compartmental model that places each individual in a population at a given time into one of the three aforementioned categories to compute the theoretical number of people affected by an infectious disease. The authors use their algorithm to calculate Tajima’s D, a popular statistical test that measures natural selection at a specific site by analyzing differences in a sample of sequences from a population. In a non-endemic situation, Tajima’s D can change over time. Investigating the time evolution of Tajima’s D during an outbreak allows researchers to estimate mutations relevant to pathogen fitness. Omori and Wu aim to understand the impact of disease dynamics on Tajima’s D, thus leading to a better understanding of a mutation’s pathogenicity, severity, and host specificity.

The sign of Tajima’s D is determined by both natural selection and population dynamics. “Tajima's D equals 0 if the evolution is neutral — no natural selection and a constant population size,” Omori said. “A nonzero value of Tajima's D suggests natural selection and/or change in population size. If no natural selection can be assumed, Tajima's D is a function of the population size. Hence, it can be used to estimate time-series changes in population size, i.e., how the epidemic proceeds.”
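For readers who want the mechanics, Tajima's D compares two estimators of genetic diversity: the mean number of pairwise differences and the number of segregating sites scaled by a harmonic number. A minimal sketch using the standard constants from Tajima's 1989 definition (for simplicity, the degenerate S = 0 case is returned as 0 here):

```python
import math

def tajimas_d(sequences):
    """Compute Tajima's D for a sample of equal-length aligned sequences.

    D is near 0 under neutral evolution with constant population size;
    nonzero values suggest selection and/or changing population size.
    """
    n = len(sequences)

    # S: number of segregating (polymorphic) sites
    S = sum(1 for site in zip(*sequences) if len(set(site)) > 1)
    if S == 0:
        return 0.0  # D is undefined when there is no variation

    # pi: average number of pairwise nucleotide differences
    pairs = n * (n - 1) / 2
    diffs = sum(
        sum(a != b for a, b in zip(sequences[i], sequences[j]))
        for i in range(n) for j in range(i + 1, n)
    )
    pi = diffs / pairs

    # Normalizing constants from Tajima (1989)
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n**2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1 = c1 / a1
    e2 = c2 / (a1**2 + a2)

    return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))
```

Applied to a sample of pathogen sequences taken during an outbreak, repeated evaluations of this statistic over time give the kind of trajectory the authors analyze.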

Differential equations, which model the rates of change of the numbers of individuals in each model compartment, can describe population dynamics. In this case, the population dynamics of hosts infected with the strain carrying a given sequence are modeled by a set of differential equations for that sequence, which include terms describing the mutation rate from one sequence to another. When setting up their multi-strain SIR model, Omori and Wu assume that the population dynamics of the pathogen are proportional to the disease dynamics, i.e., that the number of pathogens is proportional to the number of infected hosts. This assumption allows the value of Tajima's D to change.
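As an illustration of this compartmental setup, the following is a minimal two-strain SIR model with symmetric mutation between strains, integrated with forward Euler. The parameter values are illustrative assumptions, not values from the paper:

```python
def simulate(beta=0.5, gamma=0.2, mu=0.01, days=200, dt=0.1):
    """Two-strain SIR with mutation; compartments are population fractions.

    beta: transmission rate, gamma: recovery rate,
    mu: mutation rate between the two strain compartments.
    """
    S, I1, I2, R = 0.99, 0.01, 0.0, 0.0
    history = []
    for _ in range(int(days / dt)):
        infection1 = beta * S * I1
        infection2 = beta * S * I2
        dS = -(infection1 + infection2)
        # Mutation moves infected hosts between strain compartments
        dI1 = infection1 - gamma * I1 - mu * I1 + mu * I2
        dI2 = infection2 - gamma * I2 - mu * I2 + mu * I1
        dR = gamma * (I1 + I2)
        S += dS * dt
        I1 += dI1 * dt
        I2 += dI2 * dt
        R += dR * dt
        history.append((S, I1, I2, R))
    return history
```

Even though strain 2 starts with no infected hosts, the mutation terms seed it from strain 1, which is how sequence diversity enters a model of this kind.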

In population genetics, researchers believe that the sign of Tajima’s D is affected by population dynamics. However, the authors show that in the case of a SIR deterministic model, Tajima’s D is independent of the disease dynamics (specifically, independent of the parameters for disease transmission rate and disease recovery rate). They also observe that while Tajima’s D is often negative during an outbreak’s onset, it frequently becomes positive with the passage of time. “The negative sign does not imply an expansion of the infected population in a deterministic model,” Omori said. “We also found the dependence of Tajima's D on the disease transmission dynamics can be attributed to the stochasticity of the transmission dynamics at the population level. This dependence is different from the aforementioned existing assumption about the relation between population dynamics and the sign of Tajima's D.”

Ultimately, Omori and Wu prove that Tajima’s D in a deterministic SIR model is completely determined by mutation rate and sample size, and that the time evolution of an infectious disease pathogen’s genetic diversity is fully determined by the mutation rate. “This work revealed some dependence of Tajima's D on the (disease transmission dynamics) basic reproduction number (R0) and mutation rate,” Omori said. “With the assumption of neutral evolution, we can then estimate mutation rate or R0 from sequence data.”

Given the demand for tools that analyze evolutionary and disease dynamics, the observation that Tajima’s D depends on the stochasticity of the dynamics is useful when estimating epidemiological parameters. For example, if sequences of pathogens are sampled from a small outbreak in a limited host population, then Tajima’s D depends on both the mutation rate and R0; therefore, a joint estimate of these parameters from Tajima’s D is possible. “We are applying this theoretic result to analyze real-world epidemiological data,” Omori said. “We should also see if our approach can be used to investigate non-equilibrium disease dynamics with natural selection.”

Ryosuke Omori was funded by a Japan Society for the Promotion of Science (JSPS) talented expert project, led by Hiroshi Nishiura at Hokkaido University.

Research teams, including one led by biostatistician Nicholas Reich at the University of Massachusetts Amherst, are participating in a national influenza forecasting challenge to try to predict the onset, progress and peaks of regional flu outbreaks to aid prevention and control. This year, the Reich Lab is leading an effort to improve the forecasting by increasing the collaboration between groups.

Reich explains, "Every year the Centers for Disease Control host a flu forecasting challenge. It's the most organized and public effort at forecasting any infectious disease anywhere in the world. Our lab is now in our third year of participating, and we find that each year we get a little better and learn a bit more."

"This year, we wanted to take it to the next level, so we worked with other teams year-round to develop a way that our models could work together to make a single best forecast for influenza. This entire effort is public, so anyone can go to the website and see the forecasts."

While this flu season has started earlier than usual in the northeastern and southern U.S. according to the most recent data, the forecasts are still showing a fair amount of uncertainty about how big a season it will be, says Reich. "The holiday season is a notoriously difficult time to forecast because typically fewer people go to the doctor, and yet everyone is traveling around spreading or being exposed to infections such as flu."

Reich and colleagues at UMass Amherst's School of Public Health and Health Sciences collaborate with teams at Carnegie Mellon University, Columbia University and a group at Los Alamos National Laboratory, New Mexico, in a group they have dubbed the "FluSight Network." It issues a new flu season forecast every Monday for public health researchers and practitioners that compares the flu trajectory this year to past years.

In a recent publication, Reich and colleagues state that their aim is to "combine forecasting models for seasonal influenza in the U.S. to create a single ensemble forecast. The central question is, can we provide better information to decision makers by combining forecasting models and specifically, by using past performance of the models to inform the ensemble approach." Reich adds, "We are working closely with our collaborators at the CDC to determine how to improve the timeliness and relevance of our forecasts."

To prepare for this flu season, he and colleagues spent many hours designing a standard structure that each team needed to use when submitting models. This allowed for comparison of methods over the past seven years of flu data in the U.S. They also conducted a cross-validation study of data from the past seven flu seasons to compare five different methods for combining models into a single ensemble forecast. They found that four of their collaborative ensemble methods had higher average scores than any of the individual models.
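The idea of weighting component models by past performance can be sketched simply. In this hypothetical example (not the FluSight Network's actual method), each model's weight is the geometric mean of the probabilities it assigned to the outcomes that actually occurred, and the ensemble is the weighted average of the component distributions:

```python
import math

def log_score_weights(past_forecasts, observed_bins):
    """past_forecasts[m] is a list of per-week bin distributions for model m;
    observed_bins holds the bin that actually occurred each week."""
    weights = []
    for model in past_forecasts:
        score = sum(math.log(week[obs]) for week, obs in zip(model, observed_bins))
        weights.append(math.exp(score / len(observed_bins)))  # geometric-mean probability
    total = sum(weights)
    return [w / total for w in weights]

def ensemble(forecasts, weights):
    """Weighted average of the component distributions for the current week."""
    n_bins = len(forecasts[0])
    return [sum(w * f[b] for w, f in zip(weights, forecasts)) for b in range(n_bins)]
```

A model that consistently put high probability on what actually happened gets a larger say in the combined forecast, which is the intuition behind using past performance to inform the ensemble.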

The team is now submitting forecasts from its best-performing model, posting them once a week this season to the CDC's 2017-18 FluSight Challenge. Reich estimates that about 20 teams are participating in the CDC challenge nationwide this year, producing about 30 different models. Each model forecasts the onset of the flu season, how it will progress over the coming weeks, when it will peak, and how intense the peak will be compared to other seasons.

In a heavy flu season, between 5 and 12 percent of doctor's visits are for influenza-like illness, a number that varies regionally in the U.S. This metric is one of the key indicators the CDC uses to gauge how bad the flu season is, and it is the measure used in the forecasting challenges.

Reich says, "Certainly for the CDC, there are policy decisions that could be impacted by these forecasts, including the timing of public communication about flu season starting and when to get vaccinated. Models can help with all of that. Also, hospitals often try to have enhanced precautions in place during a certain peak period for the disease. If you do that too early, or for too long, you run the risk of individuals getting tired of taking the extra time to comply with the policies."

Hospital epidemiologists and others responsible for public health decisions do not declare the onset of flu season lightly, Reich says. In hospitals, flu onset - a technical set of symptoms reported to physicians - triggers many extra time-consuming and costly precautions and procedures, such as added gloves, masks and gowns, donning and doffing time, special decontamination procedures, increased surveillance and reduced visitor access. There is also healthcare worker fatigue to consider. Hospitals want their preparations and response to be as effective and efficient as possible, to reduce time and money spent and worker burnout.

The public health effort to improve flu season forecasts is relatively recent, Reich says. "There has been tremendous progress in how we think about infectious disease forecasting in just the last five years," he notes. "If you compare that to something like weather forecasting, which has been going on for decades, we're in the middle of a long process of learning and improvement. Someday we might be able to imagine having a flu forecast on our smart phones that tells us, for example, it's an early season and I'd better get Mom to the clinic to get her vaccination early this year. We're close, but that's not here quite yet."

Credit: NRAO/AUI/NSF: D. Berry

Big data distinguish between different theoretical models

Three months of observations with the National Science Foundation's Karl G. Jansky Very Large Array (VLA) have allowed astronomers to zero in on the most likely explanation for what happened in the aftermath of the violent collision of a pair of neutron stars in a galaxy 130 million light-years from Earth. What they learned means that astronomers will be able to see and study many more such collisions.

On August 17, 2017, the LIGO and VIRGO gravitational-wave observatories combined to locate the faint ripples in spacetime caused by the merger of two superdense neutron stars. It was the first confirmed detection of such a merger and only the fifth direct detection ever of gravitational waves, predicted more than a century ago by Albert Einstein.

The gravitational waves were followed by outbursts of gamma rays, X-rays, and visible light from the event. The VLA detected the first radio waves coming from the event on September 2. This was the first time any astronomical object had been seen with both gravitational waves and electromagnetic waves.

The timing and strength of the electromagnetic radiation at different wavelengths provided scientists with clues about the nature of the phenomena created by the initial neutron-star collision. Prior to the August event, theorists had proposed several ideas -- theoretical models -- about these phenomena. As the first such collision to be positively identified, the August event provided the first opportunity to compare predictions of the supercomputer models to actual observations.

Astronomers using the VLA, along with the Australia Telescope Compact Array and the Giant Metrewave Radio Telescope in India, regularly observed the object from September onward. The radio telescopes showed the radio emission steadily gaining strength. Based on this, the astronomers identified the most likely scenario for the merger's aftermath.

"The gradual brightening of the radio signal indicates we are seeing a wide-angle outflow of material, traveling at speeds comparable to the speed of light, from the neutron star merger," said Kunal Mooley, now a National Radio Astronomy Observatory (NRAO) Jansky Postdoctoral Fellow hosted by Caltech.

The observed measurements are helping the astronomers figure out the sequence of events triggered by the collision of the neutron stars.

The initial merger of the two superdense objects caused an explosion, called a kilonova, that propelled a spherical shell of debris outward. The neutron stars collapsed into a remnant, possibly a black hole, whose powerful gravity began pulling material toward it. That material formed a rapidly-spinning disk that generated a pair of narrow, superfast jets of material flowing outward from its poles.

If one of the jets were pointed directly toward Earth, we would have seen a short-duration gamma-ray burst, like many seen before, the scientists said.

"That clearly was not the case," Mooley said.

Some of the early measurements of the August event suggested instead that one of the jets may have been pointed slightly away from Earth. This model would explain the fact that the radio and X-ray emission were seen only some time after the collision.

"That simple model -- of a jet with no structure (a so-called top-hat jet) seen off-axis -- would have the radio and X-ray emission slowly getting weaker. As we watched the radio emission strengthening, we realized that the explanation required a different model," said Alessandra Corsi, of Texas Tech University.

The astronomers looked to a supercomputer model published in October by Mansi Kasliwal of Caltech, and colleagues, and further developed by Ore Gottlieb, of Tel Aviv University, and his colleagues. In that model, the jet does not make its way out of the sphere of explosion debris. Instead, it gathers up surrounding material as it moves outward, producing a broad "cocoon" that absorbs the jet's energy.

The astronomers favored this scenario based on the information they gathered using the radio telescopes. Soon after the initial observations of the merger site, the Earth's annual trip around the Sun placed the object too close to the Sun in the sky for X-ray and visible-light telescopes to observe. For weeks, the radio telescopes were the only way to continue gathering data about the event.

"If the radio waves and X-rays both are coming from an expanding cocoon, we realized that our radio measurements meant that, when NASA's Chandra X-ray Observatory could observe once again, it would find the X-rays, like the radio waves, had increased in strength," Corsi said.

Mooley and his colleagues posted a paper with their radio measurements, their favored scenario for the event, and this prediction online on November 30. Chandra was scheduled to observe the object on December 2 and 6.

"On December 7, the Chandra results came out, and the X-ray emission had brightened just as we predicted," said Gregg Hallinan, of Caltech.

"The agreement between the radio and X-ray data suggests that the X-rays are originating from the same outflow that's producing the radio waves," Mooley said.

"It was very exciting to see our prediction confirmed," Hallinan said. He added, "An important implication of the cocoon model is that we should be able to see many more of these collisions by detecting their electromagnetic, not just their gravitational, waves."

Mooley, Hallinan, Corsi, and their colleagues reported their findings in the scientific journal Nature.

Today’s quantum technologies are set to revolutionize information processing, communications, and sensor technology in the coming decades. The basic building blocks of future quantum processors are, for example, atoms, superconducting quantum electronic circuits, spin crystals in diamonds, and photons. In recent years it has become clear that none of these quantum building blocks can meet all the requirements on its own: receiving and storing quantum signals, processing them, and transmitting them. A research group headed by Professors József Fortágh, Reinhold Kleiner and Dieter Kölle of the University of Tübingen Institute of Physics has succeeded in linking magnetically-stored atoms on a chip with a superconducting microwave resonator. The linking of these two building blocks is a significant step towards the construction of a hybrid quantum system of atoms and superconductors, which will enable the further development of quantum processors and quantum networks. The study has been published in Nature Communications.

Quantum states allow especially efficient algorithms which far outstrip the conventional options to date. Quantum communications protocols enable, in principle, unhackable data exchange. Quantum sensors yield the most precise physical measurement data. “To apply these new technologies in everyday life, we have to develop fundamentally new hardware components,” Fortágh says. Instead of the conventional signals used in today’s technology – bits – which can only be a one or a zero, the new hardware will have to process far more complex quantum entangled states.

“We can only achieve full functionality via the combination of different quantum building blocks,” Fortágh explains. Superconducting circuits allow fast calculations, but they can store information only on very short time scales. Neutral atoms hovering over a chip’s surface, which interact only weakly with their environment, are ideal for quantum storage and, as emitters of photons, for signal transmission. For this reason, the researchers connected the two components into a hybrid in their latest study. The hybrid quantum system combines nature’s smallest quantum electronic building blocks – atoms – with artificial circuits – the superconducting microwave resonators. “We use the functionality and advantages of both components,” says the study’s lead author, Dr. Helge Hattermann. “The combination of the two unequal quantum systems could enable us to create a real quantum processor with superconducting quantum lattices, atomic quantum storage, and photonic qubits.” Qubits are the quantum analogue of the bits used in conventional computing: the smallest unit of quantum information.

The new hybrid system for future quantum processors and their networks forms a parallel with today’s technology, which is also a hybrid, as a look at your computer hardware shows: Calculations are made by microelectronic circuits; information is stored on magnetic media, and data is carried through fiber-optic cables via the internet. “Future quantum computers and their networks will operate on this analogy – requiring a hybrid approach and interdisciplinary research and development for full functionality,” Fortágh says.

The role and risks of bots, such as automated Twitter accounts, in influencing public opinion and political elections continues to provoke intense international debate and controversy. An insightful new collection of articles focused on “Computational Propaganda and Political Big Data” examines how these bots work, approaches to better detect and control them, and how they may have impacted recent elections around the globe. The collection is published in a special issue of Big Data, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers, and is available free on the Big Data website.

Guest Editors Philip N. Howard, PhD and Gillian Bolsover, DPhil, from University of Oxford, U.K. have compiled a compelling series of articles that provide a broad range of perspectives and examples on the use of bots in social and online media to achieve massive dissemination of propaganda and political messages.

In the article entitled "Social Bots: Human-Like by Means of Human Control?" coauthors Christian Grimme, Mike Preuss, Lena Adam, and Heike Trautmann, University of Münster, Germany offer a clear definition of a bot and description of how one works using Twitter as an example. The authors examine the current technical limitations of bots and how the increasing integration of humans and bots can expand their capabilities and control, leading to more meaningful interactions with other humans. This hybridization of humans and bots requires more sophisticated approaches for identifying political propaganda distributed with social bots.

The article "Detecting Bots on Russian Political Twitter," coauthored by Denis Stukal, Sergey Sanovich, Richard Bonneau, and Joshua Tucker from New York University, presents a novel method for detecting bots on Twitter. The authors demonstrate the use of this method, which is based on a set of classifiers, to study bot activity during an important period in Russian politics. They found that on most days, more than half of the tweets posted by accounts tweeting about Russian politics were produced by bots. They also present evidence that a main activity of these bots was spreading news stories and promoting certain media outlets.
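As a purely hypothetical illustration of feature-based bot scoring (this is not the method of Stukal et al.; the features, weights, and threshold below are invented), an account could be scored with a logistic function of simple behavioral features:

```python
import math

def bot_score(tweets_per_day, retweet_fraction, distinct_sources):
    """Logistic score from simple account-level features (toy example).

    Hand-picked weights: heavy posting and heavy retweeting raise the score,
    while posting from several distinct clients (suggesting a human) lowers it.
    """
    z = 0.05 * tweets_per_day + 3.0 * retweet_fraction - 1.5 * distinct_sources
    return 1.0 / (1.0 + math.exp(-z))

def is_bot(account, threshold=0.5):
    """Flag an account whose score exceeds the decision threshold."""
    return bot_score(**account) > threshold
```

A real classifier ensemble would learn such weights from labeled accounts and combine many more features, but the thresholding structure is the same.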

Fabian Schäfer, Stefan Evert, and Philipp Heinrich, Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany, evaluated more than 540,000 tweets from before and after Japan's 2014 general election, and present their results on the identification and behavioral analysis of social bots in the article entitled "Japan's 2014 General Election: Political Bots, Right-Wing Internet Activism, and Prime Minister Shinzō Abe’s Hidden Nationalist Agenda." The researchers focused on the repeated tweeting of nearly identical messages and present both a case study demonstrating multiple patterns of bot activity and potential methods to allow for automated identification of these patterns. They also provide insights into the purposes behind the use of social and political bots.

“Big Data is proud to present the first collection of academic articles on a subject that is front and center these days,” says Big Data Editor-in-Chief Vasant Dhar, Professor at the Stern School of Business and the Center for Data Science at New York University. “While social media platforms have created a wonderful social space for sharing and publishing information, it is becoming clear that they also pose significant risks to societies, especially open democratic ones such as ours, for use as weapons by malicious actors. We need to be mindful of such risks and take action to mitigate them. One of the roles of this special issue is to present evidence and scientific analysis of computational propaganda on social media platforms. I’m very pleased with the set of articles in this special issue.”

Pukao are large, cylindrical stones made from a volcanic rock known as 'red scoria.' Weighing multiple tons, they were placed on the heads of the moai during prehistoric times, consistent with the Polynesian traditions of honoring their ancestors. CREDIT Carl Lipo

Analysis of giant stone hats found on Rapa Nui, Chile (Easter Island) provides evidence contrary to the widely held belief that the ancient civilization had a warrior culture. According to a new study conducted by a team of researchers, including a professor at Binghamton University, State University of New York, these stone hats suggest that the people of Rapa Nui were part of a supportive and inclusive community.

Carl Lipo, anthropology professor and director of the Environmental Studies Program at Binghamton University, and a team of researchers studied the monumental statues (moai) on Rapa Nui, and the previously unacknowledged giant stone hats (pukao) that were placed atop them. Pukao are large, cylindrical stones made from a volcanic rock known as 'red scoria.' Weighing multiple tons, they were placed on the heads of the moai during prehistoric times, consistent with the Polynesian traditions of honoring their ancestors. 

The researchers produced the first study analyzing the pukao and their significance, examining the 70 multi-ton giant hats scattered around the island that have gradually eroded over time. Using photography to produce 3D computer models, the researchers were able to study the pukao in greater detail and discovered that there are far more drawings carved into the hats than was previously thought.

"With the building mitigating any sense of conflict, the moai construction and pukao placement were key parts to the success of the island," said Lipo. "In our analysis of the archaeological records, we see evidence that demonstrates the prehistoric communities repeatedly worked together to build monuments. The action of cooperation had a benefit to the community by enabling sharing of information and resources."

While Easter Island is famous, the archaeological record of the island is not well-documented, said Lipo. He believes that scientists can learn a great deal from the pukao by examining this new information.

"Every time we look at the archaeological record of the island, we are surprised by what we find. There is much more to be learned from this remarkable place -- important answers that shed light on the abilities of our ancestors, as well as potential ideas for contemporary society about what it takes to survive on a tiny and remote island," said Lipo.

Researchers in Brazil calculated the overall electron structure of the vacancy region of a crystal lattice through the unprecedented use of a hybrid functional method, which yielded results compatible with experimental data.

A study conducted at the University of São Paulo's Physics Institute (IF-USP), Brazil, has resolved a longstanding controversy dogging the international community of researchers dedicated to investigating defects in graphene. The controversy is related to the calculation of the overall electronic structure of defects. This configuration, which comprises many variables, was described in different ways depending on the researcher and the model used. The solution, which is identical for all models and compatible with experimental findings, was obtained by Chilean Ana María Valencia García and her PhD supervisor, Marília Junqueira Caldas, Full Professor at IF-USP.

An article authored by both researchers has been published in the journal Physical Review B with the title "Single vacancy defect in graphene: Insights into its magnetic properties from theoretical modeling". The journal's editors chose one of the figures from the article for inclusion in the Kaleidoscope section, which promotes interest in the esthetics of physics by featuring images selected for their artistic appeal.

García received a PhD scholarship from Chile's National Scientific & Technological Research Commission (CONICYT), while Caldas was supported by the National Organic Electronics Institute (INEO), which is funded jointly by FAPESP and Brazil's National Council for Scientific & Technological Development (CNPq).

"There were divergences in the community regarding whether the vacancy formed by removing a single carbon atom from a graphene sheet's crystal lattice causes a weak or strong magnetic moment, and regarding the strength of the magnetic interaction between vacancies," Caldas said. The vacancy prompts the surrounding atoms to rearrange themselves into new combinations to accommodate the absence of an atom, forming electron clusters known as "floating orbitals" at the vacant site.

Three important variables are associated with the phenomenon: electron density, i.e., how the electrons are distributed; electron levels, i.e., the energy levels occupied by the electrons; and magnetic moment, i.e., the torque produced in the electrons by an external magnetic field.

First use of the hybrid method in graphene

Reflecting on the divergence, Caldas found it strange that its proponents were all excellent researchers affiliated with renowned international institutions. Studies conducted by the group revealed that the divergent values derived from the use of different simulation methods.

"There are two ways to calculate the overall electron structure of the vacancy region, both derived from quantum mechanics: the Hartree-Fock (HF) method, and density functional theory (DFT). In DFT the calculation is performed by making each electron interact with average electron density, which includes the electron in question. In HF the operator used excludes the electron and considers only its interaction with the others. HF produces more precise results for electron structure but the calculation is far more laborious," Caldas said.

"The two methods are often combined by means of hybrid functionals, which have been mentioned in the scientific literature since the end of the twentieth century. I worked with them myself some time ago in a study on polymers, but they had never been used in the case of graphene. What Ana María [Valencia García] and I did was discover the hybrid functional that best describes the material. Applied to several models using supercomputer simulation, our hybrid functional produced the same result for them all and this result matched the experimental data."
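The mixing idea behind hybrid functionals can be sketched in a few lines. The snippet below is a toy illustration only, not the functional used in the study: the mixing fraction a = 0.25 follows the well-known PBE0 recipe, and the energy values fed in are made-up placeholders.

```python
# Toy sketch of the hybrid-functional idea: a fraction "a" of exact
# Hartree-Fock (HF) exchange replaces the same fraction of the DFT
# exchange, while the DFT correlation term is kept unchanged.
def hybrid_xc_energy(e_x_hf, e_x_dft, e_c_dft, a=0.25):
    """E_xc(hybrid) = a * E_x(HF) + (1 - a) * E_x(DFT) + E_c(DFT).

    a = 0.25 is the mixing fraction of the PBE0 functional; other
    hybrids (e.g., B3LYP) use different fractions and extra terms.
    """
    return a * e_x_hf + (1.0 - a) * e_x_dft + e_c_dft

# Placeholder energies (in hartree), purely for illustration:
print(round(hybrid_xc_energy(e_x_hf=-1.00, e_x_dft=-0.90, e_c_dft=-0.30), 6))
# -1.225
```

In real calculations the three energy terms come from a self-consistent quantum chemistry code; the point here is only that the "hybrid" is a weighted combination of the two methods described above.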

Besides resolving the controversy, which had lasted years, and having one of its images selected for esthetic value, another interesting aspect of this research is the problem that motivated it. "We came to it via the interest aroused by a material known as anthropogenic dark earth or ADE," Caldas explained. "ADE is a kind of very dark, fertile soil found in several parts of the world including the Amazon. It retains moisture even at high temperatures and remains fertile even under heavy rain. It's called anthropogenic because its composition derives from middens and cultivation by indigenous populations in the pre-Columbian period at least two millennia ago. The intriguing properties of this material were known to result from multi-stacked layers of graphene nanoflakes. It was our interest in ADE that led us to study the phenomenon of vacancy in graphene sheets."

In conclusion, it should be noted that there are potential applications of vacancy in graphene sheets, since information can be encoded in the defect and not in the entire structure. Much more research will be needed before applications can be developed, however.

Artificial intelligence (AI) is giving researchers at the University of Waterloo new insights to help reduce wear-and-tear injuries and boost the productivity of skilled construction workers.

Studies using motion sensors and AI software have revealed expert bricklayers use previously unidentified techniques to limit the loads on their joints, knowledge that can now be passed on to apprentices in training programs.

"The people in skilled trades learn or acquire a kind of physical wisdom that they can't even articulate," said Carl Haas, a professor of civil and environmental engineering at Waterloo. "It's pretty amazing and pretty important."

Surprisingly, the research shows master masons don't follow the standard ergonomic rules taught to novices. Instead, they develop their own ways of working quickly and safely, such as swinging blocks rather than lifting them and bending their backs less.

"They're basically doing the work twice as fast with half the effort - and they're doing it with higher quality," said Haas, who leads the research with Eihab Abdel-Rahman, a systems design engineering professor at Waterloo. "It's really intriguing."

In their first study, the researchers analyzed data from bricklayers of various experience levels who wore sensor suits while building a wall with concrete blocks. The data showed experts put less stress on their bodies, but were able to do much more work.

A follow-up study was done to determine how master masons work so efficiently. It involved the use of sensors to record their movements and AI programs to identify patterns of body positions.

The researchers now plan to do more in-depth study of how the experts move on the job.

"Skilled masons work in ways we can show are safer, but we don't quite understand yet how they manage to do that," said Haas, who compares their skill to a professional golf swing. "Now we need to understand the dynamics."

Musculoskeletal injuries are a significant problem in bricklaying, causing many apprentices to drop out and many experienced workers to prematurely wear out.

As part of their work, the researchers are now developing a system that uses sensor suits to give trainees immediate feedback so they can modify their movements to reduce stress on their bodies.

"There is an unseen problem with craft workers who are just wearing out their bodies," he said. "It's not humane and it's not good for our economy for skilled tradespeople to be done when they're 50."

  1. Supercomputational modeling key to design of supercharged virus-killing nanoparticles
  2. UW's Hyak supercomputer overcomes obstacles in peptide drug development
  3. Brigham and Women's Hospital's Golden findings show potential use of AI in detecting spread of breast cancer
  4. NUS scientist develops new toolboxes for quantum cybersecurity
  5. UCLA chemists synthesize narrow ribbons of graphene using only light, heat
  6. New machine learning algorithm recognizes distinct dolphin clicks in underwater recordings
  7. UB's Pokhriyal uses machine learning framework to create new maps for fighting extreme poverty
  8. SwRI’s Marchi models how massive collisions after moon formation remodeled early Earth
  9. Canadian swarm-based supercomputer simulation strategy proves significantly shorter
  10. German astrophysicist simulates the size of neutron stars on the brink of collapse
  11. Patkar’s new supercomputational framework shows shifting protein networks in breast cancer may alter gene function
  12. Army researchers join international team to defeat disinformation cyberattacks
  13. Johns Hopkins researcher Winslow builds new supercomputer model that sheds light on biological events leading to sudden cardiac death
  14. Italian scientist Baroni creates new method to supercompute heat transfer from optimally short MD simulations
  15. French researchers develop supercomputer models for better understanding of railway ballast
  16. Finnish researchers develop method to measure neutron star size using supercomputer modeling based on thermonuclear explosions
  17. Using machine learning algorithms, German team finds identifying stock microstructure useful in financial crises
  18. Russian prof Sergeyev introduces methodology working with numerical infinities and infinitesimals; opens new horizons in a supercomputer called Infinity
  19. ASU, Chinese researchers develop human mobility prediction model that offers scalability, a more practical approach
  20. Marvell Technology buys rival chipmaker Cavium for $6 billion in a cash-and-stock deal
