FAU's mangrove root model may hold the key to preventing coastal erosion

Study is the first to quantify optimal mangrove root hydrodynamics with a predictive model

Mangrove vegetation, which grows naturally along subtropical shorelines, provides a wide range of ecosystem functions such as reducing coastal erosion, promoting biodiversity, and removing nitrogen, phosphorus, and carbon dioxide. These vital ecological functions are influenced by the water flow around the intricate mangrove roots, which creates a complex, energetic process that stirs up sediments and generates a depositional region behind the roots. How these mangrove roots interact with water flow is believed to be a key element in mitigating coastal erosion.

Accurately forecasting hydrodynamic erosion and the amount of mangrove coverage needed has been a challenge for managers and restoration experts designing successful projects. That is because two critical pieces of information have been missing: a characterization of the near-bed boundary layer around mangrove roots and its effect on root-driven erosion; and a quantitative understanding of mangrove root erosion and habitat requirements based on optimal porosity.

Florida Atlantic University's Oscar M. Curet, Ph.D., an associate professor in the Department of Ocean and Mechanical Engineering within the College of Engineering and Computer Science, spearheaded the research with his co-authors and is the first to quantify optimal mangrove root hydrodynamics with a predictive model. For the study, the researchers used simplified mangrove root-type models with different porosities to investigate the impact of porosity on the initial motion of sediments, which is critical to the evolution of shorelines, deltas, and land. The predictive model takes into account the mangrove roots' porosity and the near-bed turbulence effect.

The study identified the pivotal role of mangrove root porosity and provides insight into the sediment transport and erosion processes that govern how shorelines take shape. Field studies spanning a wide range of spatiotemporal parameters could extend the current results to successfully predict mangrove erosion outcomes on estuarine shorelines.

"Our data address the first informational need for global restoration communities with mangrove habitats and will bring about opportunities for interdisciplinary collaboration with the environmental and ecological engineering community," said Amirkhosro Kazemi, Ph.D., lead author and a post-doctoral research associate in the Department of Ocean and Mechanical Engineering. "Furthermore, understanding the hydrodynamics and scaling of this problem could contribute to the design and development of a bio-inspired mangrove-like system for coastal protection globally, especially in the subtropical regions where mangrove growth is possible."

Characterizing the hydrodynamics of mangrove-like structures could explain the primary mechanisms behind their resilience and how mangrove roots withstand high-energy fluid conditions. For example, the researchers observed that most sediments erode in the high-porosity case (less blockage), while the sediment deposition region for the low-porosity patch (φ = 47 percent) had the largest area, signifying an optimal porosity to mitigate erosion for a fixed root configuration. This information has the potential to improve future coastal infrastructure design with bio-mimetic mangrove-like structures.

The study also suggests that optimal porosity in shoreline design may add habitat flexibility to sites on the borderline of mangrove habitat suitability. This optimal porosity would raise the critical velocity at which sediment transport begins. The increase in critical velocity has biological importance: it could increase nutrients around the roots, increase energy dissipation so the roots can withstand high flow speeds, and control changes to the substrate bottom to facilitate the spread of mangrove swamps.
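The study's fitted model is not reproduced here, but the role of a critical velocity can be illustrated with a short sketch: a standard Shields-type estimate of the critical shear velocity for grain motion, combined with a purely hypothetical porosity factor that peaks near the reported φ = 47 percent. The functional form and the constant `k` below are illustrative assumptions, not the paper's result:

```python
import math

def critical_shear_velocity(d50, rho_s=2650.0, rho_w=1000.0,
                            theta_cr=0.047, g=9.81):
    """Critical shear velocity (m/s) for grain motion from the Shields
    criterion: u*_cr = sqrt(theta_cr * g * (s - 1) * d50)."""
    s = rho_s / rho_w  # specific gravity of the sediment
    return math.sqrt(theta_cr * g * (s - 1.0) * d50)

def porosity_adjusted(u_star_cr, phi, phi_opt=0.47, k=0.5):
    """Hypothetical adjustment: the critical velocity peaks near an
    optimal patch porosity phi_opt (illustrative only, not the fit
    reported in the study)."""
    return u_star_cr * (1.0 + k * math.exp(-((phi - phi_opt) / 0.15) ** 2))

u_cr = critical_shear_velocity(d50=0.25e-3)  # 0.25 mm sand
for phi in (0.35, 0.47, 0.60, 0.90):
    print(f"phi={phi:.2f}: adjusted u*_cr = {porosity_adjusted(u_cr, phi):.4f} m/s")
```

In this toy picture, a patch near the optimal porosity tolerates the highest flow speed before sediment starts to move, which is the benefit the study attributes to an optimal root configuration.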

"Roots that do not exceed the critical porosity for maximum energy dissipation may have adaptive benefits, for example, the ability to tolerate brackish waters in depositional environments," said Curet. "Increasing mangrove coverage through pre-restoration grading can potentially increase the likelihood of restoration success by decreasing erosion, with a higher energy dissipation that increases the resistance of mangrove roots to the energy in tidal flows."

Down the road, Kazemi, Curet, and co-author Luciano Castillo, Ph.D., Kenninger Professor of Renewable Energy and Power Systems in the School of Mechanical Engineering at Purdue University, propose machine learning (ML) as an alternative way to predict three-dimensional sediment transport under oscillating flow conditions, using the available dataset of video images together with state-of-the-art deep learning and ML algorithms. With abundant sediment transport data, ML algorithms can find patterns and structures in the data to produce a viable morphodynamic model. Based on the training dataset, these algorithms can learn, infer, and predict physical phenomena, with potential applications in flow control, energy harvesting, and erosion mitigation.

"This important research by Dr. Kazemi, Professor Castillo, and Dr. Curet contributes to filling a gap in understanding near-bed flow and is a step toward accurate prediction of sediment transport in vegetated regions, which shapes them in nature," said Stella Batalama, Ph.D., dean of the College of Engineering and Computer Science. "The optimal configuration porosity range and the critical velocity presented in this study can provide useful guidance for coastal managers restoring estuarine mangrove forests or planting mangroves as part of living shoreline stabilization."

Russian mathematician boosts domain decomposition method for asynchronous parallel supercomputing

A RUDN University mathematician in Moscow and his colleagues from France and Hungary developed an algorithm for parallel supercomputing that allows solving applied problems in fields such as electrodynamics and hydrodynamics. The gain in time is up to 50%. The results are published in the Journal of Computational and Applied Mathematics.

Parallel supercomputing methods are often used to solve practical problems in physics, engineering, biology, and other fields. Several processors joined in a network work simultaneously on a single problem, each handling its own small part. How the work is distributed between the processors, and how they "communicate" with each other, is chosen based on the specifics of a particular problem. One possible method is domain decomposition: the study domain is divided into separate parts -- subdomains -- according to the number of processors. When that number is very high, especially in heterogeneous high-performance computing (HPC) environments, asynchronous processes become a valuable ingredient. Usually, Schwarz methods are used, in which the subdomains overlap each other. This provides accurate results but is not suitable when overlapping the subdomains is not straightforward. The researchers' new algorithm makes asynchronous decomposition easier in many structural cases: the subdomains do not overlap, the result remains accurate, and less time is needed for computation.

"Until now, almost all investigations of asynchronous iterations within domain decomposition frameworks targeted methods of the parallel Schwarz type. A first, and sole, attempt to deal with primal nonoverlapping decomposition resulted in simultaneously iterating on the subdomains and on the interface between them. That means the computation scheme is defined on the whole global domain," said Guillaume Gbikpi-Benissan of the Engineering Academy of RUDN University.

The mathematicians proposed an algorithm based on the Gauss-Seidel method. The essence of the innovation is that the calculation is not run simultaneously on the entire domain, but alternately on the subdomains and the boundaries between them. As a result, the values obtained during each iteration within a subdomain can be immediately used for calculations on the boundary at no additional cost.
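The alternating idea can be sketched for the simplest case: a 1D Poisson problem split into two nonoverlapping subdomains. Each sweep solves the subdomains with the current interface value, then immediately reuses the fresh interior values to update the interface, in Gauss-Seidel fashion. This is a minimal synchronous sketch under simplifying assumptions (two subdomains, direct subdomain solves); the published method is asynchronous and far more general:

```python
import numpy as np

def solve_subdomain(n, h, f, left_bc, right_bc):
    """Direct solve of -u'' = f on a subdomain with n interior
    unknowns and Dirichlet values left_bc, right_bc."""
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1))
    b = np.full(n, h * h * f)
    b[0] += left_bc          # boundary terms enter the right-hand side
    b[-1] += right_bc
    return np.linalg.solve(A, b)

def alternating_poisson(N=40, f=1.0, iters=200):
    """Nonoverlapping decomposition of -u'' = f on (0, 1) with
    u(0) = u(1) = 0. Alternate: (1) solve both subdomains using the
    current interface value, (2) update the interface from the fresh
    neighboring interior values -- a Gauss-Seidel-style sweep over
    subdomains, then the interface."""
    h = 1.0 / N
    m = N // 2               # interface node index (x = 0.5)
    gamma = 0.0              # initial guess for u at the interface
    for _ in range(iters):
        u_left = solve_subdomain(m - 1, h, f, 0.0, gamma)
        u_right = solve_subdomain(N - m - 1, h, f, gamma, 0.0)
        # discrete equation at the interface node:
        # -u[m-1] + 2*u[m] - u[m+1] = h^2 * f
        gamma = 0.5 * (u_left[-1] + u_right[0] + h * h * f)
    return gamma

# exact solution of -u'' = 1 with zero boundaries is u(x) = x(1 - x)/2,
# so the interface value should converge to u(0.5) = 0.125
print(alternating_poisson())
```

A production version would check a residual instead of fixing the iteration count, and would let each processor update its subdomain asynchronously, which is precisely where the time gain reported above comes from.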

The mathematicians tested the new algorithm on the Poisson equation and on a linear elasticity problem. The first is used, for example, to describe electrostatic fields; the second is used in solid mechanics to describe the deformation of elastic bodies. The new method was faster than the original for both problems. A gain of up to 50% was indeed achieved: with 720 subdomains, the computation of the Poisson equation took 84 seconds, while the original algorithm took 170 seconds. Moreover, the number of synchronous alternating iterations decreases as the number of subdomains increases.

"It is a quite interesting behavior, which can be explained by the fact that the ratio of alternation increases as the subdomain sizes are reduced and more interface appears. This work therefore encourages further promising investigations of the asynchronous computing paradigm," said Gbikpi-Benissan.

CMU builds machine learning that mines nature for drug discovery

Researchers from Carnegie Mellon University's Computational Biology Department in the School of Computer Science have developed a new process that could reinvigorate the search for natural product drugs to treat cancers, viral infections, and other ailments.

The machine learning algorithms developed by the Metabolomics and Metagenomics Lab match the signals of a microbe's metabolites with its genomic signals and identify which pairs likely correspond to a natural product. Knowing that, researchers are better equipped to isolate the natural product and begin developing it into a possible drug.
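As a toy illustration of the matching idea (not NRPminer's actual algorithm), one can compare observed metabolite masses against masses predicted from genome-mined peptide candidates. The peptide sequences, observed masses, and tolerance below are hypothetical; the residue masses are standard monoisotopic values:

```python
# Monoisotopic residue masses (Da) for a few amino acids, plus water
# for the termini of a linear peptide.
RESIDUE_MASS = {"G": 57.021, "A": 71.037, "V": 99.068, "L": 113.084, "S": 87.032}
WATER = 18.011

def predicted_mass(peptide):
    """Mass predicted for a linear peptide encoded by a gene cluster."""
    return sum(RESIDUE_MASS[r] for r in peptide) + WATER

def match(observed_masses, candidate_peptides, tol=0.02):
    """Pair each observed metabolite mass with any genome-predicted
    peptide whose mass agrees within the tolerance tol (Da)."""
    hits = []
    for mz in observed_masses:
        for pep in candidate_peptides:
            if abs(mz - predicted_mass(pep)) <= tol:
                hits.append((mz, pep))
    return hits

peptides = ["GAVL", "GASL", "AVLS"]   # hypothetical genome-mined candidates
observed = [358.222, 332.17]          # hypothetical mass-spectrometry signals
print(match(observed, peptides))
```

Real tools like NRPminer must additionally handle post-assembly modifications, fragmentation spectra, and enormous candidate spaces, which is where the sensitivity gains described below come in.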

"Natural products are still one of the most successful paths for drug discovery," said Bahar Behsaz, a project scientist in the lab and lead author of a paper about the process. "And we think we're able to take it further with an algorithm like ours. Our computational model is orders of magnitude faster and more sensitive."

In a single study, the team was able to scan the metabolomics and genomic data for about 200 strains of microbes. The algorithm not only identified the hundreds of natural product drugs the researchers expected to find, but it also discovered four novel natural products that appear promising for future drug development. 

The team has developed NRPminer, an artificial intelligence tool to aid in discovering non-ribosomal peptides (NRPs). NRPs are an important type of natural product and are used to make many antibiotics, anticancer drugs, and other clinically used medications. They are, however, difficult to detect and even more difficult to identify as potentially useful.

"What is unique about our approach is that our technology is very sensitive. It can detect molecules with nanograms of abundance," said Hosein Mohimani, an assistant professor and head of the lab. "We can discover things that are hidden under the grass."

Most of the antibiotic and antifungal medications, and many of the antitumor medications, discovered and widely used have come from natural products.

Penicillin is among the most used and well-known drugs derived from natural products. It was, in part, discovered by luck, as are many of the drugs made from natural products. But replicating that luck is difficult in the laboratory and at scale. Trying to uncover natural products is also time and labor-intensive, often taking years and millions of dollars. Major pharmaceutical companies have mostly abandoned the search for new natural products in the past decades.

By applying machine learning algorithms to the study of genomics, however, researchers have created new opportunities to identify and isolate natural products that could be beneficial.

"Our hope is that we can push this forward and discover other natural drug candidates and then develop those into a phase that would be attractive to pharmaceutical companies," Mohimani said. "Bahar Behsaz and I are expanding our discovery methods to different classes of natural products at a scale suitable for commercialization."

The team is already investigating the four new natural products discovered during their study. The products are being analyzed by a team led by Helga Bode, head of the Institute for Molecular Bioscience at Goethe University in Germany, and two have been found to have potential antimalarial properties.