A simple vibration test can help oil and gas companies prevent pipeline spills in a way that is faster and cheaper than conventional methods, a UBC study shows.

The study, conducted at UBC's Okanagan campus, found that pipeline imperfections could be identified by "tapping" the side of a pipe and comparing the resulting vibrations against those predicted by supercomputer models, a technique known as modal analysis.

"After developing the mathematical platform and entering it into a computer, we can predict what the level of vibration should be if the pipeline that is being tapped is free of imperfections," says Hadi Mohammadi, an assistant professor of engineering. "When I conducted the tap test on actual pipeline material and looked at the resulting patterns of vibrations, weak points could quickly be identified.

"This method of attaching small machines to above-ground pipelines and having them tap and measure vibrations offers a faster and cheaper way to find cracks or patches of internal rust than the conventional method of using imaging techniques."

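In broad terms, the comparison works by extracting the dominant vibration frequencies from the tap response and checking how far they have drifted from the frequencies predicted for a defect-free pipe. The sketch below is a minimal illustration of that idea only; the frequencies, the 5 per cent tolerance and the function names are assumptions made for the example, not details from the study.

```python
import numpy as np
from scipy.signal import find_peaks

def dominant_frequencies(signal, sample_rate, n_modes=3):
    """Return the n_modes strongest spectral peaks of a tap response (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    peaks, _ = find_peaks(spectrum)                      # local maxima only
    strongest = peaks[np.argsort(spectrum[peaks])][-n_modes:]
    return np.sort(freqs[strongest])

def flag_defect(measured, predicted, tolerance=0.05):
    """Flag a section if any modal frequency shifts by more than `tolerance`
    (here 5 %) relative to the defect-free model prediction."""
    shifts = np.abs(measured - predicted) / predicted
    return bool(np.any(shifts > tolerance))

# Hypothetical numbers, for illustration only.
predicted = np.array([120.0, 310.0, 545.0])   # Hz, modes of a sound pipe section
sample_rate = 5000                            # Hz, accelerometer sampling rate
t = np.arange(0, 1.0, 1.0 / sample_rate)

# Simulated tap response whose second mode has shifted (e.g. wall thinning).
response = (np.sin(2 * np.pi * 120 * t)
            + 0.6 * np.sin(2 * np.pi * 285 * t)
            + 0.3 * np.sin(2 * np.pi * 545 * t)) * np.exp(-3 * t)

measured = dominant_frequencies(response, sample_rate)
print("measured modes (Hz):", np.round(measured, 1))
print("possible imperfection:", flag_defect(measured, predicted))
```

A shifted or missing mode, as in the simulated response above, is the kind of deviation that would prompt closer inspection of that section of pipe.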
Mohammadi, whose research focuses on bio-engineering, began applying his "tap test" theory to pipeline material after testing its validity on human bones.

The "tap test" was equally useful in identifying areas of deficient bone density, which could help identify conditions such as osteoporosis.

Mohammadi's study was recently published in the Journal of Pipeline Engineering. 

Assistant Professor Hadi Mohammadi.

An international team of scientists from Los Alamos National Laboratory, Imperial College London (IC), and Kiel University (CAU), headed by Professor Michael Bonitz of the Institute of Theoretical Physics and Astrophysics at CAU and Professor Matthew Foulkes of the Department of Physics at IC, has achieved a major breakthrough in the description of warm dense matter – one of the most active frontiers in plasma physics and material science. This exotic state of matter is characterized by the simultaneous presence of strong quantum effects, thermal excitations, and strong interaction effects, and differs completely from the usual solid, liquid, gas and plasma states commonly found on Earth. A full understanding of the interplay of these three effects has been lacking until now. The tri-national team of scientists published their research findings in the current edition of Physical Review Letters. 

With a temperature from ten to ten thousand times room temperature and a density anywhere between one and a few thousand times that of ordinary solids, warm dense matter is completely different from the normal solids found on Earth. These extreme conditions exist within astrophysical objects such as planet cores and white dwarf atmospheres. “An improved understanding of warm dense matter will be the key to answering fundamental questions in astrophysics. For example, it will help to determine the age of galaxies,” explains Bonitz.

It is now possible to create warm dense matter routinely in the lab by exciting solids with powerful lasers. The warm dense conditions last for only a few microseconds, but this is long enough to allow measurements to be made, and the new data have sparked an explosion of activity in the field. In particular, researchers need to overcome the challenges arising from this extreme state of matter in order to master inertial confinement fusion, which is considered the most promising energy source of the future.

Until now, a theoretical description of warm dense matter has required crude approximations of the underlying physics. Exact simulation methods have been developed, but they were restricted to small model systems containing only a few particles in a limited parameter range. The research teams led by Bonitz and Foulkes overcame these obstacles by using three complementary simulation techniques recently developed in Kiel and London, in combination with a novel approach to removing the errors that arise from the limited size of the simulated systems. This allowed them to obtain the first accurate thermodynamic results for the electron component of warm dense matter. Their simulations required an enormous amount of supercomputer resources and would have taken about 200 years on a single desktop computer.
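As a rough illustration of the finite-size problem, results from simulations of a few dozen to a few hundred particles have to be extrapolated to the thermodynamic limit of infinitely many particles. The sketch below fits a simple 1/N correction to invented energies; the actual correction scheme in the publication is considerably more sophisticated, so this is only a schematic of the idea.

```python
import numpy as np

# Hypothetical energies per electron (in Hartree) from simulations of
# N = 34, 66 and 100 particles; the values are invented for illustration.
N = np.array([34, 66, 100])
energy_per_particle = np.array([-0.4512, -0.4537, -0.4545])

# Assume the leading finite-size error decays as 1/N:
#     E(N) ~ E_inf + a / N
# and fit a straight line in the variable 1/N to extrapolate N -> infinity.
a, E_inf = np.polyfit(1.0 / N, energy_per_particle, 1)

print(f"extrapolated energy per particle: {E_inf:.4f} Ha")
```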

“Our results will constitute the basis of future warm dense matter research,” says Bonitz.

Original publication:
Tobias Dornheim, Simon Groth, Travis Sjostrom, Fionn D. Malone, W. M. C. Foulkes and Michael Bonitz: Ab initio Quantum Monte Carlo simulation of the warm dense electron gas in the thermodynamic limit. Physical Review Letters, vol. 117, article 156403 (2016)

The Excess project combines methodology and software with hardware, such as processors from Movidius, to achieve impressive results in energy efficiency.

A European research project led by Chalmers University of Technology has launched a set of tools that will make computer systems more energy efficient – a critical issue for modern supercomputing. Using the project's framework, programmers have been able to implement large data streaming aggregations that are 54 times more energy efficient than standard implementations.

Energy consumption is one of the key challenges of modern computing, whether for wireless embedded client devices or supercomputing centers. The ability to develop energy-efficient software is crucial, as the use of data and data processing keeps increasing in all areas of society. The need for power-efficient computing is not driven by environmental impact alone; energy-efficient computing is needed simply to deliver on the trends that have been predicted.

The EU-funded Excess project, which finishes August 31, set out three years ago to address what the researchers perceived as a lack of holistic, integrated approaches covering all system layers from hardware to user-level software, and the limits this placed on how far existing solutions could be exploited and made energy efficient. The researchers initially analyzed where energy and performance are wasted, and based on that knowledge they have developed a framework that should allow energy-efficient software to be developed rapidly.

“When we started this research program there was a clear lack of tools and mathematical models to help software engineers program in an energy-efficient way, and to reason abstractly about the power and energy behavior of their software,” says Philippas Tsigas, professor in Computer Engineering at Chalmers University of Technology and project leader of Excess. “The holistic approach of the project involves both hardware and software components together, enabling the programmer to make power-aware architectural decisions early. This allows for larger energy savings than previous approaches, where software power optimization was often applied as a secondary step, after the initial application was written.”

The Excess project has taken major steps towards providing software developers and system designers with a set of tools and models that allow them to program in an energy-efficient way. The toolbox spans from fundamentally new energy-saving hardware components, such as the Movidius Myriad platform, to sophisticated, efficient libraries and algorithms.

Tests run on large data streaming aggregations, a common operation in real-time data analytics, have shown impressive results. Using the Excess framework, a programmer can deliver a solution that is 54 times more energy efficient than a standard implementation on a high-end PC processor. The holistic Excess approach first exploits the hardware benefits of an embedded processor, and then shows the best way to split the computations inside the processor to enhance performance even further.
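A figure like “54 times more energy efficient” compares the useful work done per joule of energy on the two platforms. The sketch below shows how such a ratio is computed from measured throughput and power draw; all numbers are hypothetical placeholders, not Excess measurements.

```python
# Energy efficiency of a streaming aggregation expressed as tuples processed
# per joule. The throughput and power figures are hypothetical placeholders.

def tuples_per_joule(throughput_tuples_per_s: float, power_watts: float) -> float:
    """Work done per joule: throughput divided by power draw."""
    return throughput_tuples_per_s / power_watts

pc = tuples_per_joule(throughput_tuples_per_s=2_000_000, power_watts=95.0)
embedded = tuples_per_joule(throughput_tuples_per_s=1_200_000, power_watts=1.0)

print(f"high-end PC processor: {pc:,.0f} tuples/J")
print(f"embedded processor:    {embedded:,.0f} tuples/J")
print(f"improvement factor:    {embedded / pc:.0f}x")
```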

Movidius, a partner in the Excess project and developer of the Myriad platform of vision processors, has integrated both technology and methodology developed in the project into its standard development kit hardware and software offering. In the embedded processor business, HPC-class features have gradually migrated onto embedded platforms. The rapid development of autonomous vehicles such as cars and drones, of driver-assist systems, and of home-assist robotics in general (e.g. vacuum cleaners and lawnmowers) has led to various computer vision algorithms being ported to embedded platforms. Traditionally these algorithms were developed on high-performance desktop computers and systems, making them difficult to redeploy to embedded systems, and they were not developed with energy efficiency in mind. The Excess project has enabled and directed the development of tools and software development methods that aid the porting of HPC applications to the embedded environment in an energy-efficient way.

Soil is a major carbon pool whose impact on climate change is still not fully understood. According to a recent study, however, soil carbon stocks could be modeled more accurately by factoring in the impacts of both soil nutrient status and soil composition. Determining the volume of carbon dioxide efflux from soil is important for enabling better choices in forest management with respect to curbing climate change.

Knowledge of the extent and regional variation of soil carbon stocks is vital. Current soil carbon stock predictions are unreliable, and it is difficult to estimate the volume of carbon dioxide efflux emitted from soil as a result of climate change.

The study, which was a joint venture of the Natural Resources Institute Finland (Luke), the Swedish University of Agricultural Sciences (SLU) and the Japanese Forestry and Forest Products Research Institute (FFPRI), focused on analysing a Swedish soil carbon inventory data set and comparing actual soil carbon measurements to soil carbon models.

Limitations of soil carbon models

The study revealed that the models accurately predicted soil carbon stock levels in typical barren and mesotrophic forests. The carbon stocks of fertile and fine-grained soils, on the other hand, were underestimated by the models.

The soil carbon stock predictions of the models were mainly dependent on the quantity and quality of the litter generated by vegetation. This leads us to the conclusion that the models are incapable of predicting carbon stocks deeper within fertile and fine-grained soils.
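The structure shared by such litter-driven models can be illustrated with a single carbon pool whose stock is set entirely by litter input and a decay rate; real inventory models use several interacting pools, and the parameter values below are hypothetical.

```python
def soil_carbon_steady_state(litter_input: float, decay_rate: float) -> float:
    """Steady-state carbon stock of a single pool fed by annual litter input
    (kg C per m2 per year) and losing the fraction `decay_rate` each year:
        dC/dt = litter_input - decay_rate * C   =>   C* = litter_input / decay_rate
    """
    return litter_input / decay_rate

# Hypothetical values: 0.25 kg C/m2/yr of litter, 2 % of the stock decomposing per year.
print(soil_carbon_steady_state(litter_input=0.25, decay_rate=0.02))  # 12.5 kg C/m2
```

In a formulation like this the predicted stock tracks litter input directly, which is consistent with the finding that carbon stored deeper in fertile, fine-grained soils is missed.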

"Incorporating the long-term effects of carbon into the models would make them more accurate and allow us to predict the carbon cycle more reliably in the future," explains Senior Research Scientist Aleksi Lehtonen from the Natural Resources Institute Finland, who contributed to the study.

The Purdue University-led Open Ag Data Alliance and partner Servi-Tech, Inc. have announced a commercial demonstration of the alliance's Real-Time Connections initiative, continuing their mission to help farmers make better use of data in their daily decisions across all of their operations.

Servi-Tech, the largest crop consulting and agronomic services company in the nation, and OADA worked together to harness the power of OADA application programming interface (API) standards and publish an open, nonproprietary, cloud-based data exchange for weather and soil moisture data. Because the data exchange paradigm is published as open source, anyone is free to contribute to it and use it.

The open source computer code is available now and resides at https://github.com/oada/oada-formats. The project will be discussed at the Data Standards session during the InfoAg 2016 conference Tuesday through Thursday (Aug. 2-4) in St. Louis.
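To give a sense of what an open, nonproprietary exchange of soil moisture data looks like in practice, the sketch below builds a single reading and prepares it for an authenticated REST call. The field names, URL and token are hypothetical placeholders; the actual document schemas are defined in the oada/oada-formats repository linked above.

```python
import json
import urllib.request

# Hypothetical soil-moisture reading; the field names are illustrative only --
# the actual document schemas live in the oada/oada-formats repository.
reading = {
    "location": {"latitude": 37.65, "longitude": -97.42},
    "depth_cm": 30,
    "volumetric_water_content": 0.21,
    "timestamp": "2016-08-02T14:00:00Z",
}

# A reading like this could be pushed to a cloud endpoint with an ordinary
# authenticated REST call. The URL and token below are placeholders, not a
# real OADA service, so the actual send is left commented out.
request = urllib.request.Request(
    "https://example.com/bookmarks/soil-moisture/readings",
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <token>"},
    method="POST",
)
# with urllib.request.urlopen(request) as response:
#     print(response.status)

print(json.dumps(reading, indent=2))
```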

"Servi-Tech is pleased to support any development to make weather and soil moisture data more accessible and understandable to farmers in their daily operations," said Greg Ruehle, president and CEO of Servi-Tech. "Having an open and freely available data standard for real-time reporting, independent of any specific manufacturer or service provider, helps to make better day-to-day agronomic decisions. When ag data is recorded and transferred in a uniform way across any technology platform, farmers achieve better outcomes."

The alliance sees the real-time API connection as complementing other open data exchange projects, such as those from AgGateway, said Aaron Ault, project lead for OADA and senior research engineer for the Open Ag Technology and Systems Group at Purdue. He is also a grain and beef farmer.

"When Servi-Tech approached us about working together on a project to help design an API for weather and soil moisture data using the OADA open source framework, we recognized a great opportunity to get some real data automatically flowing in a fast, modern, published and secure way that others across the ag industry can use and benefit from," Ault said. "We are excited to see how this open real-time API can be built upon and enhanced by the community to further promote data interoperability in the future."

The Open Ag Data Alliance was formed in early 2014 as an open source project with widespread industry support, headed by the Open Ag Technology and Systems Group (OATS) at Purdue. Its goal is to help the industry get data flowing automatically for farmers so they can reap the benefits of data-driven decisions and stop wrangling data and incompatible systems. The alliance has since grown to over 25 commercial partners worldwide.

The Real-Time Connections initiative is a core feature of the upcoming OATS Center at Purdue, designed to bring together agriculture industry partners to rapidly develop data-integrated systems that harness the power of open source for grass-roots innovation and adoption.

More information about OADA is available at its website http://openag.io, and information about OATS is at https://engineering.purdue.edu/oatsgroup/.

 
