This seems to be true for the research group led by Mengchu Zhou, a Distinguished Professor of electrical and computer engineering at the New Jersey Institute of Technology.
In a recent study published in the IEEE/CAA Journal of Automatica Sinica, Zhou uses the Petri net to analyze and model a microgrid.
The Petri net, named after its inventor Carl Adam Petri, is a mathematical modeling language that has been in existence for several decades. Originally invented to illustrate chemical processes, the Petri net has not only survived the test of time, but also continues to expand as scientists devise ways to use it to solve new problems.
One such new problem is modeling a power grid, specifically a microgrid that incorporates renewable energy sources. According to the U.S. Department of Energy, microgrids are more efficient and are often more environmentally friendly. More importantly, a microgrid can disconnect from the main power grid and function autonomously, rendering it less susceptible to power grid breakdowns and more reliable during emergencies.
In Zhou's study, he analyzes a microgrid consisting of a wind turbine, photovoltaic cell, battery, and diesel generator. He uses the Hybrid Petri net (HPN) to model this microgrid to account for both discrete and continuous events. Examples of discrete events include instances where the photovoltaic cell or battery is either turned on or off. Continuous events include the amounts of energy stored in the cell or battery, which are real values that can change continuously over time.
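The split between discrete and continuous behavior is the defining feature of a hybrid Petri net. The toy sketch below illustrates that split in Python; the names and numbers (a battery place, a charge rate in kilowatts) are illustrative assumptions, not the model from the paper:

```python
# Toy hybrid Petri net: discrete places hold integer tokens (on/off states),
# continuous places hold real-valued marks (stored energy).
# Illustrative only; not the model from Zhou's study.

class HybridPetriNet:
    def __init__(self):
        self.discrete = {"battery_on": 0, "battery_off": 1}  # token counts
        self.continuous = {"battery_charge_kwh": 5.0}        # real-valued mark

    def fire_switch_on(self):
        # Discrete transition: move the token from 'off' to 'on'.
        if self.discrete["battery_off"] >= 1:
            self.discrete["battery_off"] -= 1
            self.discrete["battery_on"] += 1

    def evolve(self, dt_hours, charge_rate_kw):
        # Continuous transition: energy flows only while the battery is on.
        if self.discrete["battery_on"] >= 1:
            self.continuous["battery_charge_kwh"] += charge_rate_kw * dt_hours

net = HybridPetriNet()
net.evolve(1.0, 2.0)   # battery still off: mark unchanged
net.fire_switch_on()   # discrete event: switch the battery on
net.evolve(1.0, 2.0)   # continuous evolution: +2 kWh over one hour
print(net.continuous["battery_charge_kwh"])  # 7.0
```

The key point the sketch captures is that discrete firings (switching on) gate the continuous flows (charging), which is exactly why a plain discrete Petri net would not suffice for a microgrid.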
Zhou's modeling illustrates how the microgrid behaves under different conditions, such as strong wind or weak sunlight. Such conditions affect the ability of the wind turbine and photovoltaic cell to support the microgrid's energy demands, and determine whether the generator should come on or if the battery needs to discharge its energy. This analysis yields data that present a clear picture of various schemes that the microgrid can operate within, as well as their respective outcomes. This information also helps engineers estimate the time and cost required for each grid component to switch its operating state.
Zhou's team is hopeful that HPN modeling can help identify the most efficient operational schemes for different microgrids and thus improve microgrid design. He wants to further develop this modeling method to analyze more complex microgrids that can meet our increasing energy demands.
Researchers from the Institute for Quantum Computing (IQC) at the University of Waterloo led the development of a new extensible wiring technique capable of controlling superconducting quantum bits, representing a significant step toward the realization of a scalable quantum supercomputer.
"The quantum socket is a wiring method that uses three-dimensional wires based on spring-loaded pins to address individual qubits," said Jeremy Béjanin, a PhD candidate with IQC and the Department of Physics and Astronomy at Waterloo. He and Thomas McConkey, PhD candidate from IQC and the Department of Electrical and Computer Engineering at Waterloo, are lead authors on the study that appears in the journal Physical Review Applied as an Editors' Suggestion and is featured in Physics. "The technique connects classical electronics with quantum circuits, and is extendable far beyond current limits, from one to possibly a few thousand qubits."
One promising implementation of a scalable quantum supercomputing architecture uses a superconducting qubit, which is similar to the electronic circuits currently found in a classical computer, and is characterized by two states, 0 and 1. Quantum mechanics makes it possible to prepare the qubit in superposition states, meaning that the qubit can be in states 0 and 1 at the same time. To initialize the qubit in the 0 state, superconducting qubits are brought down to temperatures close to -273 degrees Celsius inside a cryostat, or dilution refrigerator.
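The superposition idea can be made concrete in a few lines. The sketch below represents a qubit as two complex amplitudes whose squared magnitudes give measurement probabilities; it is a generic textbook illustration, not a model of the superconducting hardware in the study:

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1. The squared magnitudes are the
# probabilities of measuring 0 or 1. Generic illustration only.

ground = (1.0 + 0j, 0.0 + 0j)        # the initialized |0> state (after cooling)
h = 1 / math.sqrt(2)
superposition = (h + 0j, h + 0j)     # equal superposition of 0 and 1

def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(probabilities(ground))         # always measures 0
print(probabilities(superposition))  # 0 or 1 with roughly equal probability
```

Cooling the chip to near absolute zero is what makes the first line physically true: thermal excitations would otherwise scramble the amplitudes before any computation began.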
To control and measure superconducting qubits, the researchers use microwave pulses. The pulses are typically sent from dedicated sources and pulse generators through a network of cables connecting the qubits in the cryostat's cold environment to the room-temperature electronics. The network of cables required to access the qubits inside the cryostat is a complex infrastructure and, until recently, has presented a barrier to scaling the quantum computing architecture.
"All wire components in the quantum socket are specifically designed to operate at very low temperatures and perform well in the microwave range required to manipulate the qubits," said Matteo Mariantoni, a faculty member at IQC and the Department of Physics and Astronomy at Waterloo and senior author on the paper. "We have been able to use it to control superconducting devices, which is one of the many critical steps necessary for the development of extensible quantum computing technologies."
Researchers say improved understanding of hurricanes will improve our ability to forecast storms and their impacts
Many people know that tropical cyclones and hurricanes cause high winds and storm surges. But two of their other effects, heavy rainfall and inland flooding, can be just as dangerous and impact larger areas.
Most recently, inland rainfall produced by Hurricane Matthew has caused record flooding in North Carolina, with the levels of some already swollen rivers and streams continuing to rise.
According to the National Oceanic and Atmospheric Administration (NOAA), more than 50 percent of the deaths associated with hurricanes from 1970 to 2004 were caused by freshwater flooding. And from 1981 to 2011, hurricane damage accounted for almost half ($417.9 billion) of the total monetary damage from all weather and climate disasters in that period (adjusted for inflation to 2011 dollars).
With the goal of providing basic information to help improve preparedness and mitigation efforts, new University of Iowa-led research published online in September in the Journal of Hydrology examined how accurate current forecasting systems are in predicting rainfall from North Atlantic tropical cyclones that reach land in the United States.
Comparing five state-of-the-art weather prediction models, researchers found current models can forecast both where and how much rainfall a tropical cyclone will produce up to two days in advance. However, the forecasts' accuracy decreased significantly when the prediction window increased to five days. The researchers' findings were based on 15 North Atlantic hurricanes that came within 500 kilometers of the U.S. coastline from 2007 to 2012.
Gabriele Villarini, UI associate professor of civil and environmental engineering and corresponding author on the paper, says researchers homed in on predicting the impacts of tropical cyclones because that information is generally more useful than typical forecasts that predict how many storms are expected in a season.
"The more specific the information we can provide is, the more useful it will be. This is why we have been moving toward predicting U.S. landfalling tropical cyclone activity and the associated rainfall," he says.
Villarini, also an associate research engineer at the UI's renowned Iowa Flood Center, says while a 48-hour lead time is a good starting point in terms of warning, he will continue to conduct more research to improve these predictions.
"By improving our understanding of the processes that drive tropical cyclones and hurricanes, we will be better positioned to improve our ability to forecast these events and their impacts with longer and longer lead times," he says.
Gabriel Vecchi, head of the climate variations and predictability group at NOAA's Geophysical Fluid Dynamics Lab and another author on the paper, says decades of weather prediction data show that forecasts have improved--and will improve--as scientists learn more about hurricanes.
"We can't do anything about the past," he says. "The goal of this work is to provide better information in the future."
Vecchi, who has collaborated with Villarini on several research projects, says he values the expertise in flooding and hydrology that Villarini and the Iowa Flood Center bring to their partnership.
"This is one of these examples of interdisciplinary work that has been incredibly fruitful," he says.
The paper also was authored by Beda Luitel while he was a graduate student at the UI.
'Dressed qubit' maintains delicate superposition long enough to allow useful calculations
Australian engineers have created a new quantum bit which remains in a stable superposition for 10 times longer than previously achieved, dramatically expanding the time during which calculations could be performed in a future silicon quantum supercomputer.
The new quantum bit, made up of the spin of a single atom in silicon and merged with an electromagnetic field, known as a 'dressed qubit', retains quantum information for much longer than an 'undressed' atom, opening up new avenues to build and operate the superpowerful quantum computers of the future.
The result, from a team at Australia's University of New South Wales (UNSW), appears today in the online edition of the journal Nature Nanotechnology.
"We have created a new quantum bit where the spin of a single electron is merged together with a strong electromagnetic field," said Arne Laucht, a Research Fellow at the School of Electrical Engineering & Telecommunications at UNSW, and lead author of the paper. "This quantum bit is more versatile and more long-lived than the electron alone, and will allow us to build more reliable quantum computers."
Building a quantum computer has been called the 'space race of the 21st century' - a difficult and ambitious challenge with the potential to deliver revolutionary tools for tackling otherwise impossible calculations, such as the design of complex drugs and advanced materials, or the rapid search of massive, unsorted databases.
Its speed and power lie in the fact that quantum systems can host 'superpositions' of multiple different states, which a quantum computer treats as inputs that are all processed at the same time.
"The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations," said Andrea Morello, leader of the research team and a Program Manager in the Centre for Quantum Computation & Communication Technology (CQC2T) at UNSW.
"Our decade-long research program had already established the most long-lived quantum bit in the solid state, by encoding quantum information in the spin of a single phosphorus atom inside a silicon chip, placed in a static magnetic field," he said.
What Laucht and colleagues did was push this further: "We have now implemented a new way to encode the information: we have subjected the atom to a very strong, continuously oscillating electromagnetic field at microwave frequencies, and thus we have 'redefined' the quantum bit as the orientation of the spin with respect to the microwave field."
The results are striking: since the electromagnetic field steadily oscillates at a very high frequency, any noise or disturbance at a different frequency results in a zero net effect. The researchers achieved an improvement by a factor of 10 in the time span during which a quantum superposition can be preserved.
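The claim that noise at a different frequency yields a zero net effect can be illustrated numerically: the time average of a fast 'drive' oscillation multiplied by an off-frequency disturbance tends to zero over many cycles, while matched-frequency terms survive. The calculation below uses arbitrary frequencies purely as a toy illustration, not the experiment's parameters:

```python
import math

def time_average(f, t_end, n=100_000):
    # Left Riemann sum approximation of (1/T) * integral of f over [0, T].
    dt = t_end / n
    return sum(f(i * dt) for i in range(n)) * dt / t_end

drive_freq = 50.0  # arbitrary units; stands in for the fast microwave drive
noise_freq = 3.0   # a disturbance at some other frequency

def coupled(t):
    # Drive multiplied by off-frequency noise: oscillates and averages away.
    return math.cos(2 * math.pi * drive_freq * t) * math.cos(2 * math.pi * noise_freq * t)

def same(t):
    # Drive multiplied by itself: has a nonzero average (0.5).
    return math.cos(2 * math.pi * drive_freq * t) ** 2

print(time_average(coupled, 10.0))  # close to 0: off-frequency noise cancels
print(time_average(same, 10.0))     # close to 0.5: matched frequency survives
```

This is the same averaging effect, in miniature, that lets the continuously oscillating microwave field shield the dressed qubit from disturbances at other frequencies.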
Specifically, they measured a dephasing time of T2*=2.4 milliseconds - a result that is 10-fold better than the standard qubit, allowing many more operations to be performed within the time span during which the delicate quantum information is safely preserved.
"This new 'dressed qubit' can be controlled in a variety of ways that would be impractical with an 'undressed qubit'," added Morello. "For example, it can be controlled by simply modulating the frequency of the microwave field, just like in an FM radio. The 'undressed qubit' instead requires turning the amplitude of the control fields on and off, like an AM radio.
"In some sense, this is why the dressed qubit is more immune to noise: the quantum information is controlled by the frequency, which is rock-solid, whereas the amplitude can be more easily affected by external noise".
Since the device is built upon standard silicon technology, this result paves the way to the construction of powerful and reliable quantum processors based upon the same fabrication process already used for today's computers.
The UNSW team leads the world in developing quantum computing in silicon, and Morello's team is part of the consortium of UNSW researchers who have struck a A$70 million deal between UNSW, the researchers, business and the Australian government to develop a prototype silicon quantum integrated circuit - the first step in building the world's first quantum computer in silicon.
A functional quantum computer would allow massive increases in speed and efficiency for certain computing tasks - even when compared with today's fastest silicon-based 'classical' computers. In a number of key areas - such as searching large databases, solving complicated sets of equations, and modelling atomic systems such as biological molecules and drugs - they would far surpass today's computers. They would also be enormously useful in the finance and healthcare industries, and for government, security and defence organisations.
Quantum computers could identify and develop new medicines by greatly accelerating the computer-aided design of pharmaceutical compounds (and minimising lengthy trial-and-error testing), and could help develop new, lighter and stronger materials for everything from consumer electronics to aircraft. They would also make possible new types of computational applications and solutions that are beyond our ability to foresee.
A new sequencing technology, combined with a new supercomputer algorithm that can yield detailed information about complex genomes of various organisms, has been used to produce a high-quality draft genome sequence of cabernet sauvignon, the world's most popular red wine grape variety, reports a UC Davis genomics expert.
Success of the new genome assembly technique, which allows researchers to assemble large segments of an organism's DNA, also was demonstrated on the common research plant Arabidopsis thaliana and the coral mushroom (Clavicorona pyxidata). The findings will be reported Oct. 17 in the journal Nature Methods.
The three-pronged, proof-of-concept study used an open-source genome assembly process called FALCON-unzip, developed by Pacific Biosciences of Menlo Park, California. The study was led by Chen-Shan Chin, the firm's leading bioinformatician. Lead researcher on the cabernet sauvignon sequencing effort was Dario Cantu, a plant geneticist specializing in plant and microbial genomics in the UC Davis Department of Viticulture and Enology.
"For grapevine genomics, this new technology solves a problem that has limited the development of genomic resources for wine grape varieties," Cantu said. "It's like finally being able to uncork a wine bottle that we have wanted to drink for a long time.
"The new process provides rapid access to genetic information that cabernet sauvignon has inherited from both its parents, enabling us to identify genetic markers to use in breeding new vines with improved traits," he said.
The first genome sequence for the common grapevine, Vitis vinifera, was completed in 2007. Because it was based on a grapevine variety that was generated to simplify the genome assembly procedure, rather than a cultivated variety, that sequence lacks many of the genomic details that economically important wine grape varieties possess, Cantu said.
He noted that the new sequencing technology will enable his research group to conduct comparative studies between cabernet sauvignon and other historically and economically important wine grape varieties.
"This will help us understand what makes cabernet sauvignon cabernet sauvignon," he said.
Outmaneuvering climate change:
"The new genomic information that will be generated with this new genomics approach will accelerate the development of new disease-resistant wine grape varieties that produce high-quality, flavorful grapes and are better suited to environmental changes," Cantu said.
Warmer temperatures attributed to climate change are already being recorded in many prime grape-growing regions of the world. And in California, where the value of grape crops varies widely and is heavily influenced by local climate, it is especially important that new varieties be able to thrive despite warming temperatures.
"In a worsening climate, drought and heat stress will be particularly relevant for high-quality viticultural areas such as Napa and Sonoma," Cantu said.
Shedding light on a viticultural mystery:
The new sequencing effort may also answer some of the questions that have surrounded the ancestry of cabernet sauvignon for centuries, Cantu said.
"Having access to this genomic information is historically fascinating," Cantu said, noting that the cabernet sauvignon grape variety is thought to date back to at least the 17th century. He noted that in 1997 UC Davis plant geneticist Carole Meredith used DNA fingerprinting techniques to identify cabernet franc and sauvignon blanc as the two varieties that had crossed to produce cabernet sauvignon.
"Today, you can find cabernet sauvignon growing on every continent except Antarctica," Cantu said. "And because grape vines have been propagated by plant cuttings rather than grown from seed, all of the cabernet sauvignon vines are genetically identical, with the exception of some spontaneous, clonal mutations."
"Using this new genome sequencing process, we can now develop the genetic markers necessary to combine important traits into new varieties," Cantu said. "It's been 400 years since that was last done for cabernet sauvignon; we can do better than that."