A supercomputer model that aims to provide physical information on the Bangladesh delta to policy makers there has received the 'impact' award from the national supercomputing facility, ARCHER. This is the first model to link the open ocean with the limit of tidal interaction in Bangladesh, and it was produced by National Oceanography Centre (NOC) scientist Dr Lucy Bricheno using the ARCHER facility.

This model predicts changes in tidal water level and salinity in the delta region, and could be used to make decisions about how to manage the physical environment, such as where to take irrigation water from and what crops to grow. 

An Animated Map of the Bangladesh Delta

Future scenarios forecast by the model show the tidal range increasing by up to half a metre in places, which could see a large area of the delta flooded during high tide, affecting farmland and the Sundarbans mangrove forests (a UNESCO World Heritage Site).

Dr Bricheno said: "I am really pleased that this award has recognised the important potential social and economic impacts of this model... This region is home to large numbers of people whose wellbeing is critically dependent on the land, and who are therefore vulnerable to changes in the physical environment. By providing high-quality evidence and forecasts, the outputs of the model could really help policy makers to make more informed decisions about how best to manage that environment.

"What was interesting about the tidal change evident in the model is that it had a complex spatial pattern, not just rising everywhere. This is important because it wouldn't be captured in a coarse ocean model; we need to simulate the whole delta."

The model also showed that, in general, the west of the delta and the Sundarbans mangrove forest became saltier, particularly during the dry season, which has important implications for the health of the forest and of any crops planted there. The western part of the delta is also home to some of the poorest farmers in Bangladesh, and it provides habitat for the Royal Bengal tiger.

The model uses and produces high-resolution 3-D maps of the delta, its rivers, and the Indian Ocean. Bathymetry data is used in conjunction with information describing river discharge, ocean tides, water temperature and salinity from other models. However, since the model is complex and of high resolution (containing around 2 million nodes), the run-time is around one week per year of data. The large computing power offered by the ARCHER national supercomputing facility is therefore required.
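To put that cost in perspective, here is a back-of-envelope sketch using only the figures quoted above (roughly one week of wall-clock time per simulated year); the scenario lengths are illustrative:

```python
# Back-of-envelope run-time arithmetic from the figures quoted above:
# ~2 million grid nodes and ~1 week of wall-clock time per simulated
# year. The scenario lengths below are purely illustrative.

WEEKS_PER_SIMULATED_YEAR = 1.0   # reported cost on ARCHER

def wall_clock_weeks(simulated_years: float) -> float:
    """Estimated wall-clock weeks for a run of the delta model."""
    return simulated_years * WEEKS_PER_SIMULATED_YEAR

for years in (1, 5, 20):
    print(f"{years}-year scenario: ~{wall_clock_weeks(years):.0f} week(s) of compute")
```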

Dr Judith Wolf, who is leading the NOC's contribution to this project, called ESPA Deltas, said: "This international collaborative research is pushing oceanography into new areas, working further inland than ever (to reach the limit of tidal penetration), and leading us to collaborations with human geographers and social scientists."

ESPA Deltas investigates how the bio-physical environment of the delta impacts on human population and livelihood for some of the poorest people on the planet. The NOC's role within this international multidisciplinary project is to contribute world-leading expertise in hydrodynamic modelling.

The following animations show coupled weather-fire behavior model simulations of the growth of wildfires. They were simulated using the CAWFE (Coupled Atmosphere-Wildland Fire Environment) modeling system.

The state of Colorado is turning to the National Center for Atmospheric Research (NCAR) to establish the country’s most advanced system for predicting wildland fire behavior, including where and how quickly the blazes will spread.

Developed in response to legislation that Gov. John Hickenlooper signed in May, the new agreement finalized this month creates an innovative research and development partnership to generate real-time, 18-hour forecasts of active wildfires in the state. NCAR will work with the Colorado Division of Fire Prevention and Control’s new Center of Excellence for Advanced Technology Aerial Firefighting in Rifle to design and develop the system and begin testing it as early as next year. 

“This technology represents the next generation of wildland fire prediction,” said NCAR science manager William Mahoney, who worked with state officials on developing the new agreement. “It will capture some of the critical feedbacks between large fires and the local weather, which often result in extreme fire behaviors that threaten lives and property. Colorado is using homegrown technology to lead the nation in wildland fire prediction.”

The experimental forecast products will draw on powerful NCAR supercomputer simulations and newly available satellite measurements of fires, obtained with a technique developed at the University of Maryland. They will also incorporate observations from Colorado’s Multi-Mission Aircraft.  

Yarnell Hill Fire

The Division of Fire Prevention and Control’s Center of Excellence is “excited to be working with NCAR to develop this stakeholder-driven technology,” said Center of Excellence Director Melissa Lineberger.

She added that the technology will be particularly valuable to Colorado because it is being developed with stakeholder input and firefighters’ needs in mind.

The system will provide unprecedented detail about interactions between weather and fire, which can create dangers for firefighters on the ground as well as for firefighting aircraft. It will build on a specialized supercomputer model that was developed at NCAR with support from the National Science Foundation, NASA, and the Federal Emergency Management Agency.

Once the system is fully developed and operational, it will be run by the Colorado Division of Fire Prevention and Control.

Tackling a major threat

Wildland fires are highly damaging in Colorado, taking the lives of firefighters and local residents, devastating large areas, and causing hundreds of millions of dollars in damage. Insurance claims from a single blaze, the 2012 Waldo Canyon Fire, totaled more than $450 million.

To better protect Colorado, state Rep. Tracy Kraft-Tharp (D-Arvada) and state Sen. Ellen Roberts (R-Durango) sponsored legislation earlier this year to fund development of the forecasting system.

“This is a revolutionary early-warning system that will better safeguard all of us for years to come,” Kraft-Tharp said.

The lessons learned from the Colorado system are expected to yield benefits for fighting wildfires across the western United States.

Capturing fire weather

Despite the lives at risk and the economic costs at stake, the techniques currently available for anticipating fire behavior remain similar to those of past decades. Typically, firefighters infer how fast the edge of a fire will expand based on terrain, fuels, and a measurement or estimate of the winds. But this approach cannot capture changes associated with the interaction of fire and weather.
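That traditional approach can be caricatured in a few lines. The sketch below is purely illustrative: the coefficients are invented and do not correspond to any operational fire-behavior formula. Its point is that the wind is a fixed input, so nothing in the calculation can respond to the fire itself.

```python
# Deliberately simplified caricature of the traditional approach:
# spread rate is a fixed function of fuel, slope, and an ambient wind
# estimate. All coefficients here are made up for illustration.

def static_spread_rate(fuel_rate: float, wind_ms: float, slope_deg: float) -> float:
    """Rate of spread (m/min) from static inputs only.

    fuel_rate : baseline spread rate for the fuel type (m/min)
    wind_ms   : ambient wind speed measured outside the fire (m/s)
    slope_deg : terrain slope in the direction of spread (degrees)
    """
    wind_factor = 1.0 + 0.5 * wind_ms        # hypothetical wind response
    slope_factor = 1.0 + 0.02 * slope_deg    # hypothetical slope response
    return fuel_rate * wind_factor * slope_factor

# The ambient wind never changes during the forecast, so this estimate
# cannot react when the fire starts generating winds far stronger than
# ambient -- exactly the feedback a coupled model is built to capture.
print(static_spread_rate(fuel_rate=2.0, wind_ms=5.0, slope_deg=10.0))
```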

To accurately forecast a wildland fire in detail, a supercomputer model has to simulate highly localized winds that drive the flames. Adding to the complexity, a major blaze alters local weather, creating winds within the fire that may be more than 10 times stronger than those outside. These internal winds can contribute to potentially deadly accelerations, increases in intensity, unexpected shifts in direction, or splits in which the flames go in multiple directions.

This interplay between fire and weather is particularly pronounced in Colorado and other western states, where clouds produce strong outflows and winds can rush down mountainsides and vary from one valley to the next.

To tackle this problem, the Colorado forecasting system will use a breakthrough supercomputer model developed by NCAR scientist Janice Coen, who has studied wildland fires for more than 20 years. NCAR's CAWFE (Coupled Atmosphere-Wildland Fire Environment) modeling system combines weather prediction with fire behavior simulations to capture the highly complex interplay of fire and weather.

By restarting the model every few hours with the latest satellite and aircraft observations of an active fire—a process known as cycling—Coen and her research partner, University of Maryland professor Wilfrid Schroeder, have shown that it is possible to accurately predict the course of a blaze over the next one to two days. They can keep refreshing the model, making it possible to simulate the entire lifetime of even a very long-lived fire, from ignition to extinction.
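In outline, the cycling workflow looks something like the loop below. The function names, intervals, and data structures are hypothetical stand-ins; the article does not describe CAWFE's actual interfaces.

```python
from datetime import datetime, timedelta

CYCLE_HOURS = 6       # assumed restart interval ("every few hours")
FORECAST_HOURS = 18   # the 18-hour forecasts described in the article

# --- Stubs standing in for real data feeds and the coupled model ---
def fetch_fire_observations(t):
    """Placeholder for the latest satellite/aircraft fire extent."""
    return {"time": t, "perimeter": "latest observed extent"}

def run_coupled_model(obs, hours):
    """Placeholder for a coupled weather-fire integration."""
    return {"valid_until": obs["time"] + timedelta(hours=hours)}

# --- The cycling loop itself ---
t = datetime(2016, 6, 1, 0, 0)      # hypothetical ignition time
end = t + timedelta(days=2)         # run until the fire is declared out

while t < end:
    obs = fetch_fire_observations(t)                    # restart from observed extent
    forecast = run_coupled_model(obs, hours=FORECAST_HOURS)
    print(t, "->", forecast["valid_until"])
    t += timedelta(hours=CYCLE_HOURS)                   # refresh with new observations
```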

“Even though fires are complex and rapidly changing and often described as unpredictable, much of a fire event can be foreseen by this more sophisticated model,” Coen said.

Rubbish dumped at sea off Townsville will end up on the popular Mission Beach holiday spot, while Cairns' marine trash goes straight to the exclusive Port Douglas resort -- according to new supercomputer modeling by a James Cook University scientist. 

JCU's Kay Critchell fed local wind and tide data into the state-of-the-art SLIM modeling system. She then tracked drift patterns for an average-sized plastic water bottle that found its way into Townsville's Ross River or Cairns' Trinity Inlet, or was dumped at sea along the Great Barrier Reef. 
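Drift calculations of this kind can be pictured as a simple Lagrangian update, in which each floating particle moves with the tidal current plus a fraction of the wind. The sketch below is a generic illustration, not SLIM's actual numerics, and the 3% windage coefficient is a common rule of thumb for floating debris rather than a value from the study.

```python
import numpy as np

WINDAGE = 0.03   # fraction of wind speed felt by a floating bottle (rule of thumb)
DT = 600.0       # time step in seconds (10 minutes)

def step(pos, current, wind):
    """Advance particle position (metres, local x/y) by one time step."""
    velocity = current + WINDAGE * wind
    return pos + velocity * DT

pos = np.array([0.0, 0.0])           # released at a river mouth
current = np.array([0.10, 0.05])     # tidal current (m/s), illustrative
wind = np.array([2.0, 4.0])          # onshore wind (m/s), illustrative

for _ in range(144):                 # one day of 10-minute steps
    pos = step(pos, current, wind)
print(f"Drift after one day: {pos / 1000} km")
```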

Rubbish from the Ross River washed ashore in the northern beachside suburb of Pallarenda, while plastic from Trinity Inlet headed for Port Douglas. The model showed plastic debris from a shipping lane off Townsville's Magnetic Island would land on the popular Mission Beach, about halfway between Cairns and Townsville. 

Ms Critchell said the findings were consistent. "For floating plastic the big driver was the wind. The main collection points were south or south-east facing beaches and those in close proximity to a river mouth." 

She said with limited resources available to beach clean-up crews, it's important their activities are targeted. "According to this study, the best use of their time would be to patrol beaches facing south or south-east after a big high-tide or storm." 

She said there were major differences in how far waste travelled depending on whether it entered the ocean from rivers or from shipping lanes. "The average distance travelled from a river mouth is 18.8 kilometres; from shipping sources it's 225 kilometres."

Ms Critchell said that while the Ross River was not the Ganges, it was not in a terribly good state either. "I spent Friday with a group on the river bank along the shallows and we filled a truck with rubbish from the river in five hours. And there was plenty we couldn't get."

She said the main thing to remember was that environments can be restored.

"We can use things like rubbish collection booms in the shallows that trap rubbish but have a low-impact on marine life, we can use waterwheels that scoop plastic waste out of the rivers, but these things take effort and are expensive. 

"What is most important is that the rubbish not get into the environment in the first place. It really comes down to personal responsibility -- people disposing of their rubbish properly. It's a huge and growing issue, but it's not hopeless." 

The next phase of the study will examine what happens to debris when it's washed out to sea again from its original destination beach.

The researchers tested how well they were able to predict winter sea ice changes by "hindcasting" past decades and then comparing their retrospective predictions to observations of what really happened. This image shows how the model stacked up to real life for the period of 1997–2007.

Climate scientists at the National Center for Atmospheric Research (NCAR) present evidence in a new study that they can predict whether the Arctic sea ice that forms in the winter will grow, shrink, or hold its own over the next several years.

The team of scientists has found that changes in the North Atlantic ocean circulation could allow overall winter sea ice extent to remain steady in the near future, with continued loss in some regions balanced by possible growth in others, including in the Barents Sea.

"We know that over the long term, winter sea ice will continue to retreat," said NCAR scientist Stephen Yeager, lead author of the study published online today in the journal Geophysical Research Letters. "But we are predicting that the rate will taper off for several years in the future before resuming. We are not implying some kind of recovery from the effects of human-caused global warming; it's really just a slow down in winter sea ice loss."

The research was funded largely by the National Science Foundation, NCAR's sponsor, with additional support from the National Oceanic and Atmospheric Administration and the U.S. Department of Energy.

Yeager is among a growing number of scientists trying to predict how the climate may change over a few years to a few decades, instead of the more typical span of many decades or even centuries. This type of "decadal prediction" provides information over a timeframe that is useful for policy makers, regional stakeholders, and others.

Decadal prediction relies on the idea that some natural variations in the climate system, such as changes in the strength of ocean currents, unfold predictably over several years. At times, their impacts can overwhelm the general warming trend caused by greenhouse gases released into the atmosphere by humans.

Yeager's past work in this area has focused on decadal prediction of sea surface temperatures. A number of recent studies linking changes in the North Atlantic ocean circulation to sea ice extent led Yeager to think that it would also be possible to make decadal predictions for Arctic winter sea ice cover using the NCAR-based Community Earth System Model.

Linking ocean circulation and sea ice

The key is accurately representing the Atlantic Meridional Overturning Circulation (AMOC) in the model. AMOC sweeps warm surface waters from the tropics toward the North Atlantic, where they cool and then sink before making a return south in deep ocean currents.

AMOC can vary in intensity. When it's strong, more warm water is carried farther toward the North Atlantic and Arctic oceans, accelerating sea ice loss. When weak, the warm water largely stays farther south, and its effects on sea ice are reversed. The variations in AMOC's vigor—from weak to strong or vice versa—occur over multiple years to decades, giving scientists some ability to predict in advance how it will affect winter sea ice, in particular. 

AMOC now appears to be weakening. Yeager and his co-authors, NCAR scientists Alicia Karspeck and Gokhan Danabasoglu, found in their new study that this change in the ocean is likely to be enough to temporarily mask the impacts of human-caused climate change and stall the associated downward trend in winter sea ice extent in the Arctic, especially on the Atlantic side, where AMOC has the most influence.

The limits of a short satellite record

The amount of sea ice covering the Arctic typically grows to its maximum in late February after the long, dark winter. The sea ice minimum typically occurs at the end of the summer season, in late September. The new study addresses only winter sea ice, which is less vulnerable than summer ice to variations in weather activity that cannot be predicted years in advance, such as storms capable of breaking up the ice crust.

Despite their success incorporating AMOC conditions into winter sea ice "hindcasts," the scientists are cautious about their predictions of future conditions. Because satellite images of sea ice extend only back to 1979, the scientists had a relatively short data record for verifying decadal-scale predictions against actual conditions. Additionally, AMOC itself has been measured directly only since 2004, though observations of other variables that are thought to change in tandem with AMOC—such as sea surface height and ocean density in the Labrador Sea, as well as sea surface temperature in the far North Atlantic—go back much farther.
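Verifying a hindcast of this sort ultimately comes down to correlating retrospective predictions with observations over the satellite era. A minimal sketch follows, with synthetic numbers standing in for the real hindcast and observed ice-extent series:

```python
import numpy as np

# Minimal hindcast verification sketch: correlate retrospective
# predictions of winter sea ice extent with observations. The arrays
# below are synthetic stand-ins; the real satellite record (post-1979)
# is short, which is exactly the limitation described above.

rng = np.random.default_rng(0)
years = np.arange(1979, 2008)
trend = -0.05 * (years - 1979)                           # illustrative decline
observed = 15.0 + trend + 0.3 * rng.standard_normal(years.size)
hindcast = 15.0 + trend + 0.3 * rng.standard_normal(years.size)

# Pearson correlation between predicted and observed extents:
skill = np.corrcoef(hindcast, observed)[0, 1]
print(f"Hindcast correlation skill over {years.size} winters: {skill:.2f}")
# With only ~30 data points, sampling uncertainty on this statistic is
# large -- hence the authors' caution about relying on statistics alone.
```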

"The sea ice record is so short that it's difficult to use statistics alone to build confidence in our predictions," Yeager said. "Much of our confidence stems from the fact that our model does well at predicting slow changes in ocean heat transport and sea surface temperature in the subpolar North Atlantic, and these appear to impact the rate of sea ice loss. So, we think that we understand the mechanisms underpinning our sea ice prediction skill."

It's 'Snowing like CRAZY up in here' and other tweets can boost computer models that guide traffic

That's the crux of a University at Buffalo study which examined how weather-related tweets can be analyzed to bolster supercomputer models that, among other things, recommend safe driving speeds and which roads motorists should avoid during inclement weather.

"It doesn't matter if someone tweets about how beautiful the snow is or if they're complaining about unplowed roads. Twitter users provide an unparalleled amount of hyperlocal data that we can use to improve our ability to direct traffic during snowstorms and adverse weather," said Adel Sadek, PhD, director of UB's Institute for Sustainable Transportation and Logistics, and the study's lead author.

Co-authors of the study, which was published in October in the journal Transportation Research Record, include Qing He, PhD, The Stephen Still Assistant Professor in Transportation Engineering and Logistics at UB; Jing Gao, PhD, assistant professor in the Department of Computer Science and Engineering at UB; Ming Ni, a PhD candidate at UB; and Lei Lin, who earned a PhD from UB in 2015.

Traffic planners rely on models that analyze vehicular data from cameras and sensors, as well as weather data from nearby weather stations.

The approach works; however, its accuracy is limited because traffic and weather observations do not provide information on road surface conditions. For example, the model does not account for ice that lingers after a storm, or for roads that snowplows have already cleared.

Twitter can help address this limitation because its users often tweet about the weather and road surface conditions, and many opt to share their location via GPS.

The study examined more than 360,000 tweets in the Buffalo Niagara region from 19 days in December 2013. Researchers identified roughly 3,000 relevant tweets by tagging keywords such as "snow" and "melt." 

Next, they refined the data via a method they call Twitter Weather Events Observation, which classifies tweets into two categories. The first example below is a "weather utterance"; the second is a "weather report."

* The roads are a hot mess out in the burbs all over. Snowing like CRAZY up in here ... drive safe everyone.

* #BuffaloNY #Weather #Outside. #Cold #Snowing #Windy. @Parkside Candy http://t.co/IfyzICtGPW

Once the number of such tweets reaches a threshold for a given time period, they are counted as a "Twitter weather event." Researchers tested the reliability of these events through metrics designed to eliminate tweets that do not match actual weather. Because the tweets contain geographic coordinates, researchers were able to map the exact locations where the inclement weather was reported.
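In outline, turning raw tweets into "Twitter weather events" amounts to keyword filtering followed by thresholded counting per time window. The sketch below is a simplified illustration; the keyword list and threshold are invented, not the study's actual parameters.

```python
from collections import Counter
from datetime import datetime

KEYWORDS = {"snow", "snowing", "melt", "ice", "slush"}   # illustrative list
EVENT_THRESHOLD = 50   # weather tweets per hour needed to declare an event

def is_weather_tweet(text: str) -> bool:
    """Crude keyword match after stripping hashtags and punctuation."""
    words = {w.strip("#.,!").lower() for w in text.split()}
    return bool(words & KEYWORDS)

def twitter_weather_events(tweets):
    """tweets: iterable of (datetime, text). Returns hours flagged as events."""
    counts = Counter(t.replace(minute=0, second=0, microsecond=0)
                     for t, text in tweets if is_weather_tweet(text))
    return {hour for hour, n in counts.items() if n >= EVENT_THRESHOLD}

sample = [(datetime(2013, 12, 10, 14, 5), "Snowing like CRAZY up in here"),
          (datetime(2013, 12, 10, 14, 30), "#BuffaloNY #Snowing #Windy")]
print(twitter_weather_events(sample))   # empty set: below the 50-tweet threshold
```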

Next, they looked at the timing of the tweets and saw a pattern. When snow falls, the number of weather-related tweets increases, the average motor vehicle speed drops and traffic volumes slowly decrease.

Researchers then inserted the Twitter data into a model containing traffic and weather information, and found that incorporating it improved the model's accuracy. In particular, researchers found Twitter data to be more effective during the day (when more people tweet) and where the population is larger (in the study's case, Buffalo has roughly five times more people than Niagara Falls, New York).
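One simple way to picture that incorporation is as feature augmentation: the hourly count of weather tweets enters a speed model alongside conventional weather predictors. The sketch below uses synthetic data and ordinary least squares purely for illustration; the study's actual model form is not specified in the article.

```python
import numpy as np

# Synthetic demonstration of feature augmentation: fit traffic speed
# with and without a Twitter feature. All data here are made up.

rng = np.random.default_rng(1)
n = 200
snowfall = rng.uniform(0, 5, n)                       # cm/hr from weather stations
tweet_rate = 20 * snowfall + rng.normal(0, 10, n)     # weather tweets per hour
speed = 100 - 8 * snowfall - 0.1 * tweet_rate + rng.normal(0, 3, n)

# Compare speed ~ snowfall against speed ~ snowfall + tweet_rate
X1 = np.column_stack([np.ones(n), snowfall])
X2 = np.column_stack([np.ones(n), snowfall, tweet_rate])
for name, X in [("weather only", X1), ("weather + tweets", X2)]:
    beta, res, *_ = np.linalg.lstsq(X, speed, rcond=None)
    print(f"{name}: residual sum of squares = {res[0]:.0f}")
```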

More precise models can usher in a host of improvements to freeways during inclement weather. For example, they will enable traffic planners to recommend safer driving speeds, identify which roads need to be cleared of snow or avoided, and give motorists better estimates of arrival times.

Researchers plan to continue improving their model by acquiring additional Twitter data for longer periods of time and at different locations.
