Hyperlocal Forecasting

Scripps researcher adapts global climate model to improve regional predictions

 

Catalina eddy

A demonstration of what dynamical downscaling can achieve. The center figure is the coarse resolution analysis used to force the high resolution model. The left figure is the output from Kanamitsu's downscaling, which produces an eddy. The eddy is famous in Southern California for the very cloudy, cold weather it brings during May and June. The right figure is the regional scale analysis performed by the National Weather Service, which utilized local observations.

“You don't need a weatherman to know which way the wind blows,” Bob Dylan famously sang. But if you want to know how the wind will blow tomorrow, odds are you’re going to check the forecast.

Atmospheric prediction has improved immeasurably in the 45 years since Dylan sang “Subterranean Homesick Blues.” Whether you’re interested in tomorrow’s high or the global heat index a decade from now, forecasters can now make predictions with far greater accuracy.

The rise of powerful high-performance computers accounts for much of that improvement. Scientists isolate the factors that influence the weather (heat, radiation, the rotation of the Earth), translate them into mathematical formulae, and use supercomputers to forecast the atmosphere in all its multifarious complexity.

And yet, these forecasts are still painted with a fairly large brush. The global models — upon which all official predictions are based — have a resolution on the order of 100 kilometers (km) per grid-point. At that level, storms appear as undifferentiated blobs and towns in the mountains and the valley seem to experience identical weather.

“You can’t accurately examine how river flows have changed over the last 50 years, because one grid point may contain many rivers,” Masao Kanamitsu, a veteran of the atmospheric modeling world and a leading researcher at Scripps Institution of Oceanography, said.

A recent study of the Catalina Eddy performed by Kanamitsu. The figure shows the 3-hourly evolution of the eddy over two days. Kanamitsu discovered that the eddy disappears during 00Z and 03Z, something that had never been reported before because of the lack of high time-resolution observations. This kind of analysis is only possible using the dynamically downscaled analysis.

Kanamitsu was a teenager in Japan in the 1960s when he read about the first computer weather forecasts. He knew immediately that computational forecasting was what he wanted to do. He worked his way through the world’s most advanced weather research centers, first in Japan, then in Europe and most recently in the U.S. In 1994, he and his colleague Dr. Henry Juang published “The NMC Nested Regional Spectral Model,” describing the regional spectral model, an approach for narrowing the lens of forecasts; it is one of the most cited papers in the field.

In the early to mid-1990s, Kanamitsu used Cray supercomputers and Japan’s Earth Simulator to run climate simulations. Today, he uses the Ranger supercomputer at the Texas Advanced Computing Center (TACC), the second largest supercomputer on the National Science Foundation’s TeraGrid.

Kanamitsu and others in the atmospheric community use a process called downscaling to improve regional predictions. This technique takes output from the global climate model, which is unable to resolve important features like clouds and mountains, and adds information at scales smaller than the grid spacing.

“You’re given large scale, coarse resolution data, and you have to find a way to get the small scale detail. That’s downscaling,” he said.
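In code, the first step of that hand-off can be pictured as interpolating the coarse fields onto the finer regional grid. The Python sketch below is illustrative only (it is not Kanamitsu's regional spectral model, and the temperature field is made up): it refines a roughly 100 km analysis to roughly 10 km spacing, the forcing onto which the regional model's own physics must then add realistic small-scale detail.

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical coarse "global analysis" field (e.g., temperature in kelvin)
# on a ~100 km grid covering the regional domain: 10 x 10 points.
coarse_field = 288.0 + 5.0 * np.random.randn(10, 10)

# Refine by a factor of 10 to a ~10 km grid (100 x 100 points).
# Interpolation supplies only the smooth large-scale forcing; the regional
# model's physics (terrain, clouds, land surface) must generate the real
# small-scale detail that interpolation alone cannot.
fine_forcing = zoom(coarse_field, zoom=10, order=3)

print(coarse_field.shape, "->", fine_forcing.shape)  # (10, 10) -> (100, 100)
```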

Kanamitsu is using the method to create improved regional models for California, where small-scale weather patterns play a large role in dictating the environment of the state’s many microclimates. By integrating detailed information about the topography, vegetation, river flow, and other factors into the subgrid of California, Kanamitsu is achieving a resolution of 10 kilometers (km) with hourly predictions, using a dynamical method. This means that the physics of nature are included in the simulations — rain falls, winds blow — in contrast to statistical methods that use observational data in lieu of atmospheric physics.
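A rough back-of-the-envelope count (the area figure below is approximate) shows how much more spatial detail the 10 km grid carries over the state than the roughly 100 km global grid:

```python
# Approximate number of grid columns needed to cover California
# (~424,000 square kilometers) at the two horizontal spacings.
area_km2 = 424_000

cols_100km = area_km2 / (100 * 100)  # ~42 columns: mountains and valleys blur together
cols_10km = area_km2 / (10 * 10)     # ~4,240 columns: individual ranges and basins appear

print(f"{cols_100km:.0f} columns at 100 km vs {cols_10km:.0f} at 10 km "
      f"({cols_10km / cols_100km:.0f}x more points per model level)")
```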

“We’re finding that downscaling works very well, which is amazing because it doesn’t use any small-scale observation data,” Kanamitsu said. “You just feed the large-scale information from the boundaries and you get small-scale features that match very closely with observations.”

The tricky part involves representing the points where the forces outside California (the global model) interact with the forces inside the region (LA smog, for instance). These boundary interactions are integral to successful modeling and mathematically difficult to resolve. Recent papers by Kanamitsu and his collaborators describe a nudging technique used to reduce the difference between the inner and outer conditions.
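The published papers give the full formulation; one common flavor of the idea is Newtonian relaxation, in which the regional state is pulled gently back toward the coarse driving field at each time step. The sketch below is a hypothetical illustration of that relaxation term, not the scheme from Kanamitsu's papers, and the field values and time scale are invented.

```python
import numpy as np

def nudge(regional_field, driving_field, dt, tau):
    """Relax the regional model state toward the coarse driving field.

    dt  -- model time step in seconds
    tau -- relaxation time scale in seconds (larger tau means gentler nudging)
    """
    # Newtonian relaxation adds a tendency of -(x - x_large) / tau, which
    # damps drift between the inner (regional) and outer (global) solutions.
    return regional_field - (regional_field - driving_field) * (dt / tau)

# Hypothetical 10 km wind field that has drifted above the coarse driving analysis.
regional = np.full((100, 100), 6.0)   # m/s
driving = np.full((100, 100), 5.0)    # m/s
nudged = nudge(regional, driving, dt=600.0, tau=6 * 3600.0)
print(nudged[0, 0])                   # pulled slightly back toward 5.0
```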

Kanamitsu is also tackling the problem of connecting atmospheric conditions with the ocean, which borders much of California but is typically represented only by very coarse resolution ocean simulations.

“Along the coast of California, there’s a cold ocean that interacts with the atmosphere at very small scales,” he said. “We’re trying to simulate the ocean current and temperature in a high resolution ocean model, coupled with a high resolution atmospheric model, to find out what the impact of these small scale ocean states are.”
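Structurally, a coupled run like that comes down to the two models trading fields as they march forward in time: the atmosphere sees the ocean's sea surface temperature, and the ocean sees the atmosphere's wind stress. The toy classes below are invented stand-ins, not the Scripps models, but they show the shape of the exchange loop.

```python
class ToyAtmosphere:
    """Illustrative stand-in for a high-resolution atmospheric model."""
    def __init__(self):
        self.air_temp = 290.0              # near-surface air temperature (K)

    def step(self, sst):
        # Cold coastal water cools the overlying air a little each step.
        self.air_temp += 0.1 * (sst - self.air_temp)

    def surface_wind_stress(self):
        return 0.05                        # N/m^2, held fixed in this sketch


class ToyOcean:
    """Illustrative stand-in for a high-resolution ocean model."""
    def __init__(self):
        self.sst = 285.0                   # sea surface temperature (K)

    def step(self, wind_stress):
        # Wind-driven mixing nudges the SST in this toy example.
        self.sst += 0.01 * wind_stress

    def surface_temperature(self):
        return self.sst


atmos, ocean = ToyAtmosphere(), ToyOcean()
for _ in range(24):                        # exchange fields every coupled step
    atmos.step(sst=ocean.surface_temperature())
    ocean.step(wind_stress=atmos.surface_wind_stress())
print(round(atmos.air_temp, 2), round(ocean.sst, 3))
```

In a real coupled system a separate coupler component also regrids the exchanged fields, since the two models' grids and coastlines differ.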

Understanding Weather by O.G. Sutton, published in 1960, was an early inspiration for Kanamitsu.

Coupling these models requires very powerful, tightly connected computing systems like Ranger at TACC, one of the most efficient machines for producing long historical downscalings in a short period of time. Kanamitsu's simulations improve upon those currently in use by the National Weather Service, and the work led to 10 papers on the subject in 2010.

Researchers have already begun applying the downscaling to fish population studies, river flow changes, and wind energy applications.

“Kanamitsu's model simulations have enabled a much better resolved picture of mesoscale processes affecting wind flow and precipitation in the contemporary, historical period in California,” noted Scripps hydrometeorologist Daniel Cayan, one of Kanamitsu’s colleagues. “They also provide insight into regional scale climate changes that may occur over the State.”

 

Even Kanamitsu’s 10 km resolution models aren’t detailed enough to answer some important questions.

Masao Kanamitsu, a researcher at Scripps Institution of Oceanography.

“Applications for wind power, we found out, require very high resolution models,” Kanamitsu said. “The wind is very local, so the ultimate goal is one kilometer resolution.”

The time-scale resolution is an important factor as well. Severe weather events like tornadoes happen fast, and achieving shorter time-scale forecasts is another of Kanamitsu’s goals.
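One reason finer grids go hand in hand with shorter time steps is numerical stability: explicit schemes have to keep the Courant number (signal speed times time step divided by grid spacing) below roughly one, so shrinking the grid spacing shrinks the allowable time step with it. The following is a simplified illustration, assuming a fast signal speed of about 100 m/s; real models differ in the details, but the trend holds.

```python
# CFL-style estimate of the largest stable time step for an explicit scheme.
def max_stable_dt(dx_m, signal_speed=100.0, courant=1.0):
    """Largest stable time step (seconds) for grid spacing dx_m in meters,
    assuming the fastest resolved signal moves at signal_speed (m/s)."""
    return courant * dx_m / signal_speed

for dx_km in (100, 10, 1):
    print(f"{dx_km:>3} km grid -> time step <= {max_stable_dt(dx_km * 1000):.0f} s")
```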

The list of factors that could be added to the models is endless, but over the course of his long career, Kanamitsu has witnessed how improved computer modeling has changed his field.

“Thirty years ago, I was one of the forecasters,” he recalled. “There was a forecaster’s meeting every day and we took our computer model results to the meeting. Back then, the forecaster in charge didn’t look at or believe in our results. But now, forecasters believe in the models so much that some people think they’re losing their skill.”

As scientists seek to determine the local impact of global climate change and address the issue, an accurate historical record and sophisticated regional forecasts like those facilitated by Kanamitsu’s work are increasingly crucial.