The science of weather

By RP Siegel

According to the UN, natural disasters have killed more than 600,000 people and caused trillions of dollars in damage over the last twenty years. Thanks to advances in technology, the death toll has been declining, hitting an all-time low in the U.S. in 2014. While disasters and deaths often make the headlines, most people are unaware of the numerous catastrophes that have been avoided thanks to advance warning made possible by weather forecasting. RP Siegel looks at how the science of weather forecasting has changed over the years thanks to innovation and technology, and the dramatic impact it will continue to have on people’s lives and property…

(Cover photo by www.newsweek.com)

While natural disasters and death tolls often make the headlines, most people are unaware of the numerous catastrophes that have been avoided thanks to advance warning and weather forecasting. According to the World Bank, investments in meteorological agencies could save as many as 23,000 lives each year, along with delivering $30 billion in economic benefits.

People have been observing, measuring and trying to predict the weather since the 1400s. According to the NOAA History page, rain gauges, barometers, anemometers, hygrometers and thermometers all appeared between 1400 and 1700. But modern weather forecasting truly began only with the invention of the telegraph in the 1840s, which allowed weather information to be shared quickly across wide areas so that patterns could be discerned and approaching storms detected. The first regularly published forecasts by the National Weather Service began in 1871.

Back in 1904, the Norwegian physicist Vilhelm Bjerknes first proposed using the principles of physics to predict the weather. It was Bjerknes who laid out the conditions required for forecasting. First, “the condition of the atmosphere must be known at a specific time with sufficient accuracy.” Second, “the laws must be known, which determine the development of one weather condition from another.” The idea was there, but the size of the calculation proved unworkable. It was only after WWII that computing pioneers such as John von Neumann made the case that computers would be able to predict weather patterns.

Meanwhile, radar technology, which was developing over the same period, showed promise. However, it was costly and shrouded in military secrecy. The first defense radars put to use for weather observation were installed on both coasts of Panama in 1943.

(Image by www.bloomberg.com)

After the war, storm spotter networks sprang up in Texas and the Midwest, raising the question of whether radar could be used to identify tornadoes. After a number of damaging hurricanes hit the East Coast in the mid-50s, and three National Weather Service radar installations were successfully used to provide early warnings, interest in weather radar surged. But it wasn’t until the 1960s, with the advent of solid-state electronics, that television stations first installed their own weather radar systems. The trend broadened in the 1970s, moving from local radar to remotely operated radar stations that gave a far more complete view of approaching weather.

Today, the practice of weather forecasting continues to evolve along the path first laid out by Bjerknes. There is the engineering of systems that collect and transmit detailed data, instantaneously capturing weather conditions across a wide area of interest, and there is the science of interpreting that data, by both meteorologists and computers, using the laws of physics to forecast how one set of conditions will give rise to the next. A third element in the equation is how much advance warning an area needs in order to prepare for a catastrophic weather event. The combination of these three determines the “weather readiness” of an area, an increasingly critical factor at a time of growing storm intensity. According to UN data, more than 600,000 people have died in catastrophic storms in the past twenty years.

In a situation where every minute can mean the difference between life and death, the best technology available in 2012 could provide a direct warning of a tornado strike an average of 14 minutes ahead of time. Researchers at the National Weather Service are optimistic that within ten years, that lead time will be extended to one hour.

Early radar signals reflected off particles in the atmosphere, and the returning echoes could reveal the location and intensity of precipitation. The development of Doppler radar in the late 80s added the ability to determine the direction and speed with which precipitation is moving, a major improvement.
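To get a feel for what the Doppler measurement adds, the underlying relationship is simple: the frequency shift of the returned echo is proportional to how fast the precipitation is moving toward or away from the antenna. The sketch below works through that relationship in Python; the wavelength and frequency-shift values are illustrative assumptions, not figures for any particular radar system.

```python
# A minimal sketch of the Doppler relationship weather radars exploit:
# the two-way frequency shift of the returned echo is proportional to the
# radial velocity of the precipitation. Numbers are illustrative only.

RADAR_WAVELENGTH_M = 0.10   # ~10 cm, roughly S-band; an assumed value


def radial_velocity(doppler_shift_hz: float,
                    wavelength_m: float = RADAR_WAVELENGTH_M) -> float:
    """Radial velocity (m/s) implied by the two-way Doppler shift of an echo."""
    return doppler_shift_hz * wavelength_m / 2.0


if __name__ == "__main__":
    # An echo shifted by 500 Hz corresponds to rain moving ~25 m/s
    # toward (or away from) the radar.
    print(f"{radial_velocity(500.0):.1f} m/s")
```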

US meteorology students with weather balloon, 1953

Dual polarization gave forecasters the ability to differentiate between raindrops and hail. First deployed in 2011, it can also be used to identify flying debris in a hurricane. These radar systems can also more accurately assess precipitation rates, which can be very helpful in predicting floods.

Because time is such a critical factor, radar researchers are also looking at ways to speed things up. Conventional Doppler radar scans the sky in pie-shaped increments, advancing only when each scan is complete; a full scan takes anywhere from four to six minutes to update the previous one. Phased-array radar, long used in defense applications, sends out multiple beams at once, reducing scan time to less than a minute. This improvement could increase warning times to as much as 18 minutes. Smart, or adaptive, scanning algorithms could reduce scan times even further.

Other areas where improvements have been made in addressing Bjerknes’ first condition are weather balloons, which sample conditions and wind speed (NOAA launches over 200 of these per day), and dropsondes, small sampling devices containing similar instrumentation that are dropped from airplanes. Both “phone in” their readings by radio transmission.

Then, of course, there are satellites. Over 90 percent of the more than 210 million observations fed into weather forecasting computers each day come from satellites. There are two basic types: geostationary and polar-orbiting. Geostationary satellites remain in a fixed position some 22,000 miles (roughly 35,400 kilometers) above the Earth’s surface and transmit nearly continuous images of the section of the Earth they are watching. Polar-orbiting satellites are in low Earth orbit (LEO), about 515 miles (829 kilometers) up. An integrated network of these can provide a complete view of the Earth’s weather, updated every 12 hours.
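As a rough sanity check on those two orbit heights, Kepler’s third law ties a satellite’s altitude to its orbital period. The short sketch below uses standard constants and the 515-mile altitude quoted above (both assumptions of this illustration, not figures from the article’s sources); it recovers an altitude of roughly 22,000 miles for a satellite that keeps pace with the Earth’s rotation, and an orbit of about 100 minutes for a polar orbiter at 515 miles, consistent with a network of them refreshing the global picture every 12 hours.

```python
# A back-of-the-envelope check of the two orbit types, via Kepler's third law.
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6.371e6    # mean Earth radius
SIDEREAL_DAY_S = 86164.1    # one rotation of the Earth
METERS_PER_MILE = 1609.34


def orbital_period_s(altitude_m: float) -> float:
    """Period of a circular orbit at the given altitude above the surface."""
    r = EARTH_RADIUS_M + altitude_m
    return 2.0 * math.pi * math.sqrt(r**3 / MU_EARTH)


def geostationary_altitude_m() -> float:
    """Altitude at which the orbital period matches one sidereal day."""
    r = (MU_EARTH * SIDEREAL_DAY_S**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
    return r - EARTH_RADIUS_M


if __name__ == "__main__":
    print(f"Geostationary altitude: {geostationary_altitude_m() / METERS_PER_MILE:,.0f} miles")
    print(f"Polar orbiter at 515 miles: {orbital_period_s(515 * METERS_PER_MILE) / 60:.0f}-minute orbit")
```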

These satellites are in the process of being updated. Newer LEO systems will carry improved microwave and infrared sensors capable of providing a highly detailed, three-dimensional map of the atmosphere, resulting in more accurate forecasts and better prediction of storms. Likewise, the geostationary satellites will soon be upgraded with the GOES-R series, which will update every five minutes, three times faster than before. This will provide better insight into rapidly developing storms as well as lightning behavior.



With all this technology available to know “the condition of the atmosphere” with “sufficient accuracy,” what remains is Bjerknes’ second condition: applying the laws of physics to “determine the development of one weather condition from another.” The incoming data divides the Earth’s surface into grid squares roughly 5 to 30 miles (8 to 48 kilometers) across, producing an estimated 15 million data fields. It’s a good thing there are now supercomputers available to crunch through all of it at rates of anywhere from 14 to as many as 73 trillion calculations per second.
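To make Bjerknes’ second condition concrete, here is a deliberately tiny sketch of the grid-point idea: a single quantity on a one-dimensional row of grid cells is pushed along by a steady wind, one small time step at a time. It is a toy, not how operational models work in detail (they solve far richer equations on three-dimensional grids with millions of cells), and the grid spacing, wind speed and time step below are arbitrary illustrative values.

```python
# A toy illustration of grid-based forecasting: advance a gridded field
# (say, temperature) in small time steps using a physical law, here simple
# advection by a constant wind with an upwind finite-difference scheme.

def advect_step(field, wind_speed, dx, dt):
    """One upwind advection step: new[i] = old[i] - c * (old[i] - old[i-1])."""
    c = wind_speed * dt / dx          # Courant number; keep <= 1 for stability
    return [field[i] - c * (field[i] - field[i - 1]) for i in range(len(field))]


if __name__ == "__main__":
    # A warm bump in the middle of an otherwise uniform row of cells.
    temps = [10.0] * 20
    temps[5] = 20.0
    for _ in range(10):               # ten 60-second steps of a 10 m/s wind
        temps = advect_step(temps, wind_speed=10.0, dx=1000.0, dt=60.0)
    print([round(t, 1) for t in temps])   # the warm bump has drifted downwind
```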

Early data processing runs on weather data took hours to complete; now they finish in seconds. Moving to finer grids, which can give much more detail in storm situations, will increase the number of calculations enormously: halving the grid spacing roughly quadruples the number of cells, and the smaller time steps a finer grid demands push the cost up further still, requiring even faster computers. Bill Lapenta of NOAA’s Environmental Modeling Center hopes to see that happen by 2020.

Turning all this data into meaningful and accurate forecasts takes intelligence of a kind that computer models don’t yet have, though researchers are working on that too.

In addition to improved data processing, human learning still plays a role, as forecasters use heuristics to recognize patterns in these massive amounts of image data. For example, a distinctive rotation pattern in Doppler velocity data, known as the tornadic vortex signature, was first identified in the 70s. Eventually this kind of knowledge gets encoded into computer models that are then applied to the mountain of data.

That’s where the battleground is today, with a number of weather prediction services, both private and public, touting their models as the best and most accurate of all. Many are based on the National Weather Service forecast, with some added analysis and predictions. Others consolidate forecasts from a number of major models, using proprietary technology that factors in historical weather data.

People will likely always talk about the weather, but there may come a time when its mystery will be a thing of the past.

about the author
RP Siegel
Skilled writer. Technology, sustainability, engineering, energy, renewables, solar, wind, poverty, water, food. Studied both English Lit. and Engineering at university level. Inventor.