October 16, 2019 · 4 min read

MicroWeather: Forecasting’s Great Leap Forward

Weather Mapping’s Many Moving Parts

Forecasting the weather is a massive and expensive undertaking. Dozens of governments must invest billions of dollars annually in satellites, wind vanes, lasers and whiz-bang computing. Until recently, precious few government agencies around the world had the funding and infrastructure capacity necessary to collect the data and build the models needed to get a read on the weather.

But that’s all changing before our eyes.

First, let’s review the three major steps that have traditionally taken weather from observation to public forecast.

Step 1 – Government Observes

What’s going on in the atmosphere, the world’s oceans and other major earth systems right now? To answer these questions, we need to observe atmospheric variables like temperature, humidity, wind and barometric pressure. Since the atmosphere and weather are chaotic systems, even small errors in understanding their initial conditions can lead to gross errors in forecasting.

The sensors used to conduct weather observations are expensive to build, calibrate and operate. The two environmental satellites that NOAA is launching over the next few years – the GOES-T and GOES-U – will cost close to $11B. As a result of these costs, top-notch sensing equipment is deployed almost exclusively by wealthy, developed countries. The US has approximately 4,000 weather observation stations. In contrast, the Democratic Republic of Congo, with ¼ the land mass and population size, has only five – and it’s questionable whether they’re calibrated and working.

Step 2 – Mostly Government Models

Next, the observations – mostly gathered by public entities – are fed into complex computer models that first try to figure out current weather conditions and then come up with forecasts.

Due to the sheer complexity and computational demands of these models, they’re run by only a handful of mostly government agencies, along with a relatively small number of private sector companies (including ClimaCell). Once run, the outputs – generally global and local forecasts ranging from several hours to several weeks out – are made available for public consumption.

Step 3 – Private Sector Disseminates

After the public sector has done the bulk of the work in observing and modeling, private entities turn the outputs into business- and consumer-friendly weather reporting.

Nearly every company relies on the same underlying data, even if it’s tweaked, repackaged and processed differently. Everyone who relies on government data is thus limited in what they can provide to their customers.

Great Leap Forward: Meet MicroWeather

The problem is that as our climate becomes ever more volatile, and observations remain scarce and expensive to deploy, the limitations of relying on government data grow increasingly evident. Traditional weather forecasts depend on the speed at which governments move and the priorities they choose to invest in.

This is where ClimaCell comes in. We use patented MicroWeather technology to aggregate data from the connected world and assimilate it into a stronger observational network. Wireless devices, connected vehicles, UAVs and other non-traditional technologies are combined to create an interconnected Weather of Things system.

As a result, ClimaCell weather technology provides access to hyper-localized and continuously updated data, based on a worldwide and constantly growing network of virtual sensors. And since MicroWeather doesn’t require expensive investments in infrastructure and hardware, it can be quickly deployed and used by the developing countries that need it most.

Read more on “Where Do Weather Forecasts Go Wrong?” >