James Burn, Senior Solutions Engineer and Meteorologist at IBM, explores how computers, and latterly AI, are used to model climate change and forecast the weather.

In today's world, accurate weather forecasting and an understanding of climate change have become increasingly important. Advanced computer systems play a crucial role in analysing vast amounts of data and performing complex calculations to help us understand climate trends and predict weather conditions. This article provides a brief background to weather modelling, explaining how computers are currently employed in the fields of climate science and meteorology and discussing the adoption of machine learning in climate science and weather operations.

## Weather and climate modelling

In 1922, the mathematician and physicist Lewis Fry Richardson proposed modelling the weather with an amphitheatre full of mathematicians and scientists. Arranged geographically, like a living map, each would solve the equations of fluid motion for their own part of the atmosphere and pass the results on to neighbouring regions to provide forecasts. This physics-based approach to weather forecasting formed the foundation for much of the progress made in the field over the following 100 years.

As electronic computers developed, replacing Richardson's team of slide-rule-wielding mathematicians, meteorology leapt forward as a science. The UK Met Office produced its first operational computer-generated weather forecast in 1965, and weather and climate forecasting is now among the leading uses of supercomputing technology around the globe.

## How weather modelling works

What is the difference between weather and climate modelling? It's really only a matter of time! Weather is short term, and climate long term. So, weather models forecast the conditions of the atmosphere over a short period of time – some only a few hours ahead, others up to two weeks. Sub-seasonal and seasonal weather models cover from two weeks to a year ahead. Climate models run tens of years ahead, often exploring different future scenarios. Many scientists define climate as the average weather for a region, taken over 30 years.

Weather and climate models divide the model region (the globe, or a smaller area) into a three-dimensional grid from the surface to the top of the atmosphere. To give an idea of the scale, the global model from the European Centre for Medium-Range Weather Forecasts (ECMWF) uses a 9km horizontal grid with 137 vertical layers, reaching about 64km at the top of the atmosphere. The model is initialised with an estimate of the state of the atmosphere at each of these hundreds of millions of grid points, built from data such as temperature, pressure, humidity and wind observations. The model then uses mathematical equations to represent the physical processes governing how the atmosphere changes with time. The results are fed back into the grid for the next time-step, and the equations run again, gradually stepping forward to forecast the weather or climate.
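The time-stepping loop at the heart of a grid model can be sketched in miniature. The code below is illustrative only – real models solve the full equations of fluid motion in three dimensions – but it shows the essential pattern: a gridded field, an update rule derived from physics, and repeated small steps forward in time. Here a one-dimensional temperature field is advected across a periodic grid using a simple upwind scheme, with a 9km spacing echoing the ECMWF grid.

```python
import numpy as np

def step(temp, wind, dx, dt):
    """Advance a 1D periodic temperature field by one time-step
    using a simple upwind advection scheme - a toy stand-in for
    the full equations of atmospheric motion."""
    # Upwind difference, assuming the wind blows in the +x direction:
    # each point is updated from itself and its upwind neighbour.
    return temp - wind * dt / dx * (temp - np.roll(temp, 1))

# Initialise a coarse one-dimensional "grid" with a warm anomaly.
temp = np.full(100, 280.0)            # kelvin
temp[10:20] = 290.0

# Step forward in time: each step updates every grid point from its
# neighbours, just as a full model steps its 3D grid forward.
for _ in range(50):
    temp = step(temp, wind=10.0, dx=9000.0, dt=60.0)
```

The scheme moves the warm anomaly downwind while conserving the total heat on the grid; the ratio `wind * dt / dx` (the CFL number) must stay below one for the stepping to remain stable, a constraint real models also respect.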


## Data assimilation

Weather forecasting relies on the assimilation of real-time data into numerical models. Computers collect data from satellites, weather stations, radar and other sources, combining them with historical observations. This process, known as data assimilation, helps refine the initial conditions for weather models, increasing the accuracy of short-term forecasts. Because of the quantity of data being processed, and the advanced statistical methods used to optimise and process it, this is one area where machine learning is already delivering improvements over traditional methods.
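In its simplest scalar form, the assimilation step blends the model's prior guess (the 'background') with an observation, weighting each by how much we trust it. A minimal sketch follows – the values and error variances are made up purely for illustration:

```python
def assimilate(background, observation, var_background, var_obs):
    """Blend a model background value with an observation, weighting
    each by the inverse of its error variance. This is the scalar
    form of the analysis step used in data assimilation."""
    # The gain is the fraction of the observation-minus-background
    # difference that we accept into the analysis.
    gain = var_background / (var_background + var_obs)
    return background + gain * (observation - background)

# The model background says 15.0 C, but a weather station reports
# 17.0 C; the station's error variance is half the background's,
# so the analysis is pulled most of the way towards the observation.
analysis = assimilate(15.0, 17.0, var_background=2.0, var_obs=1.0)
```

Operational systems perform this kind of update over the model's entire state at once, using methods such as variational assimilation or ensemble Kalman filters, but the weighting principle is the same.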

## Parametrisation

Parametrisation is a way to represent physical processes which are too small in scale to resolve on the model grid. We can think of parametrisations as models within the model, representing weather processes such as convective shower development, raindrop descent and radiative transfer. These schemes also allow wider specialised meteorological research to be incorporated into the overall model code-base.
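A parametrisation can be pictured as a small function the model calls at each grid point, estimating a sub-grid process from the quantities the grid does resolve. The sketch below is purely illustrative – the thresholds and scaling are invented, not taken from any operational scheme:

```python
def convective_rain(surface_temp_k, lapse_rate_k_per_m):
    """Toy convection parametrisation: individual showers are far
    smaller than a grid cell, so estimate the cell-average
    convective rainfall from resolved temperature and stability.
    Thresholds and scaling here are invented for illustration."""
    warm = surface_temp_k > 295.0            # warm surface
    unstable = lapse_rate_k_per_m > 0.0065   # steeper than the standard lapse rate
    if warm and unstable:
        # Rainfall grows with the excess instability (toy scaling, mm/hr).
        return 2.0e3 * (lapse_rate_k_per_m - 0.0065)
    return 0.0
```

Real schemes are far more sophisticated, but the shape is the same: resolved grid-scale inputs in, an estimated sub-grid effect out, fed back into the model at each time-step.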

## Ensemble, probabilistic weather modelling

This is where multiple instances of the model (an ensemble) are run with small changes in the starting conditions, simulating the uncertainty inherent in meteorological science. The members run independently, generating multiple future forecasts from the same starting time and very slightly different starting conditions. The forecasts are then analysed statistically to extract the most likely scenario and to produce confidence metrics based on the spread of the outlying scenarios. If your forecast gives a 'percentage likelihood of rain', it should be derived from a probabilistic forecast – counting the percentage of ensemble members which forecast rain at your location.
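The counting step at the end can be sketched directly. Here a trivial 'model' stands in for the real thing: each member starts from a slightly perturbed humidity and drifts randomly, and the rain probability is simply the fraction of members ending above a threshold (all names, numbers and thresholds are invented for illustration):

```python
import random

RAIN_THRESHOLD = 0.8   # relative humidity above which we call "rain"

def run_member(humidity, days, rng):
    """Toy stand-in for one full model run: the humidity performs a
    small random walk, mimicking the growth of forecast uncertainty."""
    for _ in range(days):
        humidity += rng.gauss(0.0, 0.05)
    return humidity > RAIN_THRESHOLD

def rain_probability(humidity0, members=1000, days=5, seed=1):
    """Run an ensemble from very slightly perturbed starting
    conditions and count the fraction of members forecasting rain."""
    rng = random.Random(seed)
    rainy = sum(
        run_member(humidity0 + rng.gauss(0.0, 0.01), days, rng)
        for _ in range(members)
    )
    return rainy / members
```

Starting closer to saturation should push more members over the threshold, so `rain_probability(0.9)` comes out much higher than `rain_probability(0.5)` – exactly the behaviour a percentage-of-members rain forecast encodes.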

## Supercomputing power

The data assimilation, equation solving, parametrisation, and ensemble components of climate modelling and weather forecasting are incredibly computationally intensive. Supercomputers, with their immense processing power and parallel computing capabilities, enable scientists and meteorologists to tackle these complex computations efficiently. High-performance computing facilities allow for faster model simulations, increased resolution, and the processing of vast ensembles of climate and weather scenarios.

## Machine learning techniques

Recently, machine learning (ML) techniques have increasingly been used to enhance weather and climate modelling. ML is embedded in data assimilation and is also used to enhance and interpret model output – with ML data-based flood models, for example. There is still suspicion and concern around purely data-driven, non-physical models producing forecasts for public and major operational consumption. There are also pure ML-based weather models, though these are not operational… yet! This is where meteorologists, who have for years been trusted advisors and science communicators, must equip themselves with the expertise to work in the new AI world.

## Conclusions, challenges and the future

Computers are indispensable tools in the fields of climate science and meteorology. They enable scientists and meteorologists to analyse vast amounts of data, run complex simulations, and provide accurate weather forecasts and climate predictions. While computers have revolutionised climate science and meteorology, several challenges persist. One such challenge is the need for even more powerful computing resources to improve model resolution and accuracy. Additionally, data quality and availability, as well as uncertainties associated with model predictions, remain important areas of focus for further research and development.

By harnessing the power of high-performance computing, we can continue to better understand climate change, predict severe weather events, and make informed decisions to mitigate the impacts of a changing climate now and in the future.

The code in weather and climate models is often written in the programming languages Fortran and C. Fortran, launched by IBM in 1957, was the first 'high-level' programming language. High-level languages express commands in something closer to human language, rather than raw machine code – a stream of numbers and letters – which opened programming to the broader scientific community.

The UK Met Office’s Unified Model is written in Fortran and is used for both weather and climate modelling.