AI-accelerated climate modeling

  While climate change is certain, precisely how climate will change is less clear. Breakthroughs in the accuracy of climate projections and in the quantification of their uncertainties are now within reach, thanks to advances in the computational and data sciences and in the availability of Earth observations from space and from the ground. I will survey the design of a new Earth system model that the Climate Modeling Alliance (CliMA) is developing. The model exploits tools from machine learning and data assimilation jointly with process-informed models, with the goal of achieving a new level of accuracy in modeling important small-scale processes such as clouds and precipitation.

    Show notes:

    [00:00] Opening remarks by ITU

     

    [02:29] AI-ACCELERATED CLIMATE MODELING

    • An alliance of about 70 researchers across institutions and universities.

    [03:49] Global damages from climate-related disasters

    • Exceeded US$210 billion in 2020 and are increasing.

    • Disasters are exacerbated by climate change.  

     

    [04:45] Mitigating climate change is essential

    • Adapting to it is unavoidable and has a large benefit-cost ratio. 

    • Strengthening early warning systems.

    • Making new infrastructure resilient.

    • Improving dryland agriculture crop production.

    • Protecting mangroves.

    • Making water resources more resilient.

     

    [05:53] Challenges 

    • Climate models do not provide the accurate local information needed for proactive adaptation. 

    • Global projections of extreme temperatures are too uncertain because of model uncertainty.

     

    [06:39] Example: the CO2 concentration at which the 2ºC warming threshold is crossed varies widely across models.

     

    [08:24] Primary source of uncertainty 

    • The primary source of uncertainty in climate predictions is the representation of low clouds in models.

    • More stratocumulus: colder climate.

    • More cumulus: warmer climate.

    • We do not know whether low-cloud cover will increase or decrease with rising CO2 levels.

     

    [10:23] Clouds in climate predictions: Why are they difficult but important? 

    • Critical role in the energy balance of the Earth.

     

    [10:39] The small-scale cloud-controlling processes cannot be resolved globally in climate models. 

    • The turbulence and convection that sustain low clouds have small length scales.

    • Clouds fall through the cracks of the computational mesh.

    • Models therefore represent them in a semi-empirical way.

     

    [11:42] Difficulties in modeling low clouds

    • Result: Failure to simulate present climate. 

     

    [13:01] When will brute-force high-performance computing solve this problem?

    • Can we simply apply Newton's laws and the laws of thermodynamics to compute cloud motion?

     

    [13:13] Computer performance has been increasing exponentially

    • Performance increases by a factor of about 1.2 every 2 years.

     

    [13:59] Global low-cloud resolving models will not be feasible for decades

    • Resolving low clouds globally is a factor-10^11 compute problem.

    • The resolution of the atmospheric component of climate models is on the order of kilometers to tens of kilometers.

    • Each 10× increase in resolution requires roughly a 10^4× increase in floating-point operations (three spatial dimensions plus a shorter time step); a back-of-the-envelope check follows below.
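    As a rough sanity check of these numbers, the sketch below (assuming the factor-1.2-per-2-years growth rate and the 10^11 shortfall quoted above) estimates how long hardware growth alone would take to close the gap:

```python
# Back-of-the-envelope check (assumes the talk's numbers: a 10^11 compute
# shortfall and performance growth of ~1.2x every 2 years).
import math

gap = 1e11                  # factor by which compute falls short
growth = 1.2                # performance growth per 2-year period
periods = math.log(gap) / math.log(growth)
print(f"~{2 * periods:.0f} years until brute force suffices")  # ~278 years
```

    Even with much more optimistic growth assumptions, brute force stays out of reach for decades, which is the point of the slide.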

     

    [16:08] Data and AI to the rescue.

    • There are special challenges for climate projections. 

     

    [16:52] Requirements for data-informed climate models

    • Generalizability: the model must predict climates without an observed analogue, so it needs to generalize out of sample.

    • Interpretability: we must be able to trust models that cannot immediately be verified with climate-change data. This is essential.

    • Uncertainty quantification: adaptation requires risk estimates, hence probabilistic predictions with quantified uncertainties.

     

    [18:14] Combine reductionist science with data science to accelerate climate modeling 

    • Deep learning: successful because its models are expressive, but its methods are data-hungry.

    • Challenges: generalizability, interpretability, and uncertainty.

    • The success of reductionist science rests on parametric sparsity, which makes models generalizable and interpretable.

    • Current approach: combine traditional reductionist science with data-science tools.

     

    [20:17] How does this work? 

     

    [20:25] Three crucial ingredients

    • Advance theory: Use known equations as far as possible. 

    • Harness data: exploit the detailed Earth observations now available, together with data generated computationally.

    • Leverage computing power: Transition to hardware with accelerators is an opportunity. (GPUs, TPUs)

     

    [23:05] Abstract model

    • Take theory (known equations) as the starting point.

    • Unknown functions in the equations are the targets for learning from data.

    • Data can come from anywhere.

    • Simulation: compute situations that cannot be observed, such as low clouds at high resolution, to generate data (a minimal sketch of the pattern follows below).
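    A minimal sketch of this "known equations + learned closure" pattern; the closure form and parameter names here are illustrative assumptions, not CliMA's actual equations:

```python
import numpy as np

def closure(x, params):
    # sparsely parameterized closure: a few interpretable coefficients
    # rather than a large opaque network
    a, b = params
    return a * np.tanh(b * x)

def tendency(x, params):
    known = -0.1 * x                    # process-informed physics (kept as-is)
    return known + closure(x, params)   # plus a closure learned from data

def step(x, dt, params):
    # simple forward-Euler step of the hybrid model
    return x + dt * tendency(x, params)
```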

     

    [24:15] How does this actually work for modeling clouds?

     

    [24:43] How to model clouds?

    • We use a unified, physics-based model, derived by conditional averaging of equations of motion. 

    • Decomposes domain into environment and coherent plumes. 

    • Prognostic equations are used for continuity, scalar means, and scalar covariances (a simplified form is sketched below).

    • Closure functions are the targets for learning from data. 
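    In a simplified single-column form (notation assumed here: a_i subdomain area fraction, w_i vertical velocity, φ_i a scalar mean, E_i and Δ_i entrainment and detrainment rates, subscript e the environment; the actual CliMA equations contain additional terms), the conditionally averaged equations look like:

```latex
\frac{\partial (\rho a_i)}{\partial t}
  + \frac{\partial (\rho a_i \bar{w}_i)}{\partial z} = E_i - \Delta_i,
\qquad
\frac{\partial (\rho a_i \bar{\phi}_i)}{\partial t}
  + \frac{\partial (\rho a_i \bar{w}_i \bar{\phi}_i)}{\partial z}
  = E_i \bar{\phi}_e - \Delta_i \bar{\phi}_i
  - \frac{\partial (\rho a_i \overline{w'\phi'}_i)}{\partial z}
```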

     

    [27:34] Closure functions in the coarse-grained equations are natural targets for ML

    • Entrainment and detrainment: represented by a physical entrainment length and functions of nondimensional parameters learned from data (sketched below).

    • Non-hydrostatic pressure gradients: represented by a combination of buoyancy reduction and pressure drag, with coefficients learned from data.

    • Eddy diffusion: mixing length computed as a soft minimum over all possible balances between production and dissipation of turbulence kinetic energy (TKE).
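    A hedged sketch of what such a sparsely parameterized entrainment/detrainment closure could look like; the functional form, the nondimensional parameter, and the coefficient names below are illustrative assumptions, not the closure CliMA actually uses:

```python
import numpy as np

def entrainment_detrainment(w_up, w_env, b_up, b_env, z, c):
    """Fractional entrainment/detrainment rates [1/m].

    c: two nondimensional coefficients learned from data (illustrative).
    """
    dw2 = (w_up - w_env) ** 2 + 1e-8           # updraft-environment velocity difference
    chi = (b_up - b_env) / dw2                 # nondimensional buoyancy/velocity ratio
    ent_length = max(z, 1.0)                   # physical entrainment length ~ height
    eps = (c[0] / ent_length) * max(chi, 0.0)      # entrain when buoyancy favors it
    delta = (c[1] / ent_length) * max(-chi, 0.0)   # detrain otherwise
    return eps, delta
```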

     

    [28:46] Low cloud cover bias in a typical current climate model: problem areas especially in subtropics and polar regions. 

    • These biases can have an enormous impact on climate-change projections.

     

    [29:54] Reduced-order model with 9 parameters calibrated against high-resolution simulations captures polar and subtropical boundary layer and clouds. 

    • Examples of how well the model captures clouds in simulations.

    • Fields such as humidity, velocity, and convective fluxes are emulated well.

     

    [33:09] To scale this success up to an entire earth system model that learns from data, we need fast learning algorithms

    • Fast learning algorithms face many challenges, e.g., expensive and noisy evaluations of the functions involved.

     

    [34:09] At CliMA, we are pursuing the same approach for all components of a new Earth system model.

    • A group of about 70 scientists, mathematicians, engineers, and other researchers.

    • They work on components such as the ocean, land, ice, and atmosphere.

    • Coupling all the components yields a model that can learn jointly from data and produce simulations.

     

    [35:02] We want the model, with all its components, to learn jointly from data

    • Learn from statistics accumulated in time.

    • Statistics include mean fields and also higher-order statistics. 

     

    [35:35] Learning from climate statistics

    • This presents both good opportunities and challenges.

    • Matching statistics results in smoother objective functions than matching trajectories.

    • Climate-relevant statistics, such as covariances between cloud cover and temperature, and precipitation extremes, can be included in the loss function (a sketch follows below).

    • Loss function evaluation is extremely expensive. 
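    A sketch of a loss over accumulated climate statistics rather than trajectories; the field names and the particular statistics chosen here are illustrative:

```python
import numpy as np

def climate_statistics(fields):
    """Reduce a simulated time series to the statistics entering the loss."""
    cc, temp, pr = fields["cloud_cover"], fields["temperature"], fields["precip"]
    return np.concatenate([
        cc.mean(axis=0),                                     # time-mean fields
        [np.cov(cc.mean(axis=1), temp.mean(axis=1))[0, 1]],  # cover-temperature covariance
        [np.quantile(pr, 0.999)],                            # precipitation extreme
    ])

def loss(theta, y_obs, gamma_inv, run_model):
    g = climate_statistics(run_model(theta))  # expensive: one full simulation
    r = y_obs - g
    return 0.5 * r @ gamma_inv @ r
```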

     

    [37:23] Our setting for learning about parameters

    • Find parameters θ from data y.

    • The model G maps parameters θ to predicted data G(θ), with additive Gaussian noise (written out below).

    • Calibration and uncertainty quantification (UQ) for θ are both important.

    • G is expensive to evaluate. 

    • G is only approximately available.
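    In symbols, this is the standard Bayesian inverse problem (with Γ the noise covariance and π₀ a prior):

```latex
y = \mathcal{G}(\theta) + \eta, \qquad \eta \sim \mathcal{N}(0, \Gamma),
\qquad
\pi(\theta \mid y) \propto
  \exp\!\Big({-\tfrac{1}{2}\big\lVert \Gamma^{-1/2}\big(y - \mathcal{G}(\theta)\big)\big\rVert^{2}}\Big)\,\pi_0(\theta).
```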

     

    [38:44] We combine calibration and Bayesian approaches in a three-step process (calibrate-emulate-sample, CES) to address the challenges G poses.

    • Calibrate: first step

    • Emulate: second step

    • Sample: third step

    • Experimental design can be incorporated into the CES pipeline.

    • Gives approximate Bayesian posterior.

     

    [42:23] Proof-of-concept in idealized general circulation model (GCM)

    • The GCM is an idealized aquaplanet model.

    • It has a simple convection scheme that relaxes temperature and specific humidity to reference profiles.

    • Two closure parameters: a relaxation timescale (τ) and a reference relative humidity (RHref).

     

    [43:10] (1) Calibrate with ensemble Kalman inversion 

    • The objective function includes relative humidity, mean precipitation, and precipitation extremes.

    • Ensemble Kalman inversion for the parameters in the convection scheme: an ensemble of size 100 converges in approximately 5 iterations (a minimal sketch follows below).
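    A minimal ensemble Kalman inversion sketch, with a toy two-parameter forward map standing in for the expensive GCM statistics; the toy G and all numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def G(theta):
    """Toy stand-in for the map from parameters to climate statistics."""
    tau, rh_ref = theta
    return np.array([np.exp(-tau) + rh_ref, tau * rh_ref])

def eki_update(theta, g, y, gamma):
    """One EKI step. theta: (J, p) ensemble; g: (J, d) outputs; y: (d,) data."""
    dt = theta - theta.mean(axis=0)
    dg = g - g.mean(axis=0)
    K = (dt.T @ dg / len(theta)) @ np.linalg.inv(dg.T @ dg / len(theta) + gamma)
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), gamma, len(theta))
    return theta + (y_pert - g) @ K.T       # Kalman-gain update of the ensemble

theta = rng.normal([1.0, 0.6], 0.2, size=(100, 2))     # ensemble of size 100
y, gamma = G(np.array([0.5, 0.8])), 1e-3 * np.eye(2)   # synthetic "truth"
for _ in range(5):  # converges in ~5 iterations, as in the talk
    theta = eki_update(theta, np.array([G(t) for t in theta]), y, gamma)
print(theta.mean(axis=0))  # close to (0.5, 0.8)
```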

     

    [44:04] (2) Emulate parameters-to-statistics map during calibration step with Gaussian processes

    • Effective emulation of model statistics at vanishing marginal cost; an additional important advantage is the smoothing of the objective function (see the sketch below).
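    A sketch of the emulation step with Gaussian processes; scikit-learn is my choice here for illustration, and the (θ, G(θ)) training pairs, which in CES come from the calibration iterations, are generated synthetically below:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic training pairs; in CES these are saved from the EKI iterations.
rng = np.random.default_rng(0)
thetas = rng.uniform([0.1, 0.5], [1.5, 1.0], size=(50, 2))
gs = np.array([[np.exp(-t) + r, t * r] for t, r in thetas])

# One GP per output statistic; the WhiteKernel absorbs simulation noise,
# which is what smooths the objective function.
kernel = RBF(length_scale=np.ones(2)) + WhiteKernel()
emulators = [GaussianProcessRegressor(kernel=kernel, normalize_y=True)
             .fit(thetas, gs[:, k]) for k in range(gs.shape[1])]

def G_emulated(theta):
    """Mean and std of each statistic at near-zero marginal cost."""
    out = [em.predict(np.atleast_2d(theta), return_std=True) for em in emulators]
    return np.array([m[0] for m, _ in out]), np.array([s[0] for _, s in out])
```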

     

    [45:28] (3) Sample emulator to obtain posterior PDF for uncertainty quantification

    • Bayesian learning at roughly 1/1000th the cost of standard methods (a minimal sampler sketch follows below).

    • Figure: posterior density (blue shading) with posterior samples (black dots).
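    A minimal random-walk Metropolis sampler that could run on the emulated map; because every log-posterior evaluation is an emulator call rather than a simulation, sampling becomes cheap. The toy log-posterior in the usage line is illustrative:

```python
import numpy as np

def metropolis(logp, theta0, n_steps, step_size, seed=1):
    """Random-walk Metropolis; each logp call hits the cheap emulator."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = logp(theta)
    samples = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        prop = theta + step_size * rng.normal(size=theta.shape)
        lp_prop = logp(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

# Toy example; a real run would use the emulator mean/variance in the misfit.
samples = metropolis(lambda th: -0.5 * np.sum((th - 1.0) ** 2),
                     theta0=[0.0, 0.0], n_steps=5000, step_size=0.3)
```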

     

    [46:24] Draw an ensemble of climate predictions from the posterior of parameters for UQ of predictions. 

    • Quantifying the tails of distributions is crucial for assessing the risk of climate impacts (a small propagation sketch follows below).

    • Can also incorporate sparse learning about structural model error. 
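    A small sketch of this prediction step: push posterior parameter samples through the model and read off tail quantiles. The run_model below is a toy stand-in for an actual climate simulation:

```python
import numpy as np

def predictive_quantile(posterior_samples, run_model, q=0.99):
    """Propagate posterior samples to predictions; return a tail quantile."""
    preds = np.array([run_model(th) for th in posterior_samples])
    return np.quantile(preds, q, axis=0)

# Toy stand-in: 200 posterior samples and a scalar "impact" model.
rng = np.random.default_rng(3)
posterior_samples = rng.normal(1.0, 0.1, size=(200, 2))
impact_p99 = predictive_quantile(posterior_samples, lambda th: th[0] * th[1])
```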

     

    [48:36] We are pursuing the same approach for all components of an all-new Earth system model

    • 2023 goals:

    • Build a model that learns from observations and high-resolution simulations.

    • Achieve improved simulations of the present climate.

    • Provide predictions with UQ based on observations and high-resolution simulations. 

     

    [49:19] CliMA will provide a hub for actionable climate information to plan a resilient future

    • Provide fine-grained climate projections on demand: statistics of extreme events and extreme scenarios.

    • Anchor an ecosystem of apps for detailed predictions, such as flood risks.

    • Actionable information to facilitate resilience throughout public and private sectors. 

     

    [51:04] Conclusions

    • Reducing and quantifying uncertainties in climate models is urgent but within reach.

    • To reduce and quantify uncertainties, combine process-informed models with data-driven approaches that harness climate statistics.

    • Sparsely parameterized, physics-based subgrid-scale models. 

    • Learn both from observations and from high-resolution simulations spun off on the fly.

    • Calibrate-emulate-sample. 

     

    [55:28] Q&A session

     

    [55:39] How confident are you in learned parameters as the climate shifts beyond the training data?

    • You need to predict something for which there is no training data. Using short-term predictions, where you can be proven right or wrong on reasonably short time scales, is the best way to build trust.

     

    [56:57] If you mostly train on satellite data, would you have to bring in other, more process-based data?

    • Data from past climates are so sparse that learning from them alone does not give enough information to drive a simulation, but they can be used as a test sample.

     

    [58:52] Are there ways for startups and the private sector to collaborate on this topic? 

    • Institutions of this kind can bring the stability needed for research on these topics, and they can bring transparency, among other benefits.

     

    [1:00:35] Is merging physics and ML more meaningful for short-term or for long-term prediction?

    • It is more important for the long-term. 

     

    [1:01:25] Could one learn about the model structure itself using approaches different from yours?

    • What we are trying to do in our approach is learn from statistics rather than from trajectories. The space in which all these structures live is high-dimensional, so I am not sure.

     

    [1:03:02] Closing remarks from Philip Stier

     

    [1:03:24] Closing remarks by ITU

     
