Stable and fast: ACE, the Ai2 Climate Emulator
Brian Henn / December 12, 2023
Ai2 contributors: Oli Watt-Meyer, Jeremy McGibbon, Gideon Dresdner, Spencer Clark, Brian Henn, James Duncan, Matt Peters, and Chris Bretherton
NVIDIA collaborators: Noah Brenowitz, Karthik Kashinath, Mike Pritchard, and Boris Bonev
The last year has seen a revolution in the field of weather prediction. By using deep learning methods, multiple research groups have shown improvements in key metrics over conventional physics-based medium-range weather forecasting systems. For example, leveraging these breakthroughs, the leading European weather prediction center has begun issuing AI-driven operational weather forecasts alongside its conventional physics-based ones. These AI-driven forecasts typically require far fewer computational resources than the conventional approaches.
Climate modelers run similar physics-based models for decades or centuries to generate statistics of the simulated climate; these runs are used to estimate the impacts of increasing greenhouse gas concentrations. Because of their length, conventional climate simulations are computationally very demanding, so AI-driven models have great potential to expand access to climate simulation by reducing computational costs. But the applicability of the recent weather-focused AI models to climate is unclear: nearly all have reported forecasts out to only 14 days, and instabilities or artifacts often appear in longer simulations.
ACE: a stable and fast climate emulator
Here we present ACE (AI2 Climate Emulator), a neural network-based atmospheric model. ACE uses the Spherical Fourier Neural Operator architecture and is trained to emulate an existing physics-based atmospheric model to predict global outputs every 6 hours. ACE runs stably for at least 100 years and can simulate a decade in one hour of wall clock time, nearly 100 times faster and using 100 times less energy than the reference atmospheric model.
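The long-simulation setup described above can be sketched as an autoregressive rollout: each 6-hour prediction is fed back in as the next input. The snippet below is a minimal illustration of that loop, not ACE's actual API; `step_6h` stands in for one forward pass of the trained SFNO, and the state shape is made up for the example.

```python
import numpy as np

def step_6h(state):
    """Hypothetical stand-in for one forward pass of the trained emulator:
    maps the current atmospheric state to the state 6 hours later."""
    return state  # placeholder; a real emulator would transform the state

def rollout(initial_state, n_years):
    """Autoregressively roll the emulator forward, 6 hours per step."""
    steps_per_year = 365 * 4          # four 6-hour steps per day
    state = initial_state
    for _ in range(n_years * steps_per_year):
        state = step_6h(state)        # feed each prediction back in as input
    return state

# Illustrative state: (channels, lat, lon) on a 1-degree global grid.
final = rollout(np.zeros((8, 180, 360)), n_years=1)
```

Because inference is a single network evaluation per step, a decade of such steps runs in about an hour of wall clock time on the hardware reported in the paper.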
ACE replicates the near-surface climatology of the reference physics-based model better, and runs faster, than a 2x-coarser but otherwise identical physics-based model. Finally, ACE is formulated so that precise evaluation of its conservation of mass and moisture is possible, and we find that moisture in a given vertical model column is very nearly conserved across individual timesteps.
Long-term stability
ACE is able to maintain a stable simulation and unbiased global evolution of temperature and moisture for simulations of 10–100 years. Fig. 1 shows the global-mean lower-atmosphere temperature and total atmospheric moisture for 10-year reference and ACE simulations, in which ACE tracks the reference closely and maintains a similar climate across all years.
Small climate biases
ACE is able to produce a realistic climate as evidenced by small errors in the time-mean spatial patterns of predictions. Fig. 2 shows the spatial pattern of surface precipitation biases in ACE relative to the reference simulation, computed over a 10-year simulation. The global-mean bias is near zero and the error magnitude is 0.47 mm/day, which is smaller than the error of a physics-based simulation run at 2x coarser spatial resolution than the reference simulation.
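Global-mean bias and error statistics like those above are typically computed with area weighting, since grid cells shrink toward the poles. The sketch below shows one common way to do this with cos(latitude) weights; it is illustrative, and ACE's actual evaluation code may differ in its weighting details.

```python
import numpy as np

def global_stats(pred_mean, ref_mean, lats_deg):
    """Area-weighted global-mean bias and RMSE of a time-mean 2D field
    on a regular lat-lon grid (illustrative, not ACE's evaluation code)."""
    w = np.cos(np.deg2rad(lats_deg))[:, None]   # grid-cell area ~ cos(lat)
    w = np.broadcast_to(w, pred_mean.shape)
    err = pred_mean - ref_mean
    bias = np.average(err, weights=w)
    rmse = np.sqrt(np.average(err**2, weights=w))
    return bias, rmse

# Toy example: a uniform +1 mm/day offset gives bias = RMSE = 1 mm/day.
lats = np.linspace(-89.5, 89.5, 180)
ref = np.zeros((180, 360))
bias, rmse = global_stats(ref + 1.0, ref, lats)
```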
Moisture conservation
Finally, ACE was formulated and trained on data that allow for evaluation of its conservation of dry air mass and moisture across the globe. For a given atmospheric column, for example, the change in total moisture over a timestep should be equal to the evaporation into the column plus the net moisture advected in by winds minus the precipitation out. The reference physics-based model satisfies this constraint by design, but an AI emulator may not unless constrained to do so.
Nonetheless, ACE nearly obeys the column-wise conservation of moisture. Fig. 3 shows that the magnitude of the budget violation in ACE is very small compared to the individual terms in the budget; this is true even one year into the simulation.
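The column budget described above amounts to a simple residual check: the change in a column's total water over a timestep, minus its sources and sinks, should be zero. The sketch below illustrates the bookkeeping with made-up variable names and values; the terms are expressed as amounts accumulated over one 6-hour step.

```python
def column_moisture_residual(twp_start, twp_end, evap, advected, precip):
    """Budget check for one timestep: the change in total water path (twp)
    should equal evaporation plus net moisture advected in minus
    precipitation, all in kg/m^2 over the step. A nonzero residual
    measures the conservation violation. (Names are illustrative,
    not ACE's actual output variables.)"""
    return (twp_end - twp_start) - (evap + advected - precip)

# Synthetic column constructed to conserve moisture exactly:
# the residual is zero up to floating-point error.
res = column_moisture_residual(
    twp_start=25.0, twp_end=25.4, evap=1.0, advected=0.2, precip=0.8,
)
```

In ACE's case, the analogous residual is small compared with the individual budget terms, which is the sense in which moisture is "very nearly conserved."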
Going forward
ACE demonstrates the potential of deep learning for skillful and fast climate model emulation. Its 100x speed-up in run time and 100x gain in energy efficiency could democratize the use of climate models, open new research avenues, and reduce energy usage. Further challenges must be addressed before this system becomes a useful climate model: training the model on data that reflect a changing climate, correcting prediction errors that result from training on climate model output rather than observations, and coupling to other components of the climate system, such as the ocean, land, and sea ice. Tackling these challenges is an exciting opportunity for the growing field of AI-based climate modeling.
Explore further!
ACE arXiv paper and code
Acknowledgments: The authors acknowledge NOAA's Geophysical Fluid Dynamics Laboratory for providing the computing resources used to perform the reference physics-based model simulations, which were completed using NOAA's FV3GFS, the atmosphere component of the United States weather model. This research also used resources of NERSC, a U.S. Department of Energy Office of Science User Facility located at Lawrence Berkeley National Laboratory, using NERSC award BER-ERCAP0024832.