Keywords: Atmospheric Dynamics, Parameterizations, Climate Modelling, Foundation Model, ERA5, Fine-tuning, Machine Learning
TL;DR: We develop the first climate model parameterization created by fine-tuning Prithvi WxC, a novel weather and climate foundation model, demonstrating the versatility of foundation models for weather and climate applications.
Abstract: Climate prediction models parameterize a range of atmospheric-oceanic processes such as clouds, turbulence, and gravity waves. These physical parameterizations are a leading source of uncertainty and strongly influence future projections of global temperature rise. We present a fresh approach to developing parameterizations for coarse climate models by leveraging pre-trained AI foundation models (FMs) for weather and climate. A pre-trained encoder and decoder from a 2.3 billion parameter FM (NASA and IBM's Prithvi WxC), which contains a latent probabilistic representation of atmospheric evolution, is fine-tuned to create a data-driven predictor of atmospheric gravity waves (GWs). Current climate models are too coarse to resolve GWs. We create an ML-based parameterization that learns GW fluxes from high-resolution "GW-resolving" climate models so that they can be represented in "GW-missing" coarse climate models. The fluxes predicted by our fine-tuned model are comprehensively evaluated using a set of three tests. Comparison with a baseline (Attention U-Net) reveals the superior predictive performance of the fine-tuned model throughout the atmosphere. The model outperforms the baseline even in regions excluded from the FM pre-training. This is quantified using the Hellinger distance, which is 0.11 for the baseline and 0.06, i.e., roughly half, for the fine-tuned model. FMs are largely unexplored in climate science. Our findings emphasize their versatility and reusability across a range of weather- and climate-related downstream applications, especially in low-data regimes. These FMs can be further leveraged to create new parameterizations for other Earth-system processes.
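For context, the Hellinger distance cited in the abstract measures the discrepancy between two probability distributions and is bounded in [0, 1], with 0 indicating identical distributions. The snippet below is a minimal illustrative sketch of that metric applied to histograms of predicted versus reference GW fluxes; the data here are synthetic and this is not the authors' evaluation code.

```python
import numpy as np

def hellinger_distance(p, q):
    """Hellinger distance between two discrete distributions:
    H(p, q) = (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2, bounded in [0, 1]."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()  # normalize histogram counts to probability mass
    q = q / q.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# Illustration with synthetic "flux" samples binned into histograms
# (stand-ins for reference and model-predicted GW flux distributions).
rng = np.random.default_rng(0)
reference = np.histogram(rng.normal(0.0, 1.0, 10_000), bins=50, range=(-5, 5))[0]
predicted = np.histogram(rng.normal(0.1, 1.1, 10_000), bins=50, range=(-5, 5))[0]
print(f"Hellinger distance: {hellinger_distance(reference, predicted):.3f}")
```

Under this metric, the reported values (0.11 for the Attention U-Net baseline versus 0.06 for the fine-tuned model) indicate that the fine-tuned model's predicted flux distribution lies roughly twice as close to the reference distribution.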
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1934