Published: 04 Oct 2024 · License: CC BY 4.0
Climate prediction models parameterize a range of atmospheric-oceanic processes such as clouds, turbulence, and gravity waves. These physical parameterizations are a leading source of uncertainty and strongly influence future projections of global temperature rise. We present a fresh approach to developing parameterizations for coarse climate models by leveraging pre-trained AI foundation models (FMs) for weather and climate. A pre-trained encoder and decoder from a 2.3-billion-parameter FM (NASA and IBM's Prithvi WxC), which contains a latent probabilistic representation of atmospheric evolution, are fine-tuned to create a data-driven predictor of atmospheric gravity waves (GWs). Current climate models are too coarse to resolve GWs. We therefore create an ML-based parameterization that learns GW fluxes from high-resolution, "GW-resolving" climate simulations in order to represent them in "GW-missing" coarse climate models. The fluxes predicted by our fine-tuned model are comprehensively evaluated using a set of three tests. Comparison with a baseline (Attention U-Net) reveals the superior predictive performance of the fine-tuned model throughout the atmosphere. The model outperforms the baseline even in regions excluded from the FM pre-training. This is quantified using the Hellinger distance, which is 0.11 for the baseline and 0.06, i.e., roughly half, for the fine-tuned model. FMs remain largely unexplored in climate science. Our findings emphasize their versatility and reusability across a range of weather- and climate-related downstream applications, especially in low-data regimes. These FMs can be further leveraged to create new parameterizations for other Earth-system processes.
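For reference, the Hellinger distance used to compare predicted and true flux distributions has a simple closed form for discrete (histogrammed) distributions. The sketch below is illustrative only; the paper's exact binning and variables are not specified here, and the `hellinger` helper is our own naming, not code from the paper.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions.

    H(P, Q) = (1/sqrt(2)) * || sqrt(P) - sqrt(Q) ||_2, ranging from 0
    (identical distributions) to 1 (disjoint support).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize, since inputs may be raw histogram counts.
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

# Identical distributions give 0; non-overlapping ones give 1.
print(hellinger([0.5, 0.5], [0.5, 0.5]))  # → 0.0
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # → 1.0
```

Because the distance is bounded on [0, 1], values such as 0.06 versus 0.11 can be compared directly across models without further normalization.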