Abstract: Downscaling is essential for generating the high-resolution climate data needed for local
planning, but traditional methods remain computationally demanding. Recent years have
seen impressive results from AI downscaling models, particularly diffusion models, which
have attracted attention due to their ability to generate ensembles and overcome the
smoothing problem common in other AI methods. However, these models typically remain
computationally intensive. We introduce a Hierarchical Diffusion Downscaling (HDD) model,
which adds an easily extensible hierarchical sampling process to the diffusion framework.
A coarse-to-fine hierarchy is imposed via a simple downsampling scheme. HDD achieves
competitive accuracy on the ERA5 reanalysis dataset and on CMIP5 models while significantly
reducing computational load, running on up to half as many pixels.
Additionally, a single model trained at 0.25° resolution transfers seamlessly across
multiple CMIP5 models with much coarser resolution. HDD thus offers a lightweight
alternative for probabilistic climate downscaling, facilitating affordable large-ensemble,
high-resolution climate projections with a single model that can be applied across GCMs of
varying input sizes. A full code implementation is available at: https://github.com/HDD/HDDHierarchical-Diffusion-Downscaling
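The coarse-to-fine hierarchy described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a simple average-pooling downsampler and a factor-of-2 pyramid (the function names `downsample` and `build_hierarchy` are hypothetical):

```python
import numpy as np

def downsample(field, factor=2):
    """Coarsen a 2D field by average-pooling over factor x factor blocks."""
    h, w = field.shape
    return field.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def build_hierarchy(field, levels=3, factor=2):
    """Return a list [fine, ..., coarse]; each level halves the resolution.

    A hierarchical sampler would denoise the coarse levels first, then
    condition the finer levels on them, touching far fewer pixels overall.
    """
    pyramid = [field]
    for _ in range(levels - 1):
        pyramid.append(downsample(pyramid[-1], factor))
    return pyramid

# Toy 8x8 "high-resolution" field standing in for a 0.25° grid tile.
field = np.arange(64.0).reshape(8, 8)
pyramid = build_hierarchy(field, levels=3)
print([p.shape for p in pyramid])  # [(8, 8), (4, 4), (2, 2)]
```

Because average pooling over equal-sized blocks preserves the domain mean, each coarse level remains a consistent low-resolution view of the same field, which is what lets a coarse-to-fine sampler hand information down the pyramid.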
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Russell_Tsuchida1
Submission Number: 7478