Selective Sparsity in Fourier Neural Operator Networks to Accelerate Partial Differential Equation Solving

ICLR 2026 Conference Submission 15342 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Partial Differential Equations, Fourier Neural Operator, Diagonal Networks, Sparsity
TL;DR: Using implicitly regularized Diagonal Networks, we enforce a spectral sparsity constraint in Fourier Neural Operators for accelerated PDE learning.
Abstract: Fourier Neural Operators (FNOs) have emerged as a powerful framework for learning solution operators of partial differential equations (PDEs). However, their reliance on dense spectral representations leads to high computational cost and limited interpretability. We propose a Spectrally-Sparsified Fourier Neural Operator (SS-FNO) that achieves state-of-the-art accuracy while substantially reducing spectral complexity. Our approach augments each FNO layer with a lightweight sparse selector: a diagonal gating mechanism whose implicit bias under stochastic gradient descent drives many frequency weights toward zero. This induces automatic pruning of uninformative Fourier modes, yielding a compact operator representation that is both efficient and interpretable. We validate SS-FNO on benchmark PDEs, including the Burgers’ equation, Darcy flow, and the Navier–Stokes equations. Across all cases, SS-FNO matches or exceeds the predictive accuracy of standard FNOs while reducing the number of active frequency modes, thereby lowering both memory footprint and computational cost. By demonstrating that accurate operator learning does not require dense spectral representations, our work highlights spectral sparsity as a principled path toward scalable and interpretable neural operator models.
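To make the abstract's mechanism concrete, here is a minimal PyTorch sketch (not the authors' code) of what a diagonally gated spectral layer could look like for a 1-D problem. The class name GatedSpectralConv1d, the u*v gate parameterization, and all shapes are illustrative assumptions; the key idea is that each retained Fourier mode is scaled by a gate factored as a product of two trainable diagonal vectors, the "diagonal network" form whose gradient-descent dynamics are known to be implicitly biased toward sparse solutions.

import torch
import torch.nn as nn

class GatedSpectralConv1d(nn.Module):
    """Hypothetical SS-FNO-style layer: a standard 1-D Fourier layer
    whose retained modes are scaled by a diagonal gate g = u * v."""

    def __init__(self, channels: int, n_modes: int):
        super().__init__()
        scale = 1.0 / (channels * channels)
        self.n_modes = n_modes
        # Dense complex spectral weights, as in a standard FNO layer.
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, n_modes, dtype=torch.cfloat)
        )
        # Diagonal gating factors; their elementwise product is the
        # per-mode gate. SGD on this factorized (overparameterized)
        # form tends to drive many products u_k * v_k toward zero,
        # which is the implicit pruning the abstract describes.
        self.u = nn.Parameter(torch.ones(n_modes))
        self.v = nn.Parameter(torch.ones(n_modes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, grid)
        x_hat = torch.fft.rfft(x)                    # spectral coefficients
        gate = self.u * self.v                       # per-mode gate, shape (n_modes,)
        out_hat = torch.zeros_like(x_hat)
        # Mix channels on the first n_modes modes, then apply the gate.
        out_hat[..., : self.n_modes] = torch.einsum(
            "bik,iok->bok", x_hat[..., : self.n_modes], self.weight
        ) * gate
        return torch.fft.irfft(out_hat, n=x.size(-1))

In this sketch, gating with the factorized u * v rather than a single per-mode scalar is the point: the factorization changes the optimization geometry so that plain (stochastic) gradient descent prefers sparse gates, and modes whose gate reaches zero can be dropped from the FFT workload at inference time.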
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 15342