Keywords: Fourier Neural Operator, Operator Learning, Partial Differential Equations
TL;DR: We introduce a higher-order spectral mixer that adds explicit $n$-linear mode interactions to FNO, boosting nonlinear PDE accuracy at negligible additional cost.
Abstract: Neural operators provide resolution-equivariant deep learning models for learning mappings between function spaces. Among them, the Fourier Neural Operator (FNO) is particularly effective: its spectral convolution combines a low-dimensional Fourier representation with strong empirical performance, enabling generalization across resolutions. While this design aligns with the structure of linear PDEs, where Fourier modes evolve independently, nonlinear PDEs exhibit structured interactions between modes governed by polynomial nonlinearities. To capture this inductive bias, we introduce the **Higher-Order Spectral Convolution**, a spectral mixer that extends FNO from diagonal modulation to explicit $n$-linear mode mixing aligned with nonlinear PDE dynamics. Across benchmarks, including the Burgers and Navier-Stokes equations, our method consistently improves accuracy in nonlinear regimes, achieving lower error while retaining the efficiency of FFT-based architectures.
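To make the core idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of a 1D spectral layer that keeps FNO's diagonal per-mode weights and adds one explicit second-order (bilinear) mode-interaction term over the retained Fourier modes; the class name, tensor shapes, and weight parameterization are illustrative assumptions.

```python
import torch
import torch.nn as nn


class HigherOrderSpectralConv1d(nn.Module):
    """Sketch: FNO-style spectral convolution plus an explicit bilinear
    (second-order) mode-mixing term. Names and shapes are illustrative,
    not the paper's implementation."""

    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (channels * channels)
        # Diagonal (per-mode) linear weights, as in standard FNO.
        self.w1 = nn.Parameter(scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))
        # Assumed bilinear weights coupling pairs of retained modes (p, q) -> k, per channel.
        self.w2 = nn.Parameter(scale * torch.randn(channels, modes, modes, modes, dtype=torch.cfloat))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x, dim=-1)          # transform to Fourier space
        m = x_ft[..., : self.modes]                # keep low-frequency modes
        # First-order term: diagonal-in-k spectral convolution (standard FNO).
        out = torch.einsum("bik,iok->bok", m, self.w1)
        # Second-order term: explicit pairwise interaction of retained modes.
        out = out + torch.einsum("bcp,bcq,cpqk->bck", m, m, self.w2)
        out_ft = torch.zeros_like(x_ft)
        out_ft[..., : self.modes] = out
        return torch.fft.irfft(out_ft, n=x.size(-1), dim=-1)
```

In this sketch the extra cost stays modest because the bilinear term acts only on the truncated set of modes; higher-order ($n > 2$) interactions would follow the same pattern with additional einsum factors.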
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 19617