Merging Memory and Space: A State Space Neural Operator

TMLR Paper5935 Authors

19 Sept 2025 (modified: 01 Oct 2025) · Under review for TMLR · CC BY 4.0
Abstract: We propose the \textit{State Space Neural Operator} (SS-NO), a compact architecture for learning solution operators of time-dependent partial differential equations (PDEs). Our formulation extends structured state space models (SSMs) to joint spatiotemporal modeling, introducing two key mechanisms: \textit{adaptive damping}, which stabilizes learning by localizing receptive fields, and \textit{learnable frequency modulation}, which enables data-driven spectral selection. These components provide a unified framework for capturing long-range dependencies with parameter efficiency. Theoretically, we establish connections between SSMs and neural operators, proving a universality theorem for convolutional architectures with full field-of-view. Empirically, SS-NO achieves strong performance across diverse PDE benchmarks—including 1D Burgers' and Kuramoto–Sivashinsky equations, and 2D Navier–Stokes and compressible Euler flows—while using significantly fewer parameters than competing approaches. Our results demonstrate that state space modeling provides an effective foundation for efficient and accurate neural operator learning.
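The abstract describes two mechanisms: adaptive damping, which localizes the kernel's receptive field, and learnable frequency modulation, which selects spectral content in a data-driven way. A minimal sketch of how a damped, frequency-modulated SSM convolution kernel could look is given below; the function names, the scalar-parameter setup, and the cosine parameterization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def ssm_kernel(length, damping, freq, scale):
    # Damped oscillatory impulse response:
    #   h[t] = scale * exp(-damping * t) * cos(freq * t)
    # A positive `damping` makes the kernel decay, localizing the
    # receptive field; `freq` picks the spectral band the kernel
    # responds to. Both would be learnable in a trained model.
    t = np.arange(length)
    return scale * np.exp(-damping * t) * np.cos(freq * t)

def apply_ssm(u, damping, freq, scale):
    # Causal convolution of the 1D signal u with the SSM kernel,
    # computed via FFT; zero-padding avoids circular wrap-around.
    n = len(u)
    k = ssm_kernel(n, damping, freq, scale)
    U = np.fft.rfft(u, 2 * n)
    K = np.fft.rfft(k, 2 * n)
    return np.fft.irfft(U * K, 2 * n)[:n]

# Toy usage: filter a sinusoid with a damped kernel.
u = np.sin(0.2 * np.arange(128))
y = apply_ssm(u, damping=0.1, freq=0.2, scale=1.0)
print(y.shape)
```

In a full operator, a stack of such kernels (one per channel, with learned `damping` and `freq`) would replace the fixed spectral filters of a Fourier-style neural operator, which is the parameter-efficiency argument the abstract makes.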
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Jean_Kossaifi1
Submission Number: 5935