FastLSQ: Solving PDEs in One Shot via Fourier Features with Exact Analytical Derivatives

Published: 01 Mar 2026, Last Modified: 05 Mar 2026 · AI&PDE Poster · CC BY 4.0
Keywords: One-shot, Fourier Features, PDE Solver, Exact Derivatives, Inverse Problems
TL;DR: FastLSQ achieves rapid, high-accuracy PDE solutions by combining sinusoidal random features with exact closed-form derivatives, eliminating the computational overhead of automatic differentiation.
Abstract: We present FastLSQ, a framework for PDE solving and inverse problems built on trigonometric random Fourier features with exact analytical derivatives. Trigonometric features admit closed-form derivatives of any order in $O(1)$, enabling graph-free operator assembly without automatic differentiation. Linear PDEs reduce to a single least-squares call; nonlinear PDEs are handled by Newton--Raphson iterations that reuse the analytical assembly. On 17 PDEs (1--6D), FastLSQ reaches $10^{-7}$ error in 0.07\,s (linear) and $10^{-8}$--$10^{-9}$ in under 9\,s (nonlinear), orders of magnitude faster and more accurate than iterative PINNs. Analytical higher-order derivatives yield a differentiable digital twin; we demonstrate inverse problems (heat-source and coil recovery) and PDE discovery. Code: github.com/sulcantonin/FastLSQ; pip install fastlsq.
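The abstract's pipeline (closed-form feature derivatives, graph-free operator assembly, one least-squares call for a linear PDE) can be sketched in a few lines of NumPy. This is a minimal illustration, not the fastlsq implementation: the feature count, frequency scale $\sigma = 10$, and boundary weight are illustrative choices of ours, not the paper's hyperparameters.

```python
# Sketch of the FastLSQ idea on a 1D Poisson problem:
#   -u''(x) = pi^2 sin(pi x) on [0, 1],  u(0) = u(1) = 0,
# whose exact solution is u(x) = sin(pi x).
import numpy as np

rng = np.random.default_rng(0)
N = 200                                  # number of random Fourier features
omega = rng.normal(0.0, 10.0, N)         # random frequencies (scale 10 is illustrative)
bias = rng.uniform(0.0, 2.0 * np.pi, N)  # random phases

def features(x, order=0):
    """n-th derivative of each feature sin(w x + b), in closed form:
    d^n/dx^n sin(w x + b) = w^n sin(w x + b + n*pi/2)  -- no autodiff needed."""
    x = np.atleast_1d(x)[:, None]
    return (omega ** order) * np.sin(omega * x + bias + order * np.pi / 2) / np.sqrt(N)

xs = np.linspace(0.0, 1.0, 400)          # interior collocation points
f = np.pi ** 2 * np.sin(np.pi * xs)      # right-hand side

# Graph-free operator assembly: PDE rows (-phi'') stacked with
# boundary rows (phi at x = 0 and x = 1, weighted so they are not drowned out).
A = np.vstack([-features(xs, order=2), 100.0 * features(np.array([0.0, 1.0]))])
rhs = np.concatenate([f, np.zeros(2)])

# One least-squares call solves the linear PDE.
w, *_ = np.linalg.lstsq(A, rhs, rcond=None)
u = features(xs) @ w
err = np.max(np.abs(u - np.sin(np.pi * xs)))
```

For a nonlinear PDE, the same analytical assembly would be reused inside a Newton--Raphson loop, with the linearized system at each step again handled by a single least-squares solve.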
Journal Opt In: Yes, I want to participate in the IOP focus collection submission
Journal Corresponding Email: sulc.antonin@gmail.com
Journal Notes: While the manuscript was under review, we addressed several key issues and significantly expanded the work beyond the review comments:
- Engineering and inverse applications: we validated the method on real-world tasks, including 24-parameter heat-source localization, magnetostatic coil recovery, and high-noise PDE discovery.
- Production-ready framework: the method is now fully accessible as a pip-installable Python package (fastlsq) and a public GitHub repository, complete with comprehensive tutorials, examples, and reproducibility scripts.
- Automatic differentiable scale: we introduced a differentiable learnable scale for automated frequency tuning, along with matrix caching that enables 362x faster parametric sweeps.
- Robustness and stability: we detailed how $1/\sqrt{N}$ normalization and Tikhonov regularization ensure convergence on stiff problems where alternative methods diverge.
For the extended journal publication, we plan to address the following areas:
- Developing effective preconditioning strategies for high-order PDEs, specifically to handle the matrix-conditioning challenges associated with higher-order derivatives ($n > 4$).
- Extending the framework to vector-valued systems, addressing the straightforward but linear growth in computational complexity.
- Implementing domain-decomposition techniques to better handle complex and irregular geometries.
- Providing a rigorous theoretical analysis of approximation, discretization, and conditioning bounds.
Timeline and constraints: completing these planned extensions is a matter of weeks. There are no known constraints, and we are already actively working on these improvements.
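The notes cite Tikhonov regularization (alongside $1/\sqrt{N}$ normalization) as the stabilizer for stiff problems. A minimal sketch of how Tikhonov damping rescues an ill-conditioned least-squares solve follows; the Vandermonde system, target function, and regularization strength are illustrative stand-ins of ours, not taken from the paper.

```python
# Hedged sketch: Tikhonov-regularized least squares on an ill-conditioned
# Vandermonde system (a stand-in for a stiff high-order PDE operator).
import numpy as np

x = np.linspace(0.0, 1.0, 100)
A = np.vander(x, 40, increasing=True)  # polynomial basis; severely ill-conditioned
b = np.sin(2.0 * np.pi * x)            # smooth target

lam = 1e-10                            # illustrative regularization strength
# Damped normal equations: (A^T A + lam * I) w = A^T b.
# The lam * I term floors the spectrum, so the solve stays stable even
# though A^T A alone is numerically singular.
w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)
rms = np.linalg.norm(A @ w - b) / np.sqrt(len(x))
```

The damping trades a small amount of bias for a bounded solution norm, which is the standard mechanism by which such solvers stay convergent where an unregularized normal-equations solve blows up.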
Submission Number: 6