Chebyshev-Augmented One-Shot Transfer Learning for PINNs on Nonlinear Differential Equations

Published: 01 Mar 2026, Last Modified: 05 Mar 2026 · AI&PDE Oral · CC BY 4.0
Keywords: Physics-Informed Neural Networks (PINNs), Nonlinear differential equations, Perturbation Theory, Transfer learning, Chebyshev Approximation.
TL;DR: We propose a perturbative PINN framework that extends one-shot transfer learning to nonlinear ODEs/PDEs using Chebyshev approximation, enabling closed-form output-layer updates for rapid and accurate many-query adaptation without retraining.
Abstract: Physics-Informed Neural Networks (PINNs) offer a flexible paradigm for solving differential equations by embedding governing laws into the training objective. A persistent limitation is instance specificity: standard PINNs typically require retraining for each new forcing term, boundary/initial condition, or parameter setting. One-shot transfer learning (OTL) addresses this bottleneck for linear operators by freezing a pretrained latent representation and computing optimal output weights in closed form, but for nonlinear problems closed-form adaptation is generally unavailable because the loss is nonconvex in the output layer. In this paper we substantially broaden the class of nonlinearities amenable to one-shot PINN transfer by combining OTL with Chebyshev polynomial surrogates. We approximate general smooth nonlinear terms by truncated Chebyshev expansions over a prescribed solution range, yielding a polynomial nonlinearity that can be handled by a perturbative decomposition into linear subproblems. A multi-head PINN learns a reusable latent space associated with the dominant linear operator; at test time, solutions to new instances are obtained via a sequence of closed-form linear solves in the output layer, without retraining the network body. We provide a unified derivation of the framework for ODEs and PDEs and demonstrate accuracy and fast online adaptation on nonlinear benchmarks, including non-polynomial and singular ODE nonlinearities as well as a reaction–diffusion PDE with saturating kinetics, highlighting the method's utility in many-query regimes.
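The truncated Chebyshev expansion over a prescribed solution range can be sketched as follows. This is a minimal illustration using NumPy's `chebyshev` module, not the authors' implementation; the saturating nonlinearity f(u) = u/(1+u), the range [0, 2], and the truncation degree are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Illustrative smooth nonlinearity (saturating kinetics) and prescribed range.
f = lambda u: u / (1.0 + u)
a, b = 0.0, 2.0
deg = 12  # truncation degree of the Chebyshev surrogate

# Chebyshev nodes on [-1, 1], mapped to the prescribed range [a, b].
nodes = np.cos(np.pi * (np.arange(200) + 0.5) / 200)
u = 0.5 * (b - a) * (nodes + 1.0) + a

# Fit the truncated expansion in the mapped variable t = 2(u - a)/(b - a) - 1.
coeffs = C.chebfit(nodes, f(u), deg)

# Evaluate the polynomial surrogate on a dense grid and measure uniform error.
ug = np.linspace(a, b, 1001)
tg = 2.0 * (ug - a) / (b - a) - 1.0
err = np.max(np.abs(C.chebval(tg, coeffs) - f(ug)))
# err is small because f is smooth on [a, b]; the resulting polynomial
# nonlinearity is what the perturbative decomposition then acts on.
```

The surrogate replaces the non-polynomial term with a degree-12 polynomial in u, which is the form required for the perturbative splitting into linear subproblems described above.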
Journal Opt In: Yes, I want to participate in the IOP focus collection submission
Journal Notes: Planned extensions include additional benchmarks (more realistic and challenging PDEs), stronger comparative baselines, and systematic ablations over key hyperparameters.
Submission Number: 148