Physics-informed learning under mixing: How physical knowledge speeds up learning

Published: 26 Jan 2026, Last Modified: 11 Feb 2026 · ICLR 2026 Poster · CC BY 4.0
Keywords: learning with dependent data, physics-informed machine learning, convergence rates, complexity-dependent bounds
TL;DR: We prove that incorporating correct prior domain knowledge into nonparametric learning with dependent data speeds up the learning rate.
Abstract: A major challenge in physics-informed machine learning is to understand how the incorporation of prior domain knowledge affects learning rates when data are dependent. Focusing on empirical risk minimization with physics-informed regularization, we derive complexity-dependent bounds on the excess risk in probability and in expectation. We prove that, when the physical prior information is aligned, the learning rate improves from the (slow) Sobolev minimax rate to the (fast) optimal i.i.d. one without any sample-size deflation due to data dependence.
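To make the objective described in the abstract concrete, below is a minimal, purely illustrative sketch of empirical risk minimization with a physics-informed regularizer on a toy 1-D problem with autoregressive (dependent) inputs. The model class, the ODE prior u'(x) = cos(x), the collocation grid, and the penalty weight `lam` are assumptions made for illustration only; they are not the paper's estimator or setting.

```python
# Toy sketch (not the paper's method): regularized ERM of the form
#   (1/n) * sum_i (u_theta(x_i) - y_i)^2  +  lam * physics_residual(u_theta),
# where the "physics" prior is the ODE u'(x) = cos(x) (true for u(x) = sin(x)).
import numpy as np

rng = np.random.default_rng(0)

# AR(1) inputs to mimic learning with dependent (mixing) data.
n = 200
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + 0.2 * rng.normal()
y = np.sin(x) + 0.1 * rng.normal(size=n)  # noisy observations of u(x) = sin(x)

# Linear-in-parameters model u_theta(x) = sum_k theta_k * sin((k+1) x).
K = 10
def features(x):
    return np.column_stack([np.sin((k + 1) * x) for k in range(K)])

def features_deriv(x):
    return np.column_stack([(k + 1) * np.cos((k + 1) * x) for k in range(K)])

Phi = features(x)

# Physics residual on collocation points: r(x) = u_theta'(x) - cos(x).
x_col = np.linspace(-3.0, 3.0, 100)
Dcol = features_deriv(x_col)

lam = 1.0  # strength of the physics-informed regularizer (illustrative choice)

# Objective: (1/n)||Phi theta - y||^2 + lam*(1/m)||Dcol theta - cos(x_col)||^2.
# Since the model is linear in theta, the minimizer solves a ridge-like system.
m = len(x_col)
A = Phi.T @ Phi / n + lam * Dcol.T @ Dcol / m
b = Phi.T @ y / n + lam * Dcol.T @ np.cos(x_col) / m
theta = np.linalg.solve(A, b)

print("empirical risk of regularized fit:", np.mean((Phi @ theta - y) ** 2))
```

Because the toy model is linear in its parameters, the penalized ERM problem collapses to a single linear solve; the paper's analysis instead concerns nonparametric classes and shows how such a penalty, when the prior is correct, changes the excess-risk rate under mixing.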
Supplementary Material: zip
Primary Area: learning theory
Submission Number: 11838