Deep Operator Learning Lessens the Curse of Dimensionality for PDEs

Published: 27 Sept 2023, Last Modified: 16 Oct 2023
Accepted by TMLR
Abstract: Deep neural networks (DNNs) have achieved remarkable success in numerous domains, and their application to PDE-related problems has been rapidly advancing. This paper provides an estimate for the generalization error of learning Lipschitz operators over Banach spaces using DNNs, with applications to various PDE solution operators. The goal is to specify the DNN width, depth, and number of training samples needed to guarantee a given testing error. Under mild assumptions on the data distribution or operator structure, our analysis shows that deep operator learning can have a relaxed dependence on the discretization resolution of PDEs and hence lessen the curse of dimensionality in many PDE-related problems, including elliptic equations, parabolic equations, and the Burgers equation. Our results also give insights into discretization invariance in operator learning.
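To make the setting concrete, below is a minimal sketch (not the paper's implementation) of deep operator learning in PyTorch, assuming a DeepONet-style branch/trunk architecture; the names `BranchTrunkNet`, `n_sensors`, `width`, and `depth` are hypothetical and chosen only for illustration. The `width`, `depth`, and number of training samples are exactly the quantities the paper's generalization bounds relate to the testing error.

```python
# Hypothetical sketch of learning a PDE solution operator G: u -> G(u)
# from discretized input functions, in the spirit of deep operator learning.
import torch
import torch.nn as nn

def mlp(in_dim, width, depth, out_dim):
    """Fully connected ReLU network with `depth` hidden layers of size `width`."""
    layers = [nn.Linear(in_dim, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, out_dim))
    return nn.Sequential(*layers)

class BranchTrunkNet(nn.Module):
    """Maps a discretized input function u (sampled at n_sensors points)
    and a query location y to an approximation of (G u)(y)."""
    def __init__(self, n_sensors, y_dim=1, width=128, depth=3, p=64):
        super().__init__()
        self.branch = mlp(n_sensors, width, depth, p)  # encodes the input function
        self.trunk = mlp(y_dim, width, depth, p)       # encodes the query point

    def forward(self, u_sensors, y):
        # Pointwise product and sum of branch/trunk features approximate (G u)(y).
        return (self.branch(u_sensors) * self.trunk(y)).sum(dim=-1, keepdim=True)

# One training step on a batch of samples {(u_i, y_i, (G u_i)(y_i))}
# with the empirical L2 loss; real PDE data would replace the placeholders.
model = BranchTrunkNet(n_sensors=100)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
u = torch.randn(32, 100)     # batch of discretized input functions
y = torch.rand(32, 1)        # query locations in the domain
target = torch.randn(32, 1)  # placeholder for the true operator outputs
loss = nn.functional.mse_loss(model(u, y), target)
loss.backward()
optimizer.step()
```

Note that `n_sensors` plays the role of the discretization resolution: the paper's claim is that, under its assumptions, the sample complexity's dependence on this resolution can be relaxed.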
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We have edited the contribution statement to place more emphasis on the tradeoff between accuracy and generalization. We also added a sentence to Remark 3 and to the proof of Theorem 1 to point out explicitly that we use the triangle inequality to avoid relying on an inner-product structure. Notation: for consistency with convention, we have switched to big-$\Omega$ notation.
Assigned Action Editor: ~Yaoliang_Yu1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1207