GIT-Net: Generalized Integral Transform for Operator Learning

Published: 05 Dec 2023, Last Modified: 05 Dec 2023. Accepted by TMLR.
Abstract: This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators, inspired by integral transform operators. GIT-Net harnesses the fact that differential operators commonly used to define PDEs can often be represented parsimoniously in specialized functional bases (e.g., the Fourier basis). Unlike rigid integral transforms, GIT-Net parametrizes adaptive generalized integral transforms with deep neural networks. Compared to several recently proposed alternatives, GIT-Net's computational and memory requirements scale gracefully with mesh refinement, facilitating its application to PDE problems on complex geometries. Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting both small test errors and low evaluation costs across a range of PDE problems. This stands in contrast to existing neural network operators, which typically excel in only one of these areas.
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: The changes are the following: 1. Added Figure 1 and Figure 2 to show the architecture of GIT-Net and the dimensions of the tensors. 2. Added simulations in Appendix A comparing the with-grid and without-grid variants of FNO. 3. Added simulations in Appendix B comparing the GIT-Net block $\sigma(T \alpha + \mathcal{K}\alpha)$ with its variant $T\alpha + \sigma(\mathcal{K}\alpha)$. 4. Added histograms in Section 5.2 to visually show the test error distributions of all neural network operators. 5. Added text and a reference explaining the SVD. 6. Improved the writing.
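The two block forms compared in Appendix B, $\sigma(T\alpha + \mathcal{K}\alpha)$ and $T\alpha + \sigma(\mathcal{K}\alpha)$, can be sketched as follows. This is a minimal illustrative NumPy sketch, not the paper's implementation: the generalized integral transform $\mathcal{K}$ is approximated here by a pair of dense matrices `P` and `Q` (an assumed parametrization chosen only to make the two compositions concrete).

```python
import numpy as np

def git_block(alpha, T, P, Q, sigma=np.tanh):
    """Sketch of the GIT-Net block sigma(T*alpha + K*alpha).

    alpha : (n, c) array of c channel values on n mesh points.
    T     : (c, c) pointwise linear map applied to the channels.
    P, Q  : stand-in matrices for the learned generalized integral
            transform, K(alpha) = Q @ (P @ alpha)  (an assumption
            for illustration, not the paper's exact parametrization).
    """
    K_alpha = Q @ (P @ alpha)          # transform, act, transform back
    return sigma(alpha @ T.T + K_alpha)

def git_block_variant(alpha, T, P, Q, sigma=np.tanh):
    """The variant compared in Appendix B: T*alpha + sigma(K*alpha)."""
    return alpha @ T.T + sigma(Q @ (P @ alpha))
```

In the first form the nonlinearity wraps both the pointwise map and the transform; in the variant the pointwise branch passes through untouched, giving a residual-style skip connection.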
Assigned Action Editor: ~Joan_Bruna1
Submission Number: 727