The conjugate kernel for efficient training of physics-informed deep operator networks

Published: 03 Mar 2024 (last modified: 04 May 2024) · AI4DiffEqtnsInSci @ ICLR 2024 (poster) · CC BY 4.0
Keywords: physics-informed machine learning, operator learning, neural tangent kernel
TL;DR: The conjugate kernel offers accuracy similar to the neural tangent kernel when training physics-informed DeepONets, at significantly lower cost.
Abstract: Recent work has shown that the empirical Neural Tangent Kernel (NTK) can significantly improve the training of physics-informed Deep Operator Networks (DeepONets). The NTK, however, is costly to compute, greatly increasing the cost of training such systems. In this paper, we study the empirical Conjugate Kernel (CK) for physics-informed DeepONets, an efficient approximation to the NTK that has been observed to yield similar results. We show that for physics-informed DeepONets the CK's performance is comparable to the NTK's, while substantially reducing the time complexity of NTK-based training.
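The cost gap between the two kernels can be sketched as follows: the empirical NTK is the Gram matrix of per-example gradients over *all* network parameters, while the CK is the Gram matrix of last-hidden-layer features only. The toy MLP below is a hypothetical illustration in JAX (the architecture, names, and sizes are assumptions for exposition, not the paper's DeepONet setup):

```python
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

# Toy two-layer MLP: phi(x) = hidden features, f(x) = w2 . phi(x).
# (Hypothetical example network, not the paper's DeepONet.)
def init_params(key, d_in=2, d_h=8):
    k1, k2 = jax.random.split(key)
    return {"W1": jax.random.normal(k1, (d_in, d_h)) / jnp.sqrt(d_in),
            "w2": jax.random.normal(k2, (d_h,)) / jnp.sqrt(d_h)}

def features(params, x):
    # Last-hidden-layer feature map phi(x)
    return jnp.tanh(x @ params["W1"])

def f(params, x):
    # Scalar network output
    return features(params, x) @ params["w2"]

def empirical_ntk(params, X):
    # Row i of J is the gradient of f(x_i) w.r.t. ALL parameters,
    # so the NTK costs a full backward pass per example: K = J J^T.
    grad_flat = lambda x: ravel_pytree(jax.grad(f)(params, x))[0]
    J = jax.vmap(grad_flat)(X)
    return J @ J.T

def empirical_ck(params, X):
    # The CK only needs the forward-pass features: K = Phi Phi^T.
    Phi = jax.vmap(lambda x: features(params, x))(X)
    return Phi @ Phi.T

key = jax.random.PRNGKey(0)
params = init_params(key)
X = jax.random.normal(key, (4, 2))
K_ntk = empirical_ntk(params, X)
K_ck = empirical_ck(params, X)
```

For this architecture the gradient with respect to the output weights `w2` is exactly `phi(x)`, so the CK appears as one additive term inside the NTK; dropping the remaining (expensive) Jacobian terms is what makes the CK cheap.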
Submission Number: 19