SepONet: Efficient Large-Scale Physics-Informed Operator Learning

Published: 30 Sept 2024 · Last Modified: 30 Oct 2024 · D3S3 2024 Poster · License: CC BY 4.0
Keywords: Partial differential equations, separation of variables, scientific machine learning, operator learning, deep operator network (DeepONet), physics-informed neural network
TL;DR: The separable operator network (SepONet) is a new framework for extreme-scale physics-informed operator learning that drastically reduces training time and memory compared to conventional physics-informed DeepONet methods while achieving similar accuracy.
Abstract: We introduce Separable Operator Networks (SepONet), a novel framework that significantly enhances the efficiency of physics-informed operator learning. SepONet uses independent trunk networks to learn basis functions separately for different coordinate axes, enabling faster and more memory-efficient training via forward-mode automatic differentiation. We provide a universal approximation theorem for SepONet, proving that it generalizes to arbitrary operator learning problems, and then validate its performance through comprehensive benchmarking against physics-informed DeepONet. Our results demonstrate SepONet's superior performance across various nonlinear and inseparable PDEs, with its advantages increasing with problem complexity, dimension, and scale. Open source code is available at https://github.com/HewlettPackard/separable-operator-networks.
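
To make the separable-trunk idea in the abstract concrete, below is a minimal JAX sketch for a 1D-space-plus-time problem. It is an illustrative assumption, not the repository's API: the names (mlp, init_mlp, seponet_grid), layer sizes, and number of basis functions r are made up, and the physics-informed loss is omitted.

# Minimal sketch (illustrative, not the authors' implementation) of a separable
# trunk: one trunk network per coordinate axis, combined with the branch output
# by an outer-product (separation-of-variables) contraction over the grid.
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Simple fully connected network; returns r basis coefficients.
    for w, b in params[:-1]:
        x = jnp.tanh(w @ x + b)
    w, b = params[-1]
    return w @ x + b

def init_mlp(key, sizes):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (o, i)) * jnp.sqrt(1.0 / i), jnp.zeros(o))
            for k, i, o in zip(keys, sizes[:-1], sizes[1:])]

def seponet_grid(branch_params, trunk_x_params, trunk_t_params, u, xs, ts):
    # Predict G(u) on the full xs x ts grid. Each axis has its own trunk, so
    # only len(xs) + len(ts) trunk evaluations are needed, not len(xs)*len(ts).
    b = mlp(branch_params, u)                                  # (r,)
    tx = jax.vmap(lambda x: mlp(trunk_x_params, x[None]))(xs)  # (Nx, r)
    tt = jax.vmap(lambda t: mlp(trunk_t_params, t[None]))(ts)  # (Nt, r)
    return jnp.einsum('r,ir,jr->ij', b, tx, tt)                # (Nx, Nt)

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
r = 32                                   # number of basis functions (assumed)
branch = init_mlp(k1, [64, 64, r])       # u sampled at 64 sensors (assumed)
trunk_x = init_mlp(k2, [1, 64, r])
trunk_t = init_mlp(k3, [1, 64, r])

u = jax.random.normal(key, (64,))
xs = jnp.linspace(0.0, 1.0, 128)
ts = jnp.linspace(0.0, 1.0, 128)
pred = seponet_grid(branch, trunk_x, trunk_t, u, xs, ts)  # (128, 128)

# Because the x-dependence lives entirely in the 1D trunk_x network, spatial
# derivatives for a PDE residual come from cheap forward-mode AD (jvp):
tx_val, dtx_dx = jax.vmap(
    lambda x: jax.jvp(lambda z: mlp(trunk_x, z[None]), (x,), (jnp.ones_like(x),)))(xs)

The point of the sketch is the cost structure: predictions on an Nx x Nt grid require only Nx + Nt trunk evaluations, and each per-axis derivative reduces to a forward-mode jvp through a network with a single scalar input, which is where the claimed training speed and memory savings come from.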
Submission Number: 15