In the realm of reduced order modeling, the Proper Orthogonal Decomposition (POD) has established itself as a widely adopted technique for efficiently handling parametric partial differential equations. It exploits linear algebra to extract, from a collection of high-fidelity numerical solutions, an optimal reduced space that linearly represents the input data. This paper introduces an alternative that replicates the capabilities of POD with neural networks, thereby overcoming the constraint of working exclusively with solutions confined to the same topological space.
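As a brief sketch of that linear-algebraic step (the notation here is introduced only for illustration), POD arranges the high-fidelity solutions $u(\mu_j)$ as columns of a snapshot matrix and extracts the reduced basis from its singular value decomposition:

\[
S = \bigl[\, u(\mu_1) \;\; u(\mu_2) \;\; \cdots \;\; u(\mu_M) \,\bigr] = U \Sigma V^{\top},
\qquad
u(\mu_j) \;\approx\; \sum_{i=1}^{N} \varphi_i \, c_i(\mu_j),
\]

where the modes $\varphi_i$ are the first $N$ columns of $U$ and the coefficients $c_i(\mu_j)$ are the corresponding entries of $\Sigma V^{\top}$. This construction requires every snapshot to share the same discretization, which is precisely the constraint relaxed in this work.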
Our method centers on the DeepONet architecture, applied with minimal modifications to emulate the POD spatial-temporal (or parametric) decomposition. This adaptation yields a continuous representation of the spatial modes. Although an accuracy gap between neural networks and linear-algebraic tools remains evident, the architecture offers a distinct advantage: unlike the conventional POD approach, it can accept solutions generated with different discretization schemes.
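To make the idea concrete, a minimal sketch of such a branch-trunk network is given below (in PyTorch; layer widths and names are illustrative placeholders, not the actual PODNet implementation). The trunk network maps a spatial coordinate to $N$ mode values, playing the role of continuous spatial modes, while the branch network maps the parameter (or time) to the corresponding coefficients.

```python
import torch
import torch.nn as nn

class BranchTrunkSketch(nn.Module):
    """Illustrative DeepONet-style network: the trunk maps a spatial
    coordinate to N mode values (a continuous analogue of POD modes),
    the branch maps the parameter/time to N coefficients, and the
    output field is their inner product."""

    def __init__(self, n_modes=5, coord_dim=1, param_dim=1, width=64):
        super().__init__()
        self.trunk = nn.Sequential(            # phi_i(x): continuous spatial modes
            nn.Linear(coord_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, n_modes),
        )
        self.branch = nn.Sequential(           # c_i(mu): parametric coefficients
            nn.Linear(param_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, n_modes),
        )

    def forward(self, x, mu):
        # x:  (n_points, coord_dim) spatial coordinates of any discretization
        # mu: (n_samples, param_dim) parameter or time values
        modes = self.trunk(x)                  # (n_points, n_modes)
        coeffs = self.branch(mu)               # (n_samples, n_modes)
        return coeffs @ modes.T                # (n_samples, n_points) reconstructed fields


# Because the trunk is evaluated pointwise, snapshots coming from
# different meshes can be fitted together: each snapshot simply
# supplies its own coordinates x and field values.
model = BranchTrunkSketch(n_modes=5)
x = torch.rand(200, 1)                         # an arbitrary spatial discretization
mu = torch.rand(10, 1)                         # 10 parameter samples
u_pred = model(x, mu)                          # shape (10, 200)
```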
Furthermore, our approach accommodates the various enhancements and variants developed to augment the capabilities of POD: these can be seamlessly integrated into the architecture, yielding a versatile and adaptable framework we call PODNet.
To validate its effectiveness, we apply it to two distinct test cases: a simple one-dimensional trigonometric problem and a more complex two-dimensional Graetz problem. In doing so, we conduct a comprehensive comparison between the proposed methodology and established approaches, shedding light on the potential advantages and trade-offs inherent to this fusion of neural networks and traditional reduced order modeling techniques.