Learning incomplete factorization preconditioners for GMRES

Published: 06 Nov 2024, Last Modified: 06 Jan 2025 · NLDL 2025 Oral · CC BY 4.0
Keywords: graph neural networks, preconditioner, data-driven optimization
Abstract: Incomplete LU factorizations of sparse matrices are widely used as preconditioners in Krylov subspace methods to speed up the solution of linear systems. Unfortunately, computing the preconditioner itself can be time-consuming and sensitive to hyper-parameters. Instead, we replace the hand-engineered algorithm with a graph neural network that is trained to approximate the matrix factorization directly. To apply the output of the neural network as a preconditioner, we propose an output activation function that guarantees that the predicted factorization is invertible. Further, the graph neural network architecture allows us to ensure that the output itself is sparse, which is desirable from a computational standpoint. We theoretically analyze and empirically evaluate different loss functions for training the learned preconditioners, and show their effectiveness in decreasing the number of GMRES iterations and improving the spectral properties of the preconditioned system on synthetic data. The code is available at https://github.com/paulhausner/neural-incomplete-factorization.
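For context, the classical pipeline that the paper's learned preconditioner replaces can be sketched with SciPy: compute an incomplete LU factorization and pass it to GMRES as a preconditioner. This is a generic illustration, not code from the paper; the test matrix (a 1-D Poisson system) and the `drop_tol`/`fill_factor` values are illustrative choices, though they are exactly the kind of hyper-parameters the abstract notes the hand-engineered approach is sensitive to.

```python
# Baseline: ILU-preconditioned GMRES with SciPy (illustrative, not from the paper).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 100
# Sparse tridiagonal 1-D Poisson matrix in CSC format (spilu expects CSC).
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorization; drop_tol and fill_factor are the
# hyper-parameters that control sparsity vs. accuracy of the factors.
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)

# Wrap the approximate solve x -> (LU)^{-1} x as a linear operator for GMRES.
M = spla.LinearOperator((n, n), matvec=ilu.solve)

x, info = spla.gmres(A, b, M=M)
# info == 0 signals convergence within the default tolerance.
```

The learned approach in the paper keeps this GMRES loop but replaces the `spilu` call with a graph neural network that predicts sparse triangular factors directly.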
Git: https://github.com/paulhausner/neural-incomplete-factorization
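The abstract mentions an output activation that guarantees the predicted factorization is invertible. One standard way to obtain such a guarantee, sketched below with NumPy, is to force the diagonal of a predicted triangular factor to be strictly positive (here via softplus), since a triangular matrix is invertible iff all its diagonal entries are nonzero. This is a hedged illustration of the general principle, not the paper's exact activation function; `make_invertible_lower` is a hypothetical helper name.

```python
import numpy as np

def make_invertible_lower(raw, eps=1e-6):
    """Map raw network outputs to a provably invertible lower-triangular factor.

    Off-diagonal entries pass through unchanged; diagonal entries go
    through softplus + eps, which is strictly positive. (Illustrative
    sketch, not the activation proposed in the paper.)
    """
    L = np.tril(raw)                       # enforce lower-triangular sparsity
    d = np.diag(raw)
    pos_diag = np.log1p(np.exp(d)) + eps   # softplus(d) + eps > 0
    np.fill_diagonal(L, pos_diag)
    return L

raw = np.random.randn(5, 5)                # stand-in for GNN output
L = make_invertible_lower(raw)
# det of a triangular matrix is the product of its diagonal, hence nonzero.
```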
Submission Number: 30