Abstract: Operator eigenvalue problems play a critical role in various scientific fields and engineering applications, yet numerical methods are hindered by the curse of dimensionality. Recent deep learning methods provide an efficient approach to address this challenge by iteratively updating neural networks.
The performance of these methods relies heavily on the spectral distribution of the given operator: larger gaps between the operator's eigenvalues improve precision, so tailored spectral transformations that exploit the spectral distribution can enhance their performance. Based on this observation, we propose the **S**pectral **T**ransformation **Net**work (**STNet**).
During each iteration, STNet uses approximate eigenvalues and eigenfunctions to perform spectral transformations on the original operator, turning it into an equivalent but easier problem.
Specifically, we employ deflation projection to exclude the subspace spanned by already-solved eigenfunctions, thereby reducing the search space and preventing convergence to previously found eigenfunctions.
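To make the deflation step concrete, here is a minimal sketch in NumPy. It assumes the standard deflation projector $P = I - VV^T$, where the columns of `V` are orthonormal, already-solved eigenvectors; the function name and the toy operator are illustrative, not from the paper.

```python
import numpy as np

def deflate(A, V):
    """Deflate operator A against the orthonormal columns of V.

    Builds the projector P = I - V V^T and returns P A P, which
    annihilates the span of the solved eigenvectors so later
    iterations cannot re-converge to them.
    """
    P = np.eye(A.shape[0]) - V @ V.T
    return P @ A @ P

# Toy example: a diagonal operator whose first eigenvector is known.
A = np.diag([1.0, 2.0, 3.0])
V = np.array([[1.0], [0.0], [0.0]])  # solved eigenvector (unit norm)
A_def = deflate(A, V)

# The deflated operator sends the solved direction to zero,
# while the remaining eigenvalues (2 and 3) are untouched.
print(np.allclose(A_def @ V, 0.0))  # True
```

The solved eigenvalue is replaced by 0 in the deflated operator's spectrum, so a subsequent eigen-solve over the deflated operator searches only the orthogonal complement.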
Additionally, our filter transform magnifies eigenvalues in the desired region and suppresses those outside, further improving performance.
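As an illustration of the filtering idea, the sketch below applies a Chebyshev polynomial filter, a classical instance of this technique; the paper's actual filter transform may differ. Eigenvalues outside the damped interval `[a, b]` are magnified relative to those inside, which widens the effective spectral gap around the wanted eigenvalues.

```python
import numpy as np

def chebyshev_filter(A, v, degree, a, b):
    """Apply the degree-`degree` Chebyshev polynomial of A to vector v,
    scaled so that eigenvalues inside [a, b] are damped (|T_k| <= 1)
    and eigenvalues outside [a, b] are amplified (|T_k| grows fast).
    Uses the three-term recurrence T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x).
    """
    e = (b - a) / 2.0  # half-width of the damped interval
    c = (b + a) / 2.0  # center of the damped interval
    y_prev = v
    y = (A @ v - c * v) / e
    for _ in range(2, degree + 1):
        y_next = 2.0 * (A @ y - c * y) / e - y_prev
        y_prev, y = y, y_next
    return y

# Toy example: the wanted eigenvalue 0.1 lies outside the damped
# interval [1, 4], so the filter amplifies its eigenvector component.
A = np.diag([0.1, 2.0, 3.0])
v = np.ones(3) / np.sqrt(3.0)
v_f = chebyshev_filter(A, v, degree=10, a=1.0, b=4.0)
print(abs(v_f[0]) > 100 * abs(v_f[1]))  # True: wanted component dominates
```

After filtering, the iterate is strongly aligned with the eigenvectors of the desired region, so far fewer iterations are needed to resolve them.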
Extensive experiments demonstrate that STNet consistently outperforms existing learning-based methods, achieving state-of-the-art accuracy.
Primary Area: Applications->Chemistry, Physics, and Earth Sciences
Keywords: AI for Science, Operator Eigenvalue Problem, Scientific Computing
Submission Number: 14677