Abstract: Theory-guided neural networks have recently been used to solve partial differential equations. This approach has attracted widespread attention because of its low data requirements and its adherence to physical laws during training. However, the choice of the penalty coefficient used to incorporate physical laws as a penalty term in the loss function inevitably affects model performance. In this paper, we propose a comprehensive theory-guided framework based on a bilevel programming model that adaptively adjusts the hyperparameters of the loss function to further improve model performance. An enhanced water flow optimizer (EWFO) algorithm is applied to optimize the upper-level variables of the framework. In this algorithm, an opposition-based learning technique is used in the initialization phase to improve the quality of the initial population, and a nonlinear convergence factor is added to the laminar flow operator to increase population diversity and widen the search range. Experiments show the competitive performance of the method in solving stochastic partial differential equations.
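The opposition-based learning (OBL) initialization mentioned above can be sketched as follows. This is a generic OBL sketch, not the paper's exact EWFO formulation: for each random candidate x in [lower, upper], the opposite point lower + upper - x is also evaluated, and the best individuals from the union of both populations are kept. The function names and the sphere-function fitness are illustrative assumptions.

```python
import numpy as np

def obl_initialize(pop_size, dim, lower, upper, fitness, seed=None):
    """Opposition-based learning initialization (generic sketch).

    Generates a random population, forms the opposite population
    x_opp = lower + upper - x, and keeps the best `pop_size`
    individuals (minimization) from the union of both.
    """
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    opp = lower + upper - pop                     # opposite points
    union = np.vstack([pop, opp])
    scores = np.array([fitness(x) for x in union])
    best = np.argsort(scores)[:pop_size]          # lowest fitness wins
    return union[best]

# Toy usage: initialize 10 individuals in 3 dimensions for a sphere fitness.
best_pop = obl_initialize(10, 3, -5.0, 5.0, lambda x: float(np.sum(x**2)), seed=0)
```

In practice, starting from the better half of the candidate/opposite pairs tends to raise the quality of the initial population at negligible extra cost, since it only doubles the number of fitness evaluations in the first iteration.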