Abstract: In recent years, the emergence of deep learning has brought Physics-Informed Neural Networks (PINNs) into the spotlight as a promising method for solving partial differential equations (PDEs). Despite the attention PINNs have received, their accuracy and convergence face significant challenges, particularly for complex equations containing multiple components. Owing to the inherent stiffness of many PDEs, distinct components may converge at different rates, producing an unbalanced convergence process; this imbalance during optimization can yield suboptimal solutions. We introduce an importance-guided sequential training method to regulate the competition among different components in PDEs. The importance can be quantitatively defined through correlation analysis, enabling an algorithm that systematically instructs the neural network to learn from individual terms in accordance with their respective importance levels. To evaluate the proposed approach, we applied it to the Burgers equation and the Klein-Gordon equation. The results demonstrate that our approach outperforms the original model, showcasing the effectiveness of the internal weighting method in improving the accuracy and convergence of PINNs. Furthermore, our research serves as a stepping stone for exploring and harnessing the power of correlation analysis in the realm of PINNs.
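The abstract's central idea, quantifying each PDE term's importance via correlation analysis and normalizing the result into training weights, can be sketched as follows. This is a minimal illustration, not the paper's exact definition: the term names, the use of Pearson correlation against the full residual, and the synthetic stand-in data are all assumptions made for demonstration.

```python
import numpy as np

# Hypothetical sketch: score each PDE term's "importance" by its Pearson
# correlation with the full residual, then normalize into weights that
# could guide sequential training. The paper's actual definition may differ.

rng = np.random.default_rng(0)

# Synthetic stand-ins for the Burgers-equation terms u_t, u*u_x, nu*u_xx
# sampled at collocation points (not outputs of a real PINN).
n = 1000
terms = {
    "u_t": rng.normal(0.0, 1.0, n),
    "u_ux": rng.normal(0.0, 0.5, n),
    "nu_uxx": rng.normal(0.0, 0.1, n),
}
residual = sum(terms.values())  # full residual as the sum of all terms


def importance_weights(terms, residual):
    """Absolute Pearson correlation of each term with the residual,
    normalized so the weights sum to one."""
    corr = {k: abs(np.corrcoef(v, residual)[0, 1]) for k, v in terms.items()}
    total = sum(corr.values())
    return {k: c / total for k, c in corr.items()}


weights = importance_weights(terms, residual)
print(weights)  # terms with larger influence on the residual get larger weight
```

In this sketch, a term whose fluctuations dominate the residual (here `u_t`, which has the largest variance) receives the highest weight, matching the intuition that the optimizer should attend to the components that drive the imbalance.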