Reciprocal Space Attention for Learning Long-Range Interactions

Published: 29 Oct 2025 · Last Modified: 29 Oct 2025 · AI4Mat-NeurIPS-2025 Poster · CC BY 4.0
Keywords: reciprocal-space attention, long-range interactions, machine learning potentials, linear attention, graph neural networks
Abstract: Machine learning interatomic potentials (MLIPs) have revolutionized the modeling of materials and molecules by directly fitting to ab-initio data. However, while these models excel at capturing local and semi-local interactions, they often prove insufficient when an explicit and efficient treatment of long-range interactions is required. To address this limitation, we introduce Reciprocal-Space Attention (RSA), designed to capture long-range interactions in the Fourier domain. RSA can be seamlessly integrated with any existing local or semi-local MLIP framework. Our key contribution is mapping the linear-scaling attention mechanism into Fourier space. This technique allows us to effectively capture long-range interactions, such as electrostatics and dispersion, without requiring predefined charges or other explicit empirical assumptions. We demonstrate the effectiveness of our method through a diverse set of benchmarks, including dimer binding curves, dispersion interactions in layered phosphorene exfoliation, and molecular dynamics simulations of water. Our results show that RSA successfully captures long-range interactions across these diverse chemical environments.
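The abstract's key idea, evaluating a linear-scaling attention mechanism in reciprocal (Fourier) space, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name, the use of raw per-atom features as attention values, and the uniform average over k-vectors are all illustrative assumptions. The point is only that aggregating plane-wave sums over a fixed set of M reciprocal-space vectors gives every atom a global receptive field at O(N·M) cost, i.e. linear in the number of atoms N.

```python
import numpy as np

def reciprocal_space_linear_attention(pos, feats, k_vecs):
    """Hypothetical sketch of linear attention evaluated in reciprocal space.

    pos:    (N, 3) atomic positions
    feats:  (N, d) per-atom features, used here as the attention "values"
    k_vecs: (M, 3) reciprocal-space wave vectors
    Returns (N, d) long-range messages, one per atom.
    """
    # Plane-wave phases e^{i k . r_j} for every (k, atom) pair -> (M, N)
    phase = np.exp(1j * (k_vecs @ pos.T))
    # Structure-factor-like sums S(k) = sum_j e^{i k . r_j} v_j -> (M, d)
    # This single pass over atoms is what makes the cost linear in N.
    S = phase @ feats.astype(complex)
    # Read back per-atom messages m_i = Re[ sum_k e^{-i k . r_i} S(k) ] -> (N, d)
    msg = np.real(phase.conj().T @ S) / len(k_vecs)
    return msg
```

Because each atom's message depends on global sums S(k) rather than on a neighbor list, interactions such as electrostatics and dispersion can be represented without a distance cutoff, while the cost never becomes quadratic in N.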
Submission Track: Paper Track (Full Paper)
Submission Category: AI-Guided Design
Institution Location: Pittsburgh, USA; Lemont, USA; Buffalo, USA; San Diego, USA
AI4Mat RLSF: Yes
Submission Number: 94