Wef-GNN: A Generalizable Graph Neural Network for Crystalline Material Property Prediction

ICLR 2026 Conference Submission 21065 Authors

19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Graph Neural Network, Density Functional Theory, Material Science, AI for Science
TL;DR: A GNN architecture and training scheme that enables efficient and generalizable crystalline material property prediction
Abstract: Graph neural networks (GNNs) have shown great promise for predicting properties of crystalline solids. However, existing models struggle to generalize across crystals of varying sizes, and high-fidelity $\textit{ab initio}$ training data are scarce. Here, Wef-GNN addresses the generalizability problem by introducing a multi-head temporal attention mechanism in the graph update function and a crystalline graph representation scheme that is more size-agnostic than the traditional primitive-unit-cell-based graph representation. Further, a single Wef-GNN layer can be recycled for all graph convolution steps without considerable loss in accuracy, yielding deep receptive fields without additional parameters. Wef-GNN outperforms all prior models on a standard band-gap prediction benchmark while using far fewer parameters. To address the scarcity of high-quality $\textit{ab initio}$ training data, a high-fidelity dataset was curated by performing 10,522 high-accuracy Density Functional Theory (DFT) calculations. Wef-GNN was pre-trained on a standard large dataset of lower-accuracy DFT calculations and then fine-tuned on the high-accuracy DFT dataset. The resulting model matches experimental band-gap values far better than other GNNs, and even outperforms the underlying low-accuracy DFT calculations.
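The abstract's key architectural claim is that one attention-based message-passing layer can be reused ("recycled") for every graph-convolution step, so depth costs no extra parameters. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of that idea under assumed names (`RecycledAttentionLayer`, `recycled_forward`) and dimensions, using a generic multi-head attention update over a padded node representation.

```python
# Minimal sketch (not the authors' code): a single message-passing layer with
# multi-head attention over node neighborhoods, reused for every graph-convolution
# step so a deeper receptive field adds no parameters.
import torch
import torch.nn as nn


class RecycledAttentionLayer(nn.Module):
    """Hypothetical weight-tied GNN layer; names and sizes are illustrative."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.update = nn.Sequential(nn.Linear(dim, dim), nn.SiLU(), nn.Linear(dim, dim))
        self.norm = nn.LayerNorm(dim)

    def forward(self, h: torch.Tensor, attn_mask: torch.Tensor) -> torch.Tensor:
        # h: (batch, nodes, dim); attn_mask restricts attention to graph edges.
        msg, _ = self.attn(h, h, h, attn_mask=attn_mask)
        return self.norm(h + self.update(msg))


def recycled_forward(layer: RecycledAttentionLayer, h, attn_mask, steps: int = 6):
    # The same layer instance is applied at every step (weight tying), so the
    # parameter count is independent of the number of convolution steps.
    for _ in range(steps):
        h = layer(h, attn_mask)
    return h


# Toy usage: 2 crystals padded to 8 sites each, 64-dim node features.
h = torch.randn(2, 8, 64)
adj = torch.ones(8, 8, dtype=torch.bool)                 # toy fully connected graph
mask = torch.zeros(8, 8).masked_fill(~adj, float("-inf"))  # -inf blocks non-edges
layer = RecycledAttentionLayer(64)
out = recycled_forward(layer, h, mask)
print(out.shape)  # torch.Size([2, 8, 64])
```

The pre-training/fine-tuning scheme described in the abstract would sit on top of such a model as a standard two-stage procedure: train on the large low-accuracy DFT dataset, then continue training on the 10,522 high-accuracy calculations.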
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 21065