FORK: First-Order Relational Knowledge Distillation for Machine Learning Interatomic Potentials

NeurIPS 2025 Workshop AI4Mat Submission 49 Authors

Published: 20 Sept 2025, Last Modified: 01 Dec 2025 · AI4Mat-NeurIPS-2025 Poster · CC BY 4.0
Keywords: Knowledge Distillation, Machine Learning Interatomic Potentials, Relational Knowledge, Contrastive Learning, Graph Neural Networks
TL;DR: We introduce FORK, a relational contrastive knowledge distillation method that trains small, fast students to mimic the first-order interatomic interactions of larger, slower teachers.
Abstract: State-of-the-art equivariant Graph Neural Networks (GNNs) achieve quantum-level accuracy for molecular simulations but remain computationally prohibitive for large-scale applications. Knowledge distillation (KD) offers a solution by compressing these GNN-based Machine Learning Interatomic Potentials (MLIPs) into efficient models, yet existing distillation methods fail to capture the underlying physics. Current KD approaches rely on simplistic atom-wise feature matching, overlooking the interatomic interactions that define the potential energy surface (PES). We introduce FORK, **F**irst-**O**rder **R**elational **K**nowledge Distillation, a framework that distills relational knowledge from pretrained GNNs by modeling each interatomic interaction as a relational vector. Through a contrastive objective, FORK guides compact student models to preserve the geometric structure of the teacher's learned PES. On the OC20 and SPICE benchmarks, our FORK-trained student outperforms baselines in energy and force prediction, achieving faithful physical knowledge transfer at a fraction of the computational cost. In a practical high-throughput catalyst screening application, the distilled model achieves an 11.9× acceleration while preserving chemical coherency, validating its efficacy for accelerating large-scale materials discovery.
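
The abstract does not spell out FORK's exact objective, so the following is a minimal PyTorch sketch of the stated idea: one relational vector per interatomic interaction, aligned between teacher and student through a contrastive loss. The difference-of-embeddings construction, the InfoNCE-style formulation, the function names (`relational_vectors`, `fork_contrastive_loss`), and the temperature value are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def relational_vectors(node_feats: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
    """Build one relational vector per edge (i, j) from node embeddings.

    Assumption: the relational vector is the difference of the endpoint
    embeddings, a first-order representation of the pairwise interaction
    between atoms i and j. The paper may use a different construction.
    """
    src, dst = edge_index                     # each of shape (num_edges,)
    return node_feats[dst] - node_feats[src]  # (num_edges, dim)

def fork_contrastive_loss(student_rel: torch.Tensor,
                          teacher_rel: torch.Tensor,
                          temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style alignment of student and teacher relational vectors.

    Each student edge vector is pulled toward its own teacher edge vector
    (the positive) and pushed away from all other edges' teacher vectors
    (the negatives), preserving the relational geometry of the teacher.
    """
    s = F.normalize(student_rel, dim=-1)
    t = F.normalize(teacher_rel, dim=-1)
    logits = s @ t.T / temperature            # (E, E) similarity matrix
    targets = torch.arange(s.size(0), device=s.device)  # diagonal = positives
    return F.cross_entropy(logits, targets)
```

In use, `student_rel` and `teacher_rel` would be computed with `relational_vectors` from the respective models' node embeddings over the same edge set (with a projection head if their feature dimensions differ), and this loss would be added to the usual energy and force regression terms.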
Submission Track: Paper Track (Short Paper)
Submission Category: AI-Guided Design
AI4Mat Journal Track: Yes
AI4Mat RLSF: Yes
Submission Number: 49