Pre-training, fine-tuning, and distillation (PFD): Automatically generating machine learning force fields from universal models

Published: 25 Mar 2026, Last Modified: 22 Apr 2026
Venue: AI4X-AC 2026 Poster
License: CC BY 4.0
Submission Type: I want my submission to be considered for poster only
Keywords: Machine learning force field, fine-tuning, distillation
TL;DR: An automatic workflow that turns universal force fields into fast, material-specific models with first-principles accuracy using minimal DFT calculations.
Confirmation Of Submission Requirements: I submit a previously published paper. It was published in an archival peer-reviewed venue on or after September 1st, 2025; I specify the DOI in the field below, and I submit the camera-ready version of the paper.
DOI: https://doi.org/10.1103/sbz6-btz8
Submission Number: 356