Scaler Transfer: A Simple and Data-efficient Simulation-to-Real Transfer Scheme for Materials

Published: 20 Sept 2025, Last Modified: 05 Nov 2025, AI4Mat-NeurIPS-2025 Poster, CC BY 4.0
Keywords: Transfer learning, DFT, High-throughput experiment, Catalyst, Few-shot learning
TL;DR: Sharing a standard scaler dramatically improves few-shot learning performance.
Abstract: Data scarcity and domain heterogeneity impede simulation-to-real (sim2real) transfer in materials data. We present a simple, data-efficient recipe that couples a domain transformation with two methods: (i) scaler transfer, which shares standardization parameters fitted on transformed source data to robustly scale scarce target data; and (ii) fine-tuning, which pretrains a predictor on the transformed source and adapts it to the target. On the prediction of electrocatalytic activity with the Open Catalyst Experiment 2024 datasets, the proposed method consistently surpasses baselines and achieves $R^2 > 0.81$ under the best condition. Critically, the scaler transfer significantly improves few-shot learning performance, scoring $R^2=0.43$ compared to $-0.062$ for a baseline. This method is not only easy to implement but also model- and task-agnostic, extending the coverage of sim2real transfer in materials informatics.
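The following is a minimal sketch of the two ingredients described in the abstract, assuming scikit-learn's StandardScaler and MLPRegressor as stand-ins for the actual scaler and predictor; the array names, data, and hyperparameters are illustrative and not taken from the paper.

```python
# Sketch of (i) scaler transfer and (ii) fine-tuning for sim2real transfer.
# Assumption: X_source/y_source are abundant, domain-transformed simulation (DFT)
# data; X_target/y_target are scarce experimental data with the same features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X_source, y_source = rng.normal(size=(1000, 8)), rng.normal(size=1000)  # abundant source domain
X_target, y_target = rng.normal(size=(16, 8)), rng.normal(size=16)      # few-shot target domain

# (i) Scaler transfer: fit the standardizer on the transformed source data only,
# then reuse its mean/variance to scale the scarce target data instead of
# refitting a scaler on a handful of target samples.
scaler = StandardScaler().fit(X_source)
Xs, Xt = scaler.transform(X_source), scaler.transform(X_target)

# (ii) Fine-tuning: pretrain the predictor on the source domain, then continue
# training (warm start) on the few target samples.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, warm_start=True, random_state=0)
model.fit(Xs, y_source)           # pretraining on transformed source data
model.set_params(max_iter=50)
model.fit(Xt, y_target)           # adaptation to the scarce target data
```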
Submission Track: Paper Track (Short Paper)
Submission Category: AI-Guided Design
Institution Location: Tokyo, Japan; Kyoto, Japan
AI4Mat Journal Track: Yes
AI4Mat RLSF: Yes
Submission Number: 85