Simulation-Based Pretraining and Domain Adaptation for Astronomical Time Series Tasks with Minimal Labeled Data

Published: 09 Jun 2025 · Last Modified: 03 Jul 2025 · FMSD @ ICML 2025 · CC BY 4.0
Keywords: Transfer Learning, Foundation Models, Classifiers, Time Series, Astrophysics
TL;DR: We build foundation models for astronomy by pretraining on physics-informed simulations, achieving significant performance improvements on tasks with real data and enabling zero-shot transfer between telescope surveys such as ZTF and LSST.
Abstract: Astronomical time-series analysis faces a critical limitation: the scarcity of labeled observational data. We present a pretraining approach that leverages physics-informed simulations, significantly reducing the need for labeled examples from real observations. Using classifier-based architectures enhanced with contrastive and adversarial objectives, we create domain-agnostic models that recognize similar astronomical phenomena across different instrumental contexts and learn generalizable representations that transfer effectively to downstream tasks. Our models are trained on simulated astronomical transients from multiple telescope surveys (ZTF and LSST) and demonstrate substantial performance improvements over previous methods on classification, redshift estimation, and anomaly detection tasks when fine-tuned with minimal real data. Remarkably, our models exhibit effective zero-shot transfer, achieving comparable performance on simulations of a future telescope (LSST) when trained solely on data from an existing one (ZTF). Furthermore, they generalize to entirely different astronomical phenomena (namely, variable stars from NASA's Kepler telescope) despite being trained only on transient events, demonstrating cross-domain capabilities.
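The combination of contrastive and adversarial objectives described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the encoder choice (a GRU over light-curve features), the class count, the loss weighting, and all names below are illustrative assumptions. It pairs an NT-Xent contrastive loss on matched ZTF/LSST views of the same simulated transient with a gradient-reversal domain classifier that pushes the encoder toward survey-invariant features.

```python
# Hypothetical sketch of a contrastive + domain-adversarial pretraining
# objective; module names, dimensions, and class counts are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal: identity on the forward pass, negated (scaled)
    gradient on the backward pass, as in domain-adversarial training."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None


def nt_xent(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss: pull embeddings of the same simulated
    event, as seen by two surveys, together; push others apart."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature      # (B, B) similarity matrix
    targets = torch.arange(z1.size(0))      # positives lie on the diagonal
    return F.cross_entropy(logits, targets)


# Assumed components: a light-curve encoder over (time, flux, error, band)
# features, a transient-class head, and a binary survey/domain head.
encoder = nn.GRU(input_size=4, hidden_size=128, batch_first=True)
class_head = nn.Linear(128, 14)   # number of transient classes (assumed)
domain_head = nn.Linear(128, 2)   # domain label: ZTF vs. LSST


def step(x_ztf, x_lsst, y, domain_labels, lam=0.5):
    """One pretraining step: classification + contrastive + adversarial."""
    h_ztf = encoder(x_ztf)[1].squeeze(0)    # (B, 128) final hidden state
    h_lsst = encoder(x_lsst)[1].squeeze(0)
    loss_cls = F.cross_entropy(class_head(h_ztf), y)
    loss_con = nt_xent(h_ztf, h_lsst)
    # The domain head learns to identify the survey, while the reversed
    # gradient drives the encoder toward domain-agnostic representations.
    h_all = torch.cat([h_ztf, h_lsst])
    loss_dom = F.cross_entropy(
        domain_head(GradReverse.apply(h_all, lam)), domain_labels)
    return loss_cls + loss_con + loss_dom
```

Under this sketch, zero-shot transfer falls out of the adversarial term: since the encoder is penalized whenever its features betray the source survey, a head trained on ZTF embeddings can be applied to LSST embeddings directly, and fine-tuning with minimal real data would update only the small task head.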
Submission Number: 74