Keywords: digital twin, contrastive pretraining, distribution shift, CSI prediction
TL;DR: A contrastive learning framework for 6G digital twins that enables robust CSI prediction under distribution shifts, outperforming prior baselines by up to 5.4 dB
Abstract: Despite recent progress in deep learning for wireless systems, current *digital twin–based* CSI prediction models often struggle with two core challenges: poor generalization under network distribution shifts and high retraining costs. These issues stem from heavy reliance on synthetic data, overparameterized model designs, and rigid update mechanisms. We introduce **ConTwin**, a contrastive learning framework tailored to 6G digital twin environments that enables efficient and generalizable CSI prediction. ConTwin leverages digital twin simulations to construct domain-aware positive and hard negative pairs, enabling the model to learn representations that remain robust across varying user mobility, LOS/NLOS scenarios, and antenna configurations. Experiments on 3GPP-aligned benchmarks demonstrate that ConTwin improves CSI prediction NMSE by up to 5.4 dB under distribution shift and by up to 1.2 dB in-distribution, outperforming leading baselines such as SwinCFNet and CNN-based models. These results highlight ConTwin's potential as a foundational component for robust, data-driven 6G digital twin networks.
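The abstract's core mechanism — contrasting a positive (another view of the same channel) against hard negatives drawn from mismatched domains — can be sketched as an InfoNCE-style loss. This is a hypothetical illustration, not ConTwin's published objective; the function `info_nce`, the embedding dimension, and the temperature value are all assumptions for the sketch.

```python
# Hypothetical sketch of a contrastive objective of the kind the abstract
# describes: the positive is a second view of the same channel, and hard
# negatives come from mismatched scenarios (e.g. LOS vs. NLOS, different
# antenna configurations). Not ConTwin's actual loss.
import numpy as np

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE loss for a single anchor embedding.

    anchor, positive: 1-D unit-norm embeddings of two views of one channel.
    negatives: 2-D array of shape (K, d), embeddings from other domains.
    """
    pos_sim = anchor @ positive / temperature
    neg_sims = negatives @ anchor / temperature
    logits = np.concatenate([[pos_sim], neg_sims])
    # Cross-entropy with the positive treated as the correct class:
    # -log( exp(pos) / sum(exp(all)) )
    return -pos_sim + np.log(np.exp(logits).sum())

rng = np.random.default_rng(0)
unit = lambda v: v / np.linalg.norm(v)

a = unit(rng.normal(size=16))                 # anchor embedding
p = unit(a + 0.1 * rng.normal(size=16))       # nearby view -> small loss
negs = np.stack([unit(rng.normal(size=16)) for _ in range(8)])  # hard negatives (random here)
print(float(info_nce(a, p, negs)))
```

Pulling negatives from deliberately mismatched twin scenarios (rather than at random, as above) is what would make them "hard": the loss then forces the encoder to separate domains that look superficially similar.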
Submission Number: 80