Simplicity is Key: An Unsupervised Pretraining Approach for Sparse Radio Channels

Published: 06 Jun 2025, Last Modified: 06 Jun 2025 · ICML Workshop on ML4Wireless · CC BY 4.0
Keywords: Unsupervised learning, Feature extraction, Compressive sensing, Location awareness, Machine Learning, ICML, Wireless networks, 5G mobile communication, Generative Pre-trainer
TL;DR: SpaRTran is an unsupervised pretraining method based on compressed sensing that improves representation learning for radio channels while reducing pretraining effort and offering greater versatility than existing methods.
Abstract: We introduce the Sparse pretrained Radio Transformer (SpaRTran), an unsupervised representation learning approach based on the concept of compressed sensing for radio channels. Our approach learns embeddings that focus on the physical properties of radio propagation, creating an optimal basis for fine-tuning on radio-based downstream tasks. SpaRTran uses a sparse gated autoencoder that induces a simplicity bias in the learned representations, mirroring the sparse nature of radio propagation. For signal reconstruction, it learns a dictionary of atomic features, which increases flexibility across signal waveforms and spatiotemporal signal patterns. Our experiments show that SpaRTran reduces errors by up to 85% compared to state-of-the-art methods when fine-tuned on radio fingerprinting, a challenging downstream task. In addition, our method requires less pretraining effort and offers greater flexibility, as we train it solely on individual radio signals. SpaRTran serves as an excellent base model that can be fine-tuned for various radio-based downstream tasks, effectively reducing labeling cost, and it is significantly more versatile than existing methods while demonstrating superior generalization.
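To make the pretraining idea concrete, the sketch below illustrates a sparse gated autoencoder of the kind the abstract describes: an encoder produces coefficients, a gating branch switches individual dictionary atoms on or off (the simplicity bias), and the signal is reconstructed as a sparse combination of learned atomic features. This is a hedged, minimal illustration, not the authors' implementation; all layer sizes, names (e.g. `SparseGatedAutoencoder`, `pretraining_loss`), and the L1 sparsity penalty are assumptions for demonstration only.

```python
# Minimal sketch (not the authors' code): a sparse gated autoencoder that
# reconstructs a channel impulse response as a sparse combination of atoms
# from a learned dictionary, loosely following the compressed-sensing idea
# described in the abstract. All sizes and hyperparameters are assumed.
import torch
import torch.nn as nn


class SparseGatedAutoencoder(nn.Module):
    def __init__(self, signal_len=256, num_atoms=512, hidden=512):
        super().__init__()
        # Encoder maps the raw channel signal to per-atom coefficients.
        self.encoder = nn.Sequential(
            nn.Linear(signal_len, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_atoms),
        )
        # Gating branch decides which atoms are active (simplicity bias).
        self.gate = nn.Sequential(
            nn.Linear(signal_len, num_atoms),
            nn.Sigmoid(),
        )
        # Learned dictionary: each row is one atomic waveform feature.
        self.dictionary = nn.Parameter(torch.randn(num_atoms, signal_len) * 0.01)

    def forward(self, x):
        coeffs = self.encoder(x)               # dense coefficients
        gates = self.gate(x)                   # soft on/off per atom
        sparse_code = coeffs * gates           # gated (sparse) code
        recon = sparse_code @ self.dictionary  # reconstruct from dictionary atoms
        return recon, sparse_code


def pretraining_loss(x, recon, sparse_code, l1_weight=1e-3):
    # Reconstruction error plus an L1 penalty that encourages sparse codes,
    # mirroring the sparse multipath structure of radio propagation.
    return nn.functional.mse_loss(recon, x) + l1_weight * sparse_code.abs().mean()


if __name__ == "__main__":
    model = SparseGatedAutoencoder()
    batch = torch.randn(8, 256)                # stand-in for channel impulse responses
    recon, code = model(batch)
    loss = pretraining_loss(batch, recon, code)
    loss.backward()
    print(loss.item())
```

Because pretraining of this kind needs only individual radio signals and a reconstruction objective, the resulting encoder can then be fine-tuned on labeled downstream tasks such as radio fingerprinting.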
Submission Number: 20