Simplicity is Key: An Unsupervised Pretraining Approach for Sparse Radio Channels

16 Sept 2025 (modified: 13 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Unsupervised learning, Feature extraction, Compressive sensing, Location awareness, Machine Learning, ICLR, Wireless networks, 5G mobile communication, Generative Pre-trainer
TL;DR: SpaRTran is an unsupervised pretraining method based on compressed sensing that improves unsupervised representation learning for radio channels while reducing pretraining effort and offering greater versatility than existing methods.
Abstract: We introduce Sparse pretrained Radio Transformer (SpaRTran), an unsupervised representation learning approach based on the concept of compressed sensing for wireless channels. SpaRTran learns embeddings that focus on the physical properties of radio propagation to allow efficient fine-tuning on radio-based downstream tasks. SpaRTran uses a sparse gated autoencoder that induces a simplicity bias in the learned representations, mirroring the sparse nature of radio propagation. For signal reconstruction, it learns a dictionary that holds atomic features, which increases flexibility across signal waveforms and spatio-temporal signal patterns. Compared to the state of the art, SpaRTran cuts positioning error by up to 28% and increases top-1 codebook selection accuracy for beamforming by 26 percentage points. By pretraining models solely on individual channel measurements, it is system-agnostic and more versatile, allowing fine-tuning for diverse radio tasks and substantially reducing labeling costs.
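To make the core idea concrete, below is a minimal sketch of a sparse gated autoencoder with a learned dictionary of atomic features, in the spirit of what the abstract describes. This is an illustrative PyTorch assumption, not the paper's actual architecture: the module names, dimensions, gating mechanism, and loss weights are all hypothetical choices for exposition.

```python
import torch
import torch.nn as nn


class SparseGatedAutoencoder(nn.Module):
    """Illustrative sketch: encode a channel measurement into gated (sparse)
    codes over a learned dictionary of atoms, then reconstruct the signal
    as a sparse linear combination of those atoms."""

    def __init__(self, signal_dim=256, num_atoms=512, hidden_dim=256):
        super().__init__()
        # Encoder produces, per atom, an activation and a gate logit.
        self.encoder = nn.Sequential(
            nn.Linear(signal_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 2 * num_atoms),
        )
        # Learned dictionary: each row is one atomic feature.
        self.dictionary = nn.Parameter(torch.randn(num_atoms, signal_dim) * 0.01)

    def forward(self, x):
        act, gate_logits = self.encoder(x).chunk(2, dim=-1)
        gates = torch.sigmoid(gate_logits)   # soft on/off switch per atom
        codes = gates * act                  # gated, hence sparse, codes
        recon = codes @ self.dictionary      # sparse combination of atoms
        return recon, codes, gates


def pretraining_loss(x, recon, gates, sparsity_weight=1e-3):
    """Reconstruction error plus an L1 penalty on the gates, which supplies
    the simplicity (sparsity) bias during unsupervised pretraining."""
    recon_loss = torch.mean((recon - x) ** 2)
    sparsity = gates.abs().mean()
    return recon_loss + sparsity_weight * sparsity
```

In this reading, pretraining needs only unlabeled channel measurements: the reconstruction term ties the dictionary atoms to physical propagation structure, while the gate penalty keeps each embedding a combination of few atoms, and the resulting codes can then be fine-tuned for downstream tasks such as positioning or beam selection.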
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 7665