Multi-Task Transformer Receiver for OFDM Channel Estimation and Symbol Detection

Published: 24 Sept 2025 · Last Modified: 18 Nov 2025 · AI4NextG @ NeurIPS 25 Poster · CC BY 4.0
Keywords: transformer, in-context learning, wireless communication
Abstract: By leveraging in-context learning (ICL), pretrained Transformers adapt to unseen tasks from example prompts without task-specific fine-tuning. This adaptability has motivated their use in wireless communications, where ICL-based Transformers have shown strong performance on symbol detection. However, deploying a Transformer solely for symbol detection offers limited return on its inference cost. Can we design a multi-task Transformer that, without significantly increasing inference overhead, unifies additional modules of the wireless communication receiver within a single model? In this work, we propose a multi-task ICL Transformer that treats the pilots within a coherence block as prompts and jointly outputs both the detected data symbol and an explicit channel frequency response (CFR) estimate. Empirically, we find that activating the model's multi-task capability improves both training efficiency and receiver performance at the same model size, compared to an ICL-based Transformer performing symbol detection alone.
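The abstract does not include code, so the following is only an illustrative sketch of the two receiver tasks the model unifies, implemented here as the classical per-subcarrier baseline (least-squares channel estimation from the pilot "prompt", then zero-forcing equalization and nearest-constellation detection) rather than as a Transformer. All parameter values (QPSK modulation, pilot count, SNR, the fixed channel gain) are assumptions for the sketch, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # unit-energy QPSK

# One coherence block on one subcarrier: the CFR value h is constant,
# and the known pilots act as the in-context "prompt".
h = 1.2 * np.exp(1j * 0.3)          # illustrative fixed channel gain
n_pilots = 8
snr_db = 20
sigma = 10 ** (-snr_db / 20)        # noise amplitude for unit-energy symbols

x_p = qpsk[rng.integers(0, 4, n_pilots)]  # transmitted pilot symbols (known)
noise_p = sigma * (rng.standard_normal(n_pilots)
                   + 1j * rng.standard_normal(n_pilots)) / np.sqrt(2)
y_p = h * x_p + noise_p                   # received pilots

# Task 1: explicit CFR estimate (least-squares over the pilot prompt).
# np.vdot conjugates its first argument: h_hat = sum(x* y) / sum(|x|^2).
h_hat = np.vdot(x_p, y_p) / np.vdot(x_p, x_p)

# Task 2: symbol detection for a query observation in the same block.
x_q = qpsk[2]                             # true transmitted data symbol
noise_q = sigma * (rng.standard_normal()
                   + 1j * rng.standard_normal()) / np.sqrt(2)
y_q = h * x_q + noise_q
x_hat_idx = int(np.argmin(np.abs(y_q / h_hat - qpsk)))  # equalize, then slice
x_hat = qpsk[x_hat_idx]
```

The multi-task formulation in the paper asks a single Transformer to produce both outputs (`h_hat` and `x_hat`) from the same pilot prompt in one forward pass, so the sketch above is best read as the classical reference point such a model would be compared against.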
Submission Number: 37