A Constellation-Aware Transformer for Nonlinear Channel Equalization

10 Sept 2025 (modified: 30 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Channel Equalization, Transformer, Physical Layer Communications, Signal Processing
TL;DR: We introduce Constellation-Aware Transformer (CAT), a new transformer-based model for decoding communication signals over unknown, noisy channels, especially when very few known "pilot" signals are available.
Abstract: Decoding signals over unknown channels with minimal pilot overhead is a critical challenge in communications. Existing deep learning approaches struggle to model long-range temporal dependencies. Conversely, off-the-shelf Transformers, while powerful sequence models, are domain-agnostic and learn the channel's physical properties inefficiently from scarce data. We introduce the *Constellation-Aware Transformer* (CAT), a novel architecture that integrates fundamental communication principles into the Transformer model. CAT is composed of a stack of custom *TransFIRmer* blocks, which redesign the standard Transformer to be constellation-aware. Each block facilitates deep interaction between the received signals and the ideal constellation geometry via a specialized attention mechanism. Furthermore, it replaces the standard feed-forward network with a two-stream architecture: a bidirectional Finite Impulse Response (FIR)-inspired filter processes the signal representations for robust deconvolution, while a parallel MLP refines the constellation representations. In the challenging semi-supervised setting, CAT achieves superior performance across multiple noisy channels, significantly outperforming baselines while using fewer pilot signals.
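The two core components described in the abstract can be illustrated with a minimal NumPy sketch. This is an illustrative approximation, not the authors' implementation: the function names, feature dimensions, and tap parameters below are all assumptions. It shows (1) an attention step in which received-signal features attend over embeddings of the ideal constellation points, and (2) a bidirectional FIR-style filter that combines a causal forward pass with an anti-causal backward pass over the sequence.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def constellation_attention(signals, constellation):
    """Illustrative constellation-aware attention (assumed form).

    signals:       (T, d) features of the received symbols
    constellation: (M, d) embeddings of the M ideal constellation points
    Returns a (T, d) mixture of constellation embeddings per symbol.
    """
    scores = signals @ constellation.T / np.sqrt(signals.shape[-1])
    return softmax(scores, axis=-1) @ constellation

def bidirectional_fir(signals, taps_fwd, taps_bwd):
    """Illustrative bidirectional FIR filter (assumed form).

    Applies a causal FIR forward in time and an anti-causal FIR
    backward in time, independently per feature dimension, and sums
    the two streams.
    """
    out = np.zeros_like(signals)
    T = len(signals)
    for d in range(signals.shape[1]):
        fwd = np.convolve(signals[:, d], taps_fwd, mode="full")[:T]
        bwd = np.convolve(signals[::-1, d], taps_bwd, mode="full")[:T][::-1]
        out[:, d] = fwd + bwd
    return out
```

In a full TransFIRmer block, these pieces would be wrapped with residual connections and layer normalization, with the FIR stream handling deconvolution of the channel's memory and a parallel MLP (not shown) refining the constellation embeddings.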
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 3679