Keywords: Mixture-of-Experts, Multi-Task Learning, Semantic Communication, CSI-Free Multiple Access
TL;DR: We develop an MoE-based semantic communication framework that supports multi-task learning in uncoordinated multiple access scenarios.
Abstract: This work investigates a Mixture-of-Experts (MoE) framework for multi-task semantic communications (MT-SemCom) in scenarios where multiple devices simultaneously transmit multi-task semantic features without channel state information (CSI). Two key design components are proposed. First, each device applies a random linear transformation to its data, which preserves the underlying semantic features while enabling reliable reconstruction provided that the resulting subspaces are approximately orthogonal. Second, to handle the case where the subspaces partially overlap and inter-device interference arises, the receiver employs an MoE-based architecture with an additional multi-task expert trained to be robust to such interference. Together, these complementary designs deliver substantial gains for MT-SemCom, as validated through two-device simulations on a mixed MNIST and Fashion-MNIST (FMNIST) dataset under CSI-free multiple access.
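To make the first design component concrete, the following is a minimal NumPy sketch of CSI-free multiple access via random linear transformations. All names, dimensions, the Gaussian choice of transform, and the joint least-squares recovery step are illustrative assumptions, not details taken from the paper; the point is only that two devices' random projections span nearly orthogonal subspaces in high dimensions, so their superposed features remain separable.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 16    # semantic feature dimension per device (assumed)
n = 256   # transmitted / received signal dimension (assumed)

# Hypothetical setup: each device draws its own random linear transform.
# Unknown channel gains can be absorbed into these matrices, so no CSI
# is needed at the transmitters. Columns of A_i span device i's subspace.
A1 = rng.standard_normal((n, k)) / np.sqrt(n)
A2 = rng.standard_normal((n, k)) / np.sqrt(n)

# Stand-ins for the semantic feature vectors produced by each encoder.
s1 = rng.standard_normal(k)
s2 = rng.standard_normal(k)

# Uncoordinated multiple access: the channel superposes both projections.
y = A1 @ s1 + A2 @ s2 + 0.01 * rng.standard_normal(n)

# With high-dimensional random projections the subspaces are approximately
# orthogonal, so a joint least-squares solve separates the two features.
A = np.concatenate([A1, A2], axis=1)          # (n, 2k)
s_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
s1_hat, s2_hat = s_hat[:k], s_hat[k:]

print("device 1 relative error:", np.linalg.norm(s1_hat - s1) / np.linalg.norm(s1))
print("device 2 relative error:", np.linalg.norm(s2_hat - s2) / np.linalg.norm(s2))
```

For the second component, here is a minimal PyTorch sketch of an MoE receiver with per-task experts plus one extra expert intended to absorb inter-device interference. The module name, layer sizes, soft gating, and single classification head are hypothetical simplifications; the paper's actual architecture and training procedure are not specified here.

```python
import torch
import torch.nn as nn

class MoEDecoder(nn.Module):
    """Sketch of a receiver mixing per-task experts with one additional
    multi-task expert trained for robustness to inter-device interference."""
    def __init__(self, dim=128, num_tasks=2, num_classes=10):
        super().__init__()
        # One expert per task (e.g., MNIST and FMNIST) ...
        self.task_experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(num_tasks)
        )
        # ... plus the extra interference-robust multi-task expert.
        self.interference_expert = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.gate = nn.Linear(dim, num_tasks + 1)  # soft gate over all experts
        self.head = nn.Linear(dim, num_classes)

    def forward(self, z):
        experts = [e(z) for e in self.task_experts] + [self.interference_expert(z)]
        w = torch.softmax(self.gate(z), dim=-1)            # (B, num_tasks + 1)
        mixed = sum(w[:, i:i + 1] * h for i, h in enumerate(experts))
        return self.head(mixed)

y = torch.randn(4, 128)       # received (superposed) semantic features
print(MoEDecoder()(y).shape)  # torch.Size([4, 10])
```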
Submission Number: 47