Task-aware Distributed Source Coding under Dynamic Bandwidth

Published: 15 Apr 2024, Last Modified: 23 Apr 2024 · Learn to Compress @ ISIT 2024 Poster · CC BY 4.0
Keywords: Distributed Compression, Task-Aware Compression, Semantic Communication, Deep Learning
TL;DR: We leverage neural networks to efficiently compress distributed, correlated sources and achieve optimal task performance under varying and limited bandwidth.
Abstract: Efficient compression of correlated data is vital for reducing communication overhead in multi-sensor networks, where sensors independently compress data for transmission to a central node. To optimize task performance, the compressor must learn only task-relevant features while accounting for fluctuating bandwidth availability. Our work introduces a novel distributed compression framework, Neural Distributed Principal Component Analysis (NDPCA), comprising independent encoders and a joint decoder. NDPCA adapts flexibly to varying bandwidth with a single model, reducing computational and storage demands. By learning low-rank task representations and efficiently allocating bandwidth among sensors, NDPCA achieves a balanced trade-off between performance and bandwidth utilization. Experimental results demonstrate NDPCA’s effectiveness, improving success rates in multi-view robotic arm manipulation by 9% and enhancing object detection accuracy in satellite imagery tasks by 14% compared to an autoencoder with uniform bandwidth allocation.
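To make the framework described in the abstract concrete, below is a minimal, hedged sketch (not the authors' released code) of the core idea: each sensor has its own encoder, a PCA-style projection truncates each latent to a per-sensor rank chosen under the total bandwidth budget, and a joint decoder consumes the concatenated low-rank codes. All module names, dimensions, and the example rank allocation are illustrative assumptions.

```python
# Illustrative sketch of the NDPCA idea under stated assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Per-sensor encoder (architecture is a placeholder assumption)."""
    def __init__(self, in_dim, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim))

    def forward(self, x):
        return self.net(x)

class JointDecoder(nn.Module):
    """Joint decoder that fuses both sensors' codes into a task output."""
    def __init__(self, latent_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, out_dim))

    def forward(self, z):
        return self.net(z)

def pca_truncate(z, rank):
    """Project a batch of latents onto its top-`rank` principal directions and
    reconstruct, emulating a low-rank (bandwidth-limited) code."""
    mean = z.mean(dim=0, keepdim=True)
    z_centered = z - mean
    # Low-rank PCA of the batch; V holds the principal directions.
    _, _, V = torch.pca_lowrank(z_centered, q=rank)
    return z_centered @ V @ V.T + mean

# Example forward pass: split a total rank (bandwidth) budget of 16 between the
# two sensors; the 12/4 allocation below is purely hypothetical.
enc1, enc2 = Encoder(64, 32), Encoder(64, 32)
decoder = JointDecoder(32, 10)

x1, x2 = torch.randn(128, 64), torch.randn(128, 64)
rank1, rank2 = 12, 4
z1 = pca_truncate(enc1(x1), rank1)
z2 = pca_truncate(enc2(x2), rank2)
task_output = decoder(torch.cat([z1, z2], dim=-1))
```

Because the truncation rank is an argument rather than a fixed architecture choice, a single trained model can serve any bandwidth budget at inference time, which is the flexibility the abstract highlights.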
Submission Number: 11