Communication Algorithms via Deep Learning
Nov 07, 2017 (modified: Nov 07, 2017) · ICLR 2018 Conference Blind Submission
Abstract: Coding theory is a central discipline underpinning the wireline and wireless modems that are the workhorses of the information age. Progress in coding theory has largely been driven by individual human ingenuity, with sporadic breakthroughs over the past century. In this paper we study whether the discovery of coding and decoding algorithms can be automated via deep learning. We study a family of sequential codes parametrized by recurrent neural network (RNN) architectures. We show that carefully designed and trained RNN architectures can decode well-known sequential codes, such as convolutional and turbo codes, with close to optimal performance on the additive white Gaussian noise (AWGN) channel, where optimality is achieved by breakthrough algorithms of our times (the Viterbi and BCJR decoders, representing dynamic programming and forward-backward algorithms, respectively). We also show strong generalization: we train at a specific signal-to-noise ratio and block length but test over a wide range of these quantities, and we demonstrate robustness and adaptivity to deviations from the AWGN setting. Finally, we use the RNN architectures to design new nonlinear codes that represent major progress on the long-standing open problem of communicating reliably over the AWGN channel with noisy output feedback.
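As a point of reference for the AWGN setting the abstract describes, here is a minimal sketch (not the paper's code) of the channel model the decoders operate on: bits are BPSK-modulated, white Gaussian noise at a target SNR is added, and a detector recovers the bits. In the paper the learned RNN decoder replaces the hard-decision step shown below; the function names and the 6 dB operating point are illustrative assumptions.

```python
import numpy as np

def awgn_channel(bits, snr_db, rng):
    """Map bits {0,1} to BPSK symbols {-1,+1} and add white Gaussian noise.

    The noise standard deviation is set from the per-symbol SNR in dB
    (unit symbol energy), the standard AWGN parametrization.
    """
    x = 2.0 * bits - 1.0                                   # BPSK modulation
    sigma = np.sqrt(1.0 / (2.0 * 10.0 ** (snr_db / 10.0)))  # noise std dev
    return x + sigma * rng.standard_normal(bits.shape)

def hard_decision(y):
    """Uncoded baseline detector; a trained RNN decoder would replace this."""
    return (y > 0).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=10_000)
y = awgn_channel(bits, snr_db=6.0, rng=rng)
ber = np.mean(hard_decision(y) != bits)   # bit error rate of the baseline
print(f"uncoded BER at 6 dB: {ber:.4f}")
```

The gap between this uncoded error rate and the error rate of a convolutional or turbo code at the same SNR is what the Viterbi/BCJR decoders, and the RNN decoders studied in the paper, are measured against.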
TL;DR: We show that carefully designed and trained RNN architectures can decode well-known sequential codes with close to optimal performance.
Keywords: coding theory, recurrent neural network, communication