Positional Differential Encoding for Distributed Learning
Abstract: The growing availability of data and computational power makes training neural networks over a network of devices, and distributed optimization in general, increasingly practical. As a consequence, efficient communication becomes more important and is often the bottleneck of such algorithms. Traditional data compression techniques for these scenarios exploit the correlation of data values over time: although the transmitted data itself may not be sparse, the changes within the data become increasingly sparse as the algorithm converges. In this work, we propose an encoding scheme for discrete symbols that adapts a source encoder to account for positional correlations through time. We provide a bound on the bit rate achieved by our scheme and verify our results through experiments.
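To make the differential view concrete, the following is a minimal sketch in Python/NumPy (function names and data are hypothetical, not taken from the paper) of the baseline the abstract alludes to: instead of retransmitting the full vector of discrete symbols each round, only the positions that changed since the previous round and their new values are sent. The paper's contribution, adapting a source encoder to the positional correlations of these changes over time, is not reproduced here.

```python
import numpy as np

def encode_diff(prev, curr):
    """Encode only the entries that changed since the last transmission.
    As the algorithm converges, fewer entries change per round, so the
    message shrinks. (Illustrative baseline, not the paper's scheme.)"""
    positions = np.flatnonzero(prev != curr)   # indices that differ
    return positions, curr[positions]          # (positions, new symbols)

def decode_diff(prev, positions, values):
    """Reconstruct the current vector from the previous one plus the diff."""
    curr = prev.copy()
    curr[positions] = values
    return curr

# Usage: two successive quantized parameter vectors from a training run.
prev = np.array([3, 1, 4, 1, 5, 9, 2, 6])
curr = np.array([3, 1, 4, 2, 5, 9, 2, 7])      # only two symbols changed
pos, vals = encode_diff(prev, curr)
assert np.array_equal(decode_diff(prev, pos, vals), curr)
print(pos, vals)  # [3 7] [2 7]
```

The cost of each message is then driven by how many positions change and where they fall, which is why modeling positional correlations across rounds can reduce the bit rate further.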