A variational framework for local learning with probabilistic latent representations

Published: 05 Mar 2024, Last Modified: 12 May 2024 · PML4LRS Poster · CC BY 4.0
Keywords: alternative to backprop, locking problem, probabilistic models, weight transport problem
TL;DR: We augment DNNs with feedback networks for local training. Forward and feedback networks are trained and operate in parallel with different sets of weights, addressing the problems of weight transport and locking.
Abstract: We propose a new method for distributed learning that divides a deep neural network into blocks and introduces a feedback network that propagates information from the targets backward to provide auxiliary local losses. Forward and backward propagation can operate in parallel with different sets of weights, addressing the problems of locking and weight transport. Our approach derives from a statistical interpretation of training that treats the output activations of network blocks as parameters of probability distributions. The resulting learning framework uses these parameters to evaluate the agreement between forward and backward information. Error backpropagation is then performed locally within each block, leading to "block-local" learning. We present preliminary results on a variety of tasks and architectures, demonstrating state-of-the-art performance with block-local learning. These results provide a new, principled framework for distributed asynchronous learning.
Submission Number: 67
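
To make the block-local scheme concrete, here is a minimal PyTorch sketch of one way the abstract's ingredients could fit together: forward blocks and a separate feedback path with their own weights (no weight transport), block activations read as the means of unit-variance Gaussians, and detach() calls that confine backpropagation to each block. All names (BlockLocalNet, training_step, the layer sizes) and the choice of a Gaussian/MSE agreement loss are illustrative assumptions, not the authors' implementation; the passes are also written sequentially for clarity, even though the point of the method is that forward and feedback computation can run in parallel.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlockLocalNet(nn.Module):
    """Hypothetical sketch: a network split into blocks, trained with
    block-local losses instead of end-to-end backpropagation."""

    def __init__(self, dims=(784, 256, 128, 10)):
        super().__init__()
        # Forward blocks: the ordinary network, split at block boundaries.
        self.forward_blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(dims[i], dims[i + 1]), nn.ReLU())
            for i in range(len(dims) - 2)
        )
        self.head = nn.Linear(dims[-2], dims[-1])
        # Feedback path: separate weights per block boundary that carry
        # target information backward (addresses weight transport).
        self.feedback_blocks = nn.ModuleList(
            nn.Linear(dims[-1], dims[i + 1]) for i in range(len(dims) - 2)
        )

    def training_step(self, x, y_onehot):
        # Feedback pass: map the target to a local target representation
        # at every block boundary.
        local_targets = [fb(y_onehot) for fb in self.feedback_blocks]

        losses, h = [], x
        for block, t in zip(self.forward_blocks, local_targets):
            # detach() cuts gradients between blocks, so error
            # backpropagation stays inside each block ("block-local").
            h = block(h.detach())
            # Read the activations as the mean of a unit-variance Gaussian;
            # agreement with the feedback signal then reduces to an MSE
            # local loss (which also trains the feedback weights).
            losses.append(F.mse_loss(h, t))
        logits = self.head(h.detach())
        losses.append(F.cross_entropy(logits, y_onehot.argmax(dim=1)))
        return torch.stack(losses).sum(), logits
```

A single optimizer step then updates all blocks from their local losses at once; because the computation graph is severed at every block boundary, no block waits on another's gradient, which is what removes the locking constraint:

```python
net = BlockLocalNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.randn(32, 784)                                # dummy inputs
y = F.one_hot(torch.randint(0, 10, (32,)), 10).float()  # dummy targets
loss, logits = net.training_step(x, y)
opt.zero_grad(); loss.backward(); opt.step()
```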