Associated Learning: an Alternative to End-to-End Backpropagation that Works on CNN, RNN, and Transformer

29 Sept 2021, 00:31 (edited 10 Feb 2022), ICLR 2022 Poster
  • Keywords: pipeline training, parallel training, backpropagation, associated learning
  • Abstract: This paper studies Associated Learning (AL), an alternative methodology to end-to-end backpropagation (BP). We introduce a workflow for converting a neural network into a structure on which AL can learn the weights, applicable to various types of neural networks. We compare AL and BP on some of the most successful network types -- the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), and the Transformer. Experimental results show that AL consistently outperforms BP on various open datasets. We discuss possible reasons for AL's success and its limitations. (A minimal code sketch of the local-loss idea behind AL appears after this list.)
  • One-sentence Summary: This paper studies Associated Learning, an alternative methodology to end-to-end backpropagation.
  • Supplementary Material: zip
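
The abstract only gestures at the mechanism, so here is a minimal PyTorch sketch of local-loss training in the spirit of AL. The block structure (feature path `f`, target path `g`, bridge `b`, decoder `h`, an association loss plus a reconstruction loss, and detached activations between blocks) follows the general AL recipe, but every module name, dimension, and hyperparameter below is an illustrative assumption, not the authors' implementation.

```python
# Hedged sketch: local-loss training in the spirit of Associated Learning (AL).
# Names (ALBlock, f, g, b, h) and the toy task are illustrative assumptions.
# Each block learns from a purely local objective; the .detach() calls stop
# any gradient from flowing between blocks, replacing end-to-end backprop.
import torch
import torch.nn as nn

class ALBlock(nn.Module):
    def __init__(self, x_dim, y_dim, hid):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(x_dim, hid), nn.ReLU())  # feature path
        self.g = nn.Sequential(nn.Linear(y_dim, hid), nn.ReLU())  # target path
        self.b = nn.Linear(hid, hid)   # bridge: feature side -> target side
        self.h = nn.Linear(hid, y_dim) # decoder: target encoding -> target

    def local_loss(self, s, t):
        s_next, t_next = self.f(s), self.g(t)
        assoc = ((self.b(s_next) - t_next) ** 2).mean()  # association loss
        recon = ((self.h(t_next) - t) ** 2).mean()       # reconstruction loss
        # Detach so the next block's loss cannot backpropagate into this one.
        return assoc + recon, s_next.detach(), t_next.detach()

# Toy regression (y = sum of x), two AL blocks trained with local losses only.
torch.manual_seed(0)
blocks = nn.ModuleList([ALBlock(8, 1, 16), ALBlock(16, 16, 16)])
opt = torch.optim.Adam(blocks.parameters(), lr=1e-2)

for step in range(200):
    x = torch.randn(32, 8)
    y = x.sum(dim=1, keepdim=True)
    s, t = x, y
    loss_total = 0.0
    for blk in blocks:
        loss, s, t = blk.local_loss(s, t)
        loss_total = loss_total + loss
    opt.zero_grad()
    loss_total.backward()  # gradients stay inside each block due to detach()
    opt.step()

# Inference: run the feature path forward, bridge to the deepest target
# encoding, then unroll the decoders back to the original target space.
with torch.no_grad():
    s = torch.randn(4, 8)
    for blk in blocks:
        s = blk.f(s)
    pred = blocks[-1].b(s)
    for blk in reversed(blocks):
        pred = blk.h(pred)
```

Because each block's loss depends only on its own parameters, blocks could in principle be updated in a pipeline across devices, which is the parallel-training angle the keywords point to.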