Synthetic Gradient Methods with Virtual Forward-Backward Networks

19 Apr 2024 (modified: 18 Nov 2017) · ICLR 2017 workshop submission
Abstract: The concept of synthetic gradients introduced by Jaderberg et al. (2016) provides an avant-garde framework for asynchronous learning of neural networks. Their model, however, has a weakness in its construction: the structure of their synthetic gradient bears little relation to the objective function of the target task. In this paper we introduce virtual forward-backward networks (VFBN). VFBN is a model that produces synthetic gradients whose structure is analogous to the actual gradient of the objective function. VFBN is the first of its kind to succeed in decoupling deep networks such as ResNet-110 (He et al., 2016) without compromising their performance.
Conflicts: preferred.jp, atr.jp, kyoto-u.ac.jp, ritsumei.ac.jp
Keywords: Deep learning, Optimization
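The abstract's central claim is that a synthetic gradient should have the *structure* of a real gradient, rather than being a free-form prediction as in plain synthetic gradient models. The details of the VFBN architecture are not given in the abstract, so the sketch below is only an assumed minimal illustration of that idea: run the activation through a small hypothetical "virtual head" with a surrogate objective, and obtain the synthetic gradient by backpropagating through that head. By construction the result is then an exact gradient of *some* objective, which a finite-difference check confirms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy shapes (illustrative only, not from the paper).
n, d_h, d_out = 5, 6, 3
h = rng.normal(size=(n, d_h))      # activation to be decoupled
y = rng.normal(size=(n, d_out))    # targets for the surrogate objective
V = rng.normal(size=(d_h, d_out))  # hypothetical virtual-head parameters


def surrogate_loss(h, V, y):
    # Virtual head: one linear layer followed by a squared-error objective.
    return 0.5 * np.sum((h @ V - y) ** 2)


def synthetic_grad(h, V, y):
    # Backprop through the virtual head: the gradient of the surrogate
    # loss with respect to h. This is the "gradient-structured" synthetic
    # gradient that can be sent upstream without waiting for the true
    # downstream backward pass.
    return (h @ V - y) @ V.T


# Finite-difference check that the synthetic gradient really is the exact
# gradient of the surrogate objective.
num = np.zeros_like(h)
eps = 1e-6
for i in range(n):
    for j in range(d_h):
        hp = h.copy(); hp[i, j] += eps
        hm = h.copy(); hm[i, j] -= eps
        num[i, j] = (surrogate_loss(hp, V, y) - surrogate_loss(hm, V, y)) / (2 * eps)

assert np.allclose(num, synthetic_grad(h, V, y), atol=1e-4)
print("synthetic gradient matches gradient of surrogate loss")
```

In a plain synthetic-gradient model the predictor could output any vector; here the output is constrained to be a gradient field of a surrogate objective, which is the property the abstract attributes to VFBN. How the virtual head's parameters are trained to track the true objective is not specified in the abstract and is omitted here.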
