Synthetic Gradient Methods with Virtual Forward-Backward Networks

29 Apr 2025 (modified: 18 Nov 2017) · ICLR 2017 · Readers: Everyone
Abstract: The concept of the synthetic gradient introduced by Jaderberg et al. (2016) provides a novel framework for the asynchronous training of neural networks. Their model, however, has a structural weakness: their synthetic gradient bears little relation to the gradient of the objective function of the target task. In this paper we introduce virtual forward-backward networks (VFBN). VFBN is a model that produces synthetic gradients whose structure is analogous to that of the actual gradient of the objective function. VFBN is the first model of its kind to succeed in decoupling a deep network such as ResNet-110 (He et al., 2016) without compromising its performance.
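To illustrate the distinction the abstract draws, here is a minimal NumPy sketch (not from the paper) contrasting a plain synthetic-gradient predictor, which regresses the gradient directly from the activation, with a VFBN-style one that obtains its synthetic gradient by differentiating through a small virtual stand-in for the downstream network. The toy task, linear layers, and update rules are illustrative assumptions, not the authors' construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: x -> y through module A (the decoupled module)
# followed by module B (the downstream part we pretend is unavailable).
x = rng.normal(size=(64, 8))
y = rng.normal(size=(64, 1))

W_a = rng.normal(scale=0.1, size=(8, 16))   # module A
W_b = rng.normal(scale=0.1, size=(16, 1))   # module B (downstream)

def true_grad_h(h):
    # Exact dL/dh for L = 0.5 * ||h @ W_b - y||^2 / N; in a decoupled system
    # this signal arrives late and is used only to train the gradient models.
    return ((h @ W_b - y) @ W_b.T) / len(h)

# (1) Plain synthetic gradient (a linear variant of Jaderberg et al., 2016):
# regress dL/dh directly from h; its form need not resemble a real gradient.
W_sg = np.zeros((16, 16))

# (2) VFBN-style synthetic gradient: a small *virtual* network (here a single
# weight matrix W_v standing in for module B) approximates the downstream
# computation, and the synthetic gradient is the gradient of the virtual
# objective, so it shares the structure of an actual gradient.
W_v = rng.normal(scale=0.1, size=(16, 1))

def synth_grad_vfbn(h):
    # d/dh of 0.5 * ||h @ W_v - y||^2 / N, i.e. backprop through the virtual net.
    return ((h @ W_v - y) @ W_v.T) / len(h)

lr = 0.1
for step in range(200):
    h = x @ W_a
    g_true = true_grad_h(h)

    # Fit the plain predictor to the (delayed) true gradient.
    W_sg -= lr * h.T @ ((h @ W_sg) - g_true) / len(h)

    # Fit the virtual network to the downstream module's output.
    W_v -= lr * h.T @ ((h @ W_v) - (h @ W_b)) / len(h)

    # Module A updates immediately from its synthetic gradient (decoupling).
    W_a -= lr * x.T @ synth_grad_vfbn(h)

def cosine(a, b):
    return np.vdot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

h = x @ W_a
g = true_grad_h(h)
print("cosine(true, plain):", cosine(g, h @ W_sg))
print("cosine(true, VFBN): ", cosine(g, synth_grad_vfbn(h)))
```

In the paper itself the virtual network stands in for the entire downstream stack of a deep model such as ResNet-110; the single linear stand-in above only sketches why a gradient obtained by backpropagating through a virtual forward model inherits the structure of the true objective gradient, whereas a direct regressor need not.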
Conflicts: preferred.jp, atr.jp, kyoto-u.ac.jp, ritsumei.ac.jp
Keywords: Deep learning, Optimization