Low Rank Training of Deep Neural Networks for Emerging Memory Technology

25 Sept 2019 (modified: 05 May 2023), ICLR 2020 Conference Blind Submission
TL;DR: We use Kronecker sum approximations for low-rank training to address challenges in training neural networks on edge devices that utilize emerging memory technologies.
Abstract: The recent success of neural networks for solving difficult decision tasks has incentivized incorporating smart decision making "at the edge." However, such work has traditionally focused on neural network inference rather than training, due to memory and compute limitations, especially in emerging non-volatile memory systems, where writes are energetically costly and reduce device lifespan. Yet the ability to train at the edge is becoming increasingly important, as it enables applications such as real-time adaptation to device drift and environmental variation, user customization, and federated learning across devices. In this work, we address four key challenges for training on edge devices with non-volatile memory: low weight update density, weight quantization, low auxiliary memory, and online learning. We present a low-rank training scheme that addresses these four challenges while maintaining computational efficiency. We then demonstrate the technique on a representative convolutional neural network across several adaptation problems, where it outperforms standard SGD both in accuracy and in the number of weight updates.
Code: https://anonymous.4open.science/r/77ebbbb0-45c7-4944-a594-3dd742b7ca07/
Keywords: low rank training, kronecker sum, emerging memory, non-volatile memory, rram, reram, federated learning
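
The abstract's core idea, accumulating rank-1 gradient terms in a small low-rank buffer and committing dense weight writes only occasionally, can be illustrated with a minimal NumPy sketch. This is one plausible reading under stated assumptions, not the paper's exact Kronecker-sum algorithm; the layer, task, and all names (RANK, WRITE_PERIOD, truncate) are hypothetical.

# Hypothetical sketch of low-rank update accumulation for one linear layer.
# Not the paper's algorithm; RANK, WRITE_PERIOD, and truncate are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression task: y = W_true @ x
n_in, n_out = 32, 16
W_true = rng.standard_normal((n_out, n_in))
W = np.zeros((n_out, n_in))    # "non-volatile" weights, written infrequently

RANK = 4            # rank budget for the accumulated update
WRITE_PERIOD = 50   # commit one update to NVM every 50 samples
LR = 0.01

# Low-rank accumulator in auxiliary memory: pending update ~= L @ R.T
L = np.zeros((n_out, RANK))
R = np.zeros((n_in, RANK))

def truncate(Lf, Rf, rank):
    """Re-compress Lf @ Rf.T to the given rank via SVD of the small factors."""
    Ql, Sl = np.linalg.qr(Lf)
    Qr, Sr = np.linalg.qr(Rf)
    U, s, Vt = np.linalg.svd(Sl @ Sr.T)
    return Ql @ (U[:, :rank] * s[:rank]), Qr @ Vt[:rank].T

for step in range(1, 2001):
    x = rng.standard_normal(n_in)
    y = W_true @ x
    # Prediction uses committed weights plus the pending low-rank update
    err = (W @ x + L @ (R.T @ x)) - y
    # The SGD gradient of a linear layer is rank-1: outer(err, x),
    # so append one rank-1 term to the accumulator...
    L = np.hstack([L, -LR * err[:, None]])
    R = np.hstack([R, x[:, None]])
    # ...and truncate back to the rank budget to bound auxiliary memory
    L, R = truncate(L, R, RANK)
    if step % WRITE_PERIOD == 0:
        W += L @ R.T        # single dense write to the "non-volatile" weights
        L[:], R[:] = 0.0, 0.0

print("relative error:", np.linalg.norm(W - W_true) / np.linalg.norm(W_true))

Committing the accumulated update once every WRITE_PERIOD samples is what lowers the weight update density, and the rank-r truncation bounds auxiliary memory to (n_out + n_in) * r values rather than a full n_out * n_in gradient buffer.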