SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks at the Edge

Published: 24 Apr 2023 · Last Modified: 15 Jun 2023 · ICML 2023 Oral
Abstract: We provide an efficient implementation of the backpropagation algorithm, specialized to the case where the weights of the neural network being trained are _sparse_. Our algorithm is general, as it applies to arbitrary (unstructured) sparsity and common layer types (e.g., convolutional or linear). We provide a fast vectorized implementation on commodity CPUs, and show that it can yield speedups in end-to-end runtime experiments, both in transfer learning using already-sparsified networks, and in training sparse networks from scratch. Thus, our results provide the first support for sparse training on commodity hardware.
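To make the core idea concrete, below is a minimal NumPy/SciPy sketch of backpropagation through a sparse linear layer, where the weight gradient is computed only at the weights' existing nonzero positions rather than as a dense outer product. This is an illustration of the general technique under the assumption of a CSR weight representation; it is not the paper's SparseProp kernel, which is a vectorized CPU implementation, and all function names here are hypothetical.

```python
# Minimal sketch of sparse backprop through a linear layer y = W @ x.
# Assumes W is stored in CSR format with a fixed sparsity pattern;
# not the SparseProp implementation, illustration only.
import numpy as np
from scipy.sparse import random as sparse_random, csr_matrix

def sparse_linear_forward(W, x):
    """Forward pass: y = W @ x, with W sparse (CSR)."""
    return W @ x

def sparse_linear_backward(W, x, grad_y):
    """Backward pass that respects W's fixed sparsity pattern.

    grad_x : dense gradient w.r.t. the input, W^T @ grad_y.
    grad_W : gradient only at W's nonzero positions; the dense
             outer product grad_y @ x^T is never materialized.
    """
    grad_x = W.T @ grad_y
    rows, cols = W.nonzero()
    # (grad_y @ x^T)[i, j] = grad_y[i] * x[j], evaluated at nonzeros only.
    grad_W_vals = grad_y[rows] * x[cols]
    grad_W = csr_matrix((grad_W_vals, (rows, cols)), shape=W.shape)
    return grad_x, grad_W

# Toy usage: a 95%-sparse 256x512 layer.
rng = np.random.default_rng(0)
W = sparse_random(256, 512, density=0.05, format="csr", random_state=rng)
x = rng.standard_normal(512)
y = sparse_linear_forward(W, x)
grad_x, grad_W = sparse_linear_backward(W, x, rng.standard_normal(256))
```

The key design point this sketch mirrors is that both the forward and backward costs scale with the number of nonzero weights rather than with the dense layer size, which is where the reported speedups on commodity CPUs come from.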
Submission Number: 4491