One-Shot Neural Network Pruning via Spectral Graph Sparsification

Published: 18 Jun 2023, Last Modified: 02 Jul 2023 · TAGML 2023 Poster
Keywords: spectral graph theory, spectral sparsification, neural network pruning
TL;DR: Inspired by techniques in the field of spectral graph theory, we propose to use spectral graph sparsification for neural network pruning
Abstract: Neural network pruning has gained significant attention for its potential to reduce the computational resources required for training and inference. A large body of research has shown that networks can be pruned both after training and at initialisation while maintaining competitive accuracy compared to dense networks. However, current methods rely on iteratively pruning or repairing the network to avoid over-pruning and layer collapse. Recent work has found that by treating neural networks as a sequence of bipartite graphs, pruning can be studied through the lens of spectral graph theory. In this work, we therefore propose a novel pruning approach using spectral sparsification, which aims to approximate a dense graph with a sparse subgraph that preserves the spectrum of the dense graph's adjacency matrix. We empirically validate and investigate our method, and show that one-shot pruning using spectral sparsification preserves performance at higher levels of sparsity than its one-shot counterparts. Additionally, we theoretically analyse our method with respect to local and global connectivity.
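To make the idea concrete, here is a minimal, hypothetical sketch of how a single weight matrix could be pruned in one shot using a classical spectral-sparsification heuristic (effective-resistance / leverage scores, in the spirit of Spielman–Srivastava). This is an illustration under our own assumptions, not the authors' actual algorithm: the function names, the deterministic top-k variant of the sampling step, and the use of `|W_ij|` as edge weights are all choices made for this sketch.

```python
import numpy as np

def layer_to_bipartite_adjacency(W):
    """Treat an m x n weight matrix as a bipartite graph with
    m input nodes, n output nodes, and edge weight |W_ij|."""
    m, n = W.shape
    A = np.zeros((m + n, m + n))
    A[:m, m:] = np.abs(W)
    A[m:, :m] = np.abs(W).T
    return A

def effective_resistances(A):
    """R_uv = (e_u - e_v)^T L^+ (e_u - e_v), computed via the
    Moore-Penrose pseudoinverse of the graph Laplacian."""
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

def spectral_sparsify_layer(W, keep_frac=0.5):
    """One-shot mask: score each edge by |w_e| * R_e (its leverage)
    and keep the top keep_frac fraction of edges.  A deterministic
    stand-in for the usual randomized edge sampling."""
    m, n = W.shape
    A = layer_to_bipartite_adjacency(W)
    R = effective_resistances(A)
    scores = np.abs(W) * R[:m, m:]          # resistance of edge (i, m+j)
    k = int(keep_frac * W.size)
    thresh = np.partition(scores.ravel(), -k)[-k]
    mask = scores >= thresh
    return W * mask, mask
```

High-leverage edges are those whose removal would most change the Laplacian spectrum (e.g. bridges between otherwise weakly connected node groups), so ranking by `|w_e| * R_e` rather than by weight magnitude alone is what distinguishes this from plain magnitude pruning.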
Type Of Submission: Proceedings Track (8 pages)
Submission Number: 48