Abstract: Pruning encompasses a range of techniques aimed at increasing the sparsity of neural networks (NNs). These techniques can generally be framed as minimizing a loss function subject to an $L_0$ norm constraint. This paper introduces CoNNect, a novel differentiable regularizer for sparse NN training that ensures connectivity between input and output layers. We prove that CoNNect approximates $L_0$ regularization while preserving essential network structure, thereby preventing the emergence of fragmented or poorly connected subnetworks. Moreover, CoNNect is easily integrated with established structural pruning strategies. Numerical experiments demonstrate that CoNNect can improve classical pruning strategies and enhance state-of-the-art one-shot pruners such as DepGraph and LLM-Pruner.
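As a rough sketch of the framing in the abstract (the notation here is illustrative, not taken from the paper: $\theta$ denotes the network weights, $\mathcal{L}$ the task loss, $k$ a sparsity budget, $\lambda$ a regularization weight, and $R_{\text{CoNNect}}$ the proposed regularizer, whose exact form is given in the paper), the constrained pruning problem and its differentiable relaxation can be written as

$$\min_{\theta} \; \mathcal{L}(\theta) \quad \text{s.t.} \quad \|\theta\|_0 \le k \qquad \leadsto \qquad \min_{\theta} \; \mathcal{L}(\theta) + \lambda\, R_{\text{CoNNect}}(\theta),$$

where $R_{\text{CoNNect}}$ acts as a differentiable surrogate for the $L_0$ penalty while also rewarding input-to-output connectivity.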
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Aaron_Klein1
Submission Number: 5095