CoNNect: Connectivity-Based Regularization for Structural Pruning of Neural Networks

Published: 15 Sept 2025 · Last Modified: 15 Sept 2025 · Accepted by TMLR · CC BY 4.0
Abstract: Pruning encompasses a range of techniques aimed at increasing the sparsity of neural networks (NNs). These techniques can generally be framed as minimizing a loss function subject to an $L_0$-norm constraint. This paper introduces CoNNect, a novel differentiable regularizer for sparse NN training that ensures connectivity between input and output layers. We prove that CoNNect approximates $L_0$ regularization while preserving essential network structure and preventing the emergence of fragmented or poorly connected subnetworks. Moreover, CoNNect is easily integrated within established structural pruning strategies. Numerical experiments demonstrate that CoNNect can improve classical pruning strategies and enhance state-of-the-art one-shot pruners such as DepGraph and LLM-Pruner.
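To illustrate the idea in the abstract, the sketch below shows one way a differentiable connectivity penalty could look. This is a hypothetical toy, not the paper's actual CoNNect formulation: the function name `connectivity_penalty`, the path-counting construction via products of absolute weight matrices, and the log-barrier form are all assumptions for illustration only.

```python
import numpy as np

def connectivity_penalty(weights):
    """Hypothetical sketch of a connectivity-based regularizer.

    `weights` is a list of layer weight matrices W1..WL, where layer l
    maps layer l-1 activations to layer l. The product of the |W_l|
    accumulates weighted path strengths from every input unit to every
    output unit; an entry near zero means that input-output pair is
    (nearly) disconnected. Penalizing -log of these strengths therefore
    discourages pruning patterns that fragment the network, while the
    weights themselves can still be driven sparse by a separate
    sparsity term.
    """
    paths = np.abs(weights[0])
    for W in weights[1:]:
        paths = np.abs(W) @ paths  # chain path strengths layer by layer
    eps = 1e-12  # guard against log(0) for fully severed pairs
    return -np.log(paths + eps).mean()

# Toy usage: a random 4 -> 8 -> 3 network stays well connected,
# while zeroing the second layer severs all input-output paths
# and drives the penalty up sharply.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((3, 8))
p_connected = connectivity_penalty([W1, W2])
p_fragmented = connectivity_penalty([W1, np.zeros_like(W2)])
```

In training, such a term would be added to the task loss so that gradient descent trades off accuracy, sparsity, and connectivity jointly; the paper's contribution is a principled regularizer of this flavor with a proven relationship to $L_0$ regularization.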
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: All edits are of minor nature, except for the inclusion of DWF results in Appendix D.3, as requested by one of the reviewers.
Code: https://github.com/cfn420/CoNNect
Supplementary Material: zip
Assigned Action Editor: ~Aaron_Klein1
Submission Number: 5095