Tiny Graph Convolutional Networks with Topologically Consistent Magnitude Pruning

Published: 28 Oct 2023, Last Modified: 30 Nov 2023 | WANT@NeurIPS 2023 Poster
Keywords: Tiny Graph Convolutional Networks, Pruning, Topological Consistency, Lightweight Models
Abstract: Magnitude pruning is a mainstream method in lightweight architecture design whose goal is to extract subnetworks with the largest weight connections. This method is known to be successful, but under very high pruning regimes it suffers from topological inconsistency: the extracted subnetworks become disconnected, which hinders their generalization ability. In this paper, we devise a novel end-to-end Topologically Consistent Magnitude Pruning (TCMP) method that extracts subnetworks while guaranteeing their topological consistency. The latter ensures that only accessible and co-accessible --- i.e., impactful --- connections are kept in the resulting lightweight architectures. Our solution is based on a novel reparametrization and two supervisory bi-directional networks which implement accessibility/co-accessibility and guarantee that only connected subnetworks are selected during training. This solution significantly enhances generalization under very high pruning regimes, as corroborated through extensive experiments with graph convolutional networks on the challenging task of skeleton-based action recognition.
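To make the notions of accessibility and co-accessibility concrete, below is a minimal NumPy sketch (our own illustration, not the paper's end-to-end differentiable TCMP reparametrization): global magnitude pruning produces binary masks over a feed-forward network's weight matrices, then a forward reachability pass marks units accessible from the input and a backward pass marks units that can reach the output; connections failing either test are dropped. All function names here are hypothetical.

```python
import numpy as np

def magnitude_prune(weights, keep_ratio):
    """Keep the globally largest-magnitude fraction of weights as binary masks."""
    all_w = np.concatenate([np.abs(W).ravel() for W in weights])
    k = max(1, int(keep_ratio * all_w.size))
    thresh = np.sort(all_w)[-k]  # k-th largest magnitude overall
    return [(np.abs(W) >= thresh).astype(float) for W in weights]

def topologically_consistent(masks):
    """Zero out connections that are not both accessible (reachable from the
    input layer) and co-accessible (able to reach the output layer)."""
    # Forward pass: a unit is accessible if some kept connection feeds it
    # from an accessible unit. All input units are accessible by definition.
    acc = [np.ones(masks[0].shape[0], dtype=bool)]
    for M in masks:                      # M has shape (fan_in, fan_out)
        acc.append((M.T @ acc[-1]) > 0)
    # Backward pass: a unit is co-accessible if some kept connection leads
    # from it to a co-accessible unit. All output units are co-accessible.
    coacc = [np.ones(masks[-1].shape[1], dtype=bool)]
    for M in reversed(masks):
        coacc.append((M @ coacc[-1]) > 0)
    coacc = coacc[::-1]
    # A connection i -> j in layer l survives only if unit i is accessible
    # and unit j is co-accessible.
    return [M * np.outer(acc[l], coacc[l + 1]) for l, M in enumerate(masks)]
```

In a toy two-layer example where the largest weights feed a hidden unit with no kept outgoing connection (and vice versa), every surviving connection is accessible-only or co-accessible-only, so the consistent mask is empty: exactly the disconnection pathology under high pruning rates that TCMP is designed to prevent during training rather than detect post hoc.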
Submission Number: 40