Sparse optimization guided pruning for neural networks

Published: 01 Jan 2024 · Last Modified: 15 May 2025 · Neurocomputing 2024 · License: CC BY-SA 4.0
Abstract: Highlights
• We propose SOGP to strengthen the connection between pretraining and pruning.
• We propose a novel non-convex group sparse regularization, GTl1 (see the sketch after this list).
• We construct a unified optimization model that integrates pruning and fine-tuning.
• Extensive experiments validate the pruning and accuracy performance of SOGP.
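The highlights do not define GTl1 or the unified model; the abstract only names them. As an illustration only, the following minimal PyTorch sketch applies a transformed-l1 penalty to filter-wise group norms, which is one common way a non-convex group sparse regularizer is formed, and adds it to the training loss so that group sparsity is encouraged before pruning. The function name `group_tl1_penalty`, the shape parameter `a`, and the weight `lam` are assumptions for this sketch, not quantities taken from the paper.

```python
# Hypothetical sketch of a group transformed-l1 (TL1) penalty over the
# output-channel (filter) groups of convolutional layers. It assumes the
# usual TL1-on-group-norms form rho_a(t) = (a + 1) * t / (a + t); the
# paper's exact GTl1 definition may differ.
import torch
import torch.nn as nn


def group_tl1_penalty(model: nn.Module, a: float = 1.0) -> torch.Tensor:
    """Sum a transformed-l1 penalty over the l2 norm of each conv filter."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            # Treat each output channel (filter) as one group:
            # weight shape (C_out, C_in, k, k) -> (C_out, C_in * k * k).
            groups = module.weight.flatten(start_dim=1)
            norms = groups.norm(p=2, dim=1)
            penalty = penalty + ((a + 1.0) * norms / (a + norms)).sum()
    return penalty


# Usage inside a training step: add the penalty to the task loss so that
# filter groups are driven toward zero before they are pruned.
if __name__ == "__main__":
    model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(16, 32, 3, padding=1))
    x, y = torch.randn(4, 3, 32, 32), torch.randn(4, 32, 32, 32)
    criterion = nn.MSELoss()
    lam = 1e-4  # regularization strength (hypothetical value)
    loss = criterion(model(x), y) + lam * group_tl1_penalty(model, a=1.0)
    loss.backward()
```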