Green Pruning: Layer Interdependence-Aware CNN Pruning for Resource Efficiency

ICLR 2026 Conference Submission 21385 Authors

19 Sept 2025 (modified: 08 Oct 2025). License: CC BY 4.0
Keywords: Convolutional Neural Networks, Structured Filter Pruning, Model Compression Methods, Best Approximation, Resource Efficiency
Abstract: The rising computational demands of pruning algorithms for convolutional neural networks have raised concerns about their energy consumption and carbon footprint. We address these concerns from two perspectives. First, we introduce new evaluation metrics for pruning: a Resource Efficiency (RE) metric, which quantifies the computational cost required to achieve a target accuracy, and a system-agnostic framework for assessing the relative carbon efficiency of pruning algorithms. Together, these metrics enable fair and consistent comparisons of pruning methods with respect to both efficiency and sustainability. Second, we present a \textbf{green pruning technique}, a data-free method that explicitly models inter-layer dependencies to provide a more reliable filter selection criterion. To further minimize computational overhead, our approach incorporates a low-complexity oblivious algorithm that leverages weak submodularity, ensuring efficiency without requiring iterative passes over the dataset.
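The two algorithmic ideas in the abstract, a data-free filter criterion that accounts for inter-layer dependencies, and a one-shot "oblivious" selection justified by weak submodularity, can be illustrated with a minimal sketch. All function names and the specific scoring rule below are assumptions for illustration, not the authors' actual criterion:

```python
# Hypothetical sketch of dependency-aware, data-free filter pruning.
# Scoring rule (own norm x downstream norm) is an assumption; the
# paper's actual criterion is not given in the abstract.

def filter_scores(layer_filters, next_layer_channels):
    """Score each filter in layer l without any data.

    layer_filters: list of flattened weight lists, one per filter in layer l.
    next_layer_channels: for each filter i, the flattened layer-(l+1)
    weights that consume filter i's output channel (the inter-layer
    dependency term).
    """
    scores = []
    for own_w, next_w in zip(layer_filters, next_layer_channels):
        own = sum(w * w for w in own_w) ** 0.5    # filter's own L2 norm
        dep = sum(w * w for w in next_w) ** 0.5   # downstream dependency
        scores.append(own * dep)                  # joint importance
    return scores

def oblivious_select(scores, keep_ratio):
    """One-shot selection of the top-scoring filters.

    Sorting once by a fixed score (no re-evaluation, no dataset passes)
    is the kind of low-complexity oblivious strategy that weak
    submodularity of the objective can justify with approximation
    guarantees.
    """
    k = max(1, int(len(scores) * keep_ratio))
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(order[:k])  # indices of filters to keep
```

For example, a filter with a large norm whose output channel is barely used by the next layer receives a low joint score and is pruned first, which is the behavior a purely intra-layer norm criterion would miss.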
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 21385