PTCP: Alleviate Layer Collapse in Pruning at Initialization via Parameter Threshold Compensation and Preservation
Abstract: Over-parameterized neural networks achieve strong performance, but training such networks is computationally expensive. Pruning at initialization (PaI) avoids training a full network and has therefore attracted intense interest. However, at high compression ratios, layer collapse severely compromises the performance of PaI. Existing methods introduce operations such as iterative pruning to alleviate layer collapse, but these operations incur additional computation and memory costs. In this paper, we focus on alleviating layer collapse without increasing cost. To this end, we propose an efficient strategy called parameter threshold compensation. This strategy constrains the lower limit on the number of parameters in each layer and uses parameter transfer to compensate layers with fewer parameters. To promote a more balanced transfer of parameters, we further propose a parameter preservation strategy, which uses the average number of preserved parameters to more strongly constrain the layers whose parameters are reduced. We conduct extensive experiments with five pruning methods on the CIFAR-10 and CIFAR-100 datasets using the VGG16 and ResNet18 architectures, verifying the effectiveness of our strategy. Furthermore, we compare the improved performance with two state-of-the-art (SOTA) methods. The comparison results show that our strategy achieves similar performance, challenging the design of increasingly complex pruning strategies.
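The following is a minimal sketch, not the authors' implementation, of the compensation-and-preservation idea as described in the abstract: after a PaI method assigns per-layer keep counts, layers falling below a minimum threshold receive parameters transferred from better-budgeted layers, and donating layers are not reduced below the average keep count. The function name, the ratio-based threshold rule, and the one-parameter-per-step transfer are all assumptions made for illustration.

```python
import numpy as np

def compensate_keep_counts(keep, layer_sizes, min_ratio=0.01):
    """Adjust per-layer keep counts so every layer keeps at least
    `min_ratio` of its parameters, while preserving the total budget.
    (Hypothetical helper; thresholds and transfer rule are assumptions.)"""
    keep = np.array(keep, dtype=np.int64)
    sizes = np.array(layer_sizes, dtype=np.int64)
    floor = np.maximum(1, (min_ratio * sizes).astype(np.int64))

    deficit = np.maximum(floor - keep, 0).sum()   # parameters owed to starved layers
    keep = np.maximum(keep, floor)                # raise layers below the threshold

    # Donors: layers above the average keep count may give up parameters,
    # but are never pushed below that average (preservation constraint).
    avg = keep.mean()
    while deficit > 0:
        donors = np.where(keep > np.maximum(floor, avg))[0]
        if len(donors) == 0:
            break                                 # cannot rebalance further
        take = min(deficit, len(donors))
        keep[donors[:take]] -= 1                  # transfer one parameter from each donor
        deficit -= take
    return keep

# Example: three layers where a global top-k budget nearly collapsed layer 2.
sizes = [1000, 5000, 2000]
initial_keep = [400, 350, 3]
print(compensate_keep_counts(initial_keep, sizes))
```

The adjusted keep counts would then be realized by taking the per-layer top-k of the original pruning scores, so the sketch only rebalances budgets and leaves the underlying saliency criterion untouched.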