Distilled Pruning: Using Synthetic Data to Win the Lottery

Published: 06 Jul 2023, Last Modified: 15 Aug 2023, AutoML 2023 (Workshop)
TL;DR: This work introduces a novel approach to pruning deep learning models by using distilled data.
Keywords: Neural Network Pruning, Data Distillation, Model Compression
Optional Submission Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Steps For Environmental Footprint Reduction During Development: Distilled Pruning inherently reduces the cost of retraining during pruning development by using a smaller, synthetic dataset (a rough illustrative sketch follows the fields below).
CPU Hours: 0
GPU Hours: 150
TPU Hours: 0
Evaluation Metrics: Yes
Estimated CO2e Footprint: 6.3
Submission Number: 8
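
As a rough illustration of the cost reduction mentioned above, the sketch below shows one way iterative magnitude pruning with retraining on a small distilled dataset might look. It is an assumption-laden toy example, not the authors' implementation: the model architecture, pruning amount, number of rounds, and the random tensors standing in for distilled images and labels are all hypothetical placeholders.

# Minimal sketch (assumed, not the paper's code): prune by weight magnitude,
# then retrain on a small synthetic "distilled" dataset instead of the full set.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Hypothetical distilled dataset: random stand-ins for distilled images/labels.
distilled_x = torch.randn(100, 1, 28, 28)
distilled_y = torch.randint(0, 10, (100,))

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for round_idx in range(3):  # a few prune/retrain rounds
    # Prune 20% of the remaining weights in each linear layer by L1 magnitude.
    for module in model:
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=0.2)

    # Retrain only on the small distilled set; this is the cheap step that
    # replaces retraining on the full dataset.
    for epoch in range(10):
        optimizer.zero_grad()
        loss = loss_fn(model(distilled_x), distilled_y)
        loss.backward()
        optimizer.step()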