Efficient label-free pruning and retraining for Text-VQA Transformers

Published: 01 Jan 2024, Last Modified: 13 May 2025. Pattern Recognition Letters, 2024. License: CC BY-SA 4.0
Abstract: Highlights
• We study a label-free importance score for structured pruning of autoregressive Transformers.
• We propose an adaptive retraining approach for pruned Transformer models of varying sizes.
• Our pruned models achieve up to a 60% reduction in size with only a <2.4% drop in accuracy.
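The highlights do not spell out the exact importance score, so the following is only a minimal sketch of the general idea behind label-free structured pruning: score each attention head from activation statistics on unlabeled inputs (here, a hypothetical mean-L2-norm proxy) and keep the top fraction of heads. The function names and the score itself are illustrative assumptions, not the paper's method.

```python
import numpy as np

def head_importance(head_outputs):
    """Label-free importance proxy: mean L2 norm of each head's output
    activations over unlabeled inputs (illustrative assumption only).

    head_outputs: array of shape (num_heads, num_tokens, head_dim)
    returns: importance scores of shape (num_heads,)
    """
    return np.linalg.norm(head_outputs, axis=-1).mean(axis=-1)

def prune_heads(importance, keep_ratio):
    """Structured pruning: keep the top `keep_ratio` fraction of heads."""
    k = max(1, int(round(keep_ratio * importance.size)))
    return np.sort(np.argsort(importance)[::-1][:k])

rng = np.random.default_rng(0)
acts = rng.normal(size=(8, 16, 64))      # 8 heads, 16 tokens, head dim 64
imp = head_importance(acts)
kept = prune_heads(imp, keep_ratio=0.5)  # a 50% structured prune
print(kept)
```

Because the score needs only forward-pass activations, no Text-VQA answer labels are required to decide which structures to remove.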