The Simpler The Better: An Entropy-Based Importance Metric to Reduce Neural Networks’ Depth

Published: 21 Aug 2024, Last Modified: 05 Mar 2025 · Machine Learning and Knowledge Discovery in Databases. Research Track (ECML PKDD 2024) · CC BY 4.0
Abstract:

While deep neural networks are highly effective at solving complex tasks, large pre-trained models are commonly employed even for considerably simpler downstream tasks that do not necessarily require a large model's capacity. Motivated by the ever-growing environmental impact of AI, we propose an efficiency strategy that leverages the prior knowledge transferred by large models. Simple but effective, our method relies on an Entropy-bASed Importance mEtRic (EASIER) to reduce the depth of over-parametrized deep neural networks, alleviating their computational burden. We assess the effectiveness of our method on traditional image classification setups. Our code is available at https://github.com/VGCQ/EASIER.
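To make the idea of an entropy-based layer importance metric concrete, the sketch below illustrates one plausible instantiation: measuring the entropy of each neuron's binary ReLU state (firing vs. not firing) over a batch of inputs, and averaging per layer. A layer whose neurons are almost always on (or off) behaves nearly linearly and is a natural candidate for removal. This is an illustrative assumption about the metric, not the paper's exact algorithm; the function and variable names are hypothetical.

```python
import numpy as np

def layer_state_entropy(preacts):
    """Average binary entropy (in bits) of each neuron's ReLU on/off state.

    preacts: array of shape (num_samples, num_neurons) holding a layer's
    pre-activation values collected over a batch of inputs.
    """
    p_on = (preacts > 0).mean(axis=0)  # fraction of samples where each neuron fires
    eps = 1e-12                        # avoid log(0)
    h = -(p_on * np.log2(p_on + eps) + (1 - p_on) * np.log2(1 - p_on + eps))
    return float(h.mean())

# Toy pre-activations for three hypothetical layers over 1000 samples.
rng = np.random.default_rng(0)
layers = [
    rng.normal(0.0, 1.0, (1000, 64)),  # balanced on/off -> high entropy
    rng.normal(2.0, 1.0, (1000, 64)),  # mostly on -> lower entropy
    rng.normal(5.0, 1.0, (1000, 64)),  # almost always on -> near-zero entropy
]
scores = [layer_state_entropy(a) for a in layers]
least_important = int(np.argmin(scores))  # candidate layer to linearize/remove
```

In practice such pre-activations would be collected with forward hooks on a trained network; here they are synthetic, purely to show how the ranking would single out the most "linear" layer.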
