Estimating the Generalization in Deep Neural Networks via Sparsity

NeurIPS 2023 Workshop ATTRIB Submission 11 Authors

Published: 27 Oct 2023, Last Modified: 08 Dec 2023, ATTRIB Poster
Keywords: Generalization, Deep Neural Networks, Sparsity
TL;DR: This paper presents a novel method for estimating the generalization of DNNs based on sparsity properties of the training results.
Abstract: Generalization is a key capability of deep neural networks (DNNs). However, it is challenging to give a reliable measure of the generalization ability of a DNN from the network itself alone. In this paper, we propose a novel method for estimating the generalization gap based on network sparsity. Two key sparsity quantities are extracted from the training results alone, and they exhibit a close relationship with model generalization. A simple linear model over these two quantities is then constructed to give an accurate estimate of the generalization gap. By training DNNs with a wide range of generalization gaps on popular datasets, we show that our key quantities and linear model can be efficient tools for estimating the generalization gap of DNNs.
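The abstract does not specify the two sparsity quantities, but the overall pipeline it describes (extract two scalar quantities per trained network, then fit a linear model against observed generalization gaps) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code; `weight_sparsity` and `activation_sparsity` are hypothetical stand-ins for the paper's actual quantities.

```python
# Illustrative sketch only: the two sparsity quantities below are assumptions,
# not the ones defined in the paper.
import numpy as np

def weight_sparsity(weights, eps=1e-3):
    """Fraction of weights with magnitude below eps (hypothetical quantity 1)."""
    w = np.concatenate([np.ravel(x) for x in weights])
    return float(np.mean(np.abs(w) < eps))

def activation_sparsity(activations):
    """Fraction of exactly-zero (e.g., post-ReLU) activations (hypothetical quantity 2)."""
    a = np.concatenate([np.ravel(x) for x in activations])
    return float(np.mean(a == 0.0))

def fit_gap_estimator(q1, q2, gaps):
    """Least-squares fit of gap ~ a*q1 + b*q2 + c across a set of trained models."""
    X = np.stack([np.asarray(q1), np.asarray(q2), np.ones(len(q1))], axis=1)
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(gaps), rcond=None)
    return coeffs  # (a, b, c)

def estimate_gap(coeffs, q1, q2):
    """Predict the generalization gap of a new model from its two quantities."""
    a, b, c = coeffs
    return a * q1 + b * q2 + c
```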
Submission Number: 11