Predicting generalization with degrees of freedom in neural networks

Published: 15 Jun 2022, Last Modified: 05 May 2023 · ICML-AI4Science Poster
Keywords: generalization, compression
Abstract: Model complexity is fundamentally tied to predictive power, both in the sciences and in applications. However, naive measures of complexity such as parameter count diverge from the generalization performance of over-parameterized machine learning models. Prior empirical approaches to capturing intrinsic complexity more faithfully than parameter count are computationally intractable, do not capture the implicitly regularizing effects of the entire machine-learning pipeline, or do not provide a quantitative fit to the double-descent behavior of over-parameterized models. In this work, we introduce an empirical complexity measure inspired by the classical notion of generalized degrees of freedom in statistics. This measure can be approximated efficiently and is a function of the entire machine-learning training pipeline. We demonstrate that this measure correlates with generalization performance in the double-descent regime.
Track: Original Research Track
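The abstract invokes the classical notion of generalized degrees of freedom (GDF), which for a fitting procedure with in-sample predictions ŷ(y) is the summed sensitivity of each fitted value to its own target, Σ_i ∂E[ŷ_i]/∂y_i, and is commonly estimated by perturbing the targets with small Gaussian noise and refitting. The sketch below illustrates only that classical Monte Carlo estimator, not the measure proposed in the paper; ridge regression stands in for the training pipeline purely as an assumed example, with ridge's closed-form effective degrees of freedom tr(H) used as a sanity check.

```python
# Minimal sketch of the classical generalized-degrees-of-freedom (GDF) estimator:
# perturb the training targets with small Gaussian noise, refit, and measure how
# much the fitted values move with the perturbation. Ridge regression is an
# assumed stand-in for the learning pipeline; it is not the paper's method.
import numpy as np

def fit_predict_ridge(X, y, lam=1.0):
    """Fit ridge regression on (X, y) and return in-sample predictions."""
    d = X.shape[1]
    w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
    return X @ w

def monte_carlo_gdf(fit_predict, X, y, tau=0.1, n_samples=50, seed=0):
    """Estimate GDF = sum_i dE[yhat_i]/dy_i by finite perturbation.

    Each round adds noise delta ~ N(0, tau^2 I) to the targets, refits, and
    accumulates delta^T (yhat(y + delta) - yhat(y)) / tau^2, whose expectation
    is the trace of the Jacobian of the fitting map (the GDF).
    """
    rng = np.random.default_rng(seed)
    base = fit_predict(X, y)
    estimates = []
    for _ in range(n_samples):
        delta = tau * rng.standard_normal(len(y))
        perturbed = fit_predict(X, y + delta)
        estimates.append(delta @ (perturbed - base) / tau**2)
    return float(np.mean(estimates))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 20))
    y = X @ rng.standard_normal(20) + 0.5 * rng.standard_normal(200)

    gdf = monte_carlo_gdf(fit_predict_ridge, X, y)
    # Sanity check: closed-form effective dof of ridge, tr(X (X^T X + lam I)^{-1} X^T).
    H = X @ np.linalg.solve(X.T @ X + np.eye(20), X.T)
    print(f"Monte Carlo GDF ~ {gdf:.2f}, closed-form trace ~ {np.trace(H):.2f}")
```

Because the estimator only requires refitting the model on perturbed targets, the same recipe applies in principle to any black-box training pipeline, which is the sense in which a GDF-style measure can reflect the implicit regularization of the full pipeline rather than the raw parameter count.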