PAC-Bayes and Information Complexity

Mar 04, 2021 (edited Apr 02, 2021) · Neural Compression Workshop @ ICLR 2021
  • Keywords: PAC-Bayes, generalization bounds, Gibbs algorithm, mutual information, information complexity, Entropy-SGD, flat minima
  • TL;DR: We present a common framework for deriving PAC-Bayesian and information-theoretic generalization bounds.
  • Abstract: We point out that a number of well-known PAC-Bayesian-style and information-theoretic generalization bounds for randomized learning algorithms can be derived under a common framework, starting from a fundamental information exponential inequality. We also obtain new bounds for data-dependent priors and unbounded loss functions. Optimizing these bounds naturally gives rise to a method called Information Complexity Minimization, for which we discuss two practical examples for learning with neural networks, namely Entropy-SGD and PAC-Bayes-SGD.
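For context, a representative member of the family of bounds the abstract refers to is the classical PAC-Bayesian bound in McAllester's form (stated here as standard background, not taken from this paper): for a prior $\pi$ fixed before seeing the sample and any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $\rho$,

```latex
\mathbb{E}_{h \sim \rho}\!\left[ L_{\mathcal{D}}(h) \right]
\;\le\;
\mathbb{E}_{h \sim \rho}\!\left[ \hat{L}_{S}(h) \right]
+ \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\frac{2\sqrt{n}}{\delta}}{2n}}
```

Here $L_{\mathcal{D}}$ is the population risk, $\hat{L}_{S}$ the empirical risk on $S$, and $\mathrm{KL}(\rho\,\|\,\pi)$ the complexity term; minimizing the right-hand side over $\rho$ is the kind of objective the abstract's Information Complexity Minimization refers to.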