- Keywords: Energy-based models, Feature diversity, Generalization
- TL;DR: We extend the probably approximately correct (PAC) theory of EBMs and analyze the effect of feature diversity on the performance of EBMs in different contexts.
- Abstract: Energy-based learning is a powerful learning paradigm that encapsulates various discriminative and generative approaches. An energy-based model (EBM) is typically composed of one or more inner models that learn a combination of different features to generate an energy mapping for each input configuration. In this paper, we focus on the diversity of the produced feature set. We extend the probably approximately correct (PAC) theory of EBMs and analyze the effect of feature diversity on the performance of EBMs. We derive generalization bounds for various learning contexts, namely regression, classification, and implicit regression, with different energy functions, and we show that increasing the diversity of the feature set consistently decreases the gap between the true and empirical expectation of the energy and boosts the performance of the model.
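To make the setting concrete, the following is a minimal illustrative sketch (not the paper's actual model) of an EBM whose energy is a weighted combination of learned features, together with one plausible way to quantify the diversity of the feature set as the average pairwise distance between feature responses over a batch. The feature maps, weights, and the specific diversity measure here are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inner model: D features, each a random linear projection
# followed by a tanh nonlinearity (an assumption for illustration only).
D, in_dim = 8, 4
W = rng.normal(size=(D, in_dim))   # one row per feature / inner model
alpha = rng.normal(size=D)         # combination weights

def features(x):
    """Feature set phi(x) in R^D for a single input x."""
    return np.tanh(W @ x)

def energy(x):
    """Energy mapping E(x) = sum_d alpha_d * phi_d(x)."""
    return float(alpha @ features(x))

def diversity(X):
    """One possible diversity measure: average pairwise distance between
    the D feature-response vectors over a batch X; larger values indicate
    a more diverse feature set."""
    Phi = np.array([features(x) for x in X])  # shape (n, D)
    dists = [np.linalg.norm(Phi[:, i] - Phi[:, j])
             for i in range(D) for j in range(i + 1, D)]
    return float(np.mean(dists))
```

Under this toy setup, one could compare feature sets with different diversity values and inspect the resulting train/test energy gap, mirroring the trend the bounds predict.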