Investigating the relationship between diversity and generalization in deep neural networks

Published: 05 Nov 2025, Last Modified: 05 Nov 2025
Venue: NLDL 2026 Poster
License: CC BY 4.0
Keywords: implicit ensembles, node-level classifiers, diversity, regularization, generalization
TL;DR: Viewing a single network as an implicit ensemble, diversity rises with depth and, when induced (via dropout, dropconnect, or batch size), correlates with test accuracy. Disagreement correlates most strongly; double-fault correlates inversely; Q-statistic and entropy are dataset-dependent.
Abstract: In ensembles, improved generalization is frequently attributed to \emph{diversity} among members of the ensemble. By viewing a single neural network as an implicit ensemble, we apply well-known ensemble diversity measures to study the relationship between diversity and generalization in artificial neural networks. Our results show that i) deeper layers of the network exhibit higher levels of diversity and ii) layer-wise accuracy positively correlates with diversity. Additionally, we study the effects of well-known regularization techniques, such as Dropout and DropConnect, as well as batch size, on diversity and generalization. We generally find that increasing the strength of the regularizer increases the diversity in the neural network, and that this increase in diversity is positively correlated with model accuracy. We show that these results hold for several benchmark datasets (such as Fashion-MNIST and CIFAR-10) and architectures (MLPs and CNNs). Our findings suggest new avenues of research into the generalization ability of deep neural networks.
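The abstract does not spell out how the node-level classifiers or the diversity measures are computed. Below is a minimal sketch, assuming each hidden unit is turned into a binary classifier via a median-threshold decision stump on its activations (a hypothetical readout for illustration, not necessarily the authors' construction), and using the standard pairwise disagreement, double-fault, and Q-statistic definitions from the ensemble-diversity literature. The activations here are synthetic stand-ins for a trained network's hidden layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: activations of M hidden units on N examples with
# binary labels. In the paper's setting these would come from a trained
# network's hidden layer.
M, N = 8, 1000
y = rng.integers(0, 2, size=N)
acts = rng.normal(size=(M, N)) + 0.8 * y  # units weakly correlated with label

def stump_predictions(a, y):
    """Turn one unit's activations into a binary node-level classifier
    via a median-threshold decision stump (hypothetical readout)."""
    t = np.median(a)
    hi = a > t
    # Each side of the threshold predicts the majority label on that side.
    hi_label = int(np.mean(y[hi]) > 0.5) if hi.any() else 0
    lo_label = int(np.mean(y[~hi]) > 0.5) if (~hi).any() else 1
    return np.where(hi, hi_label, lo_label)

# (M, N) boolean matrix: entry [m, n] is True if unit m classifies example n correctly.
correct = np.stack([stump_predictions(acts[m], y) == y for m in range(M)])

def pairwise_diversity(correct):
    """Mean pairwise disagreement, double-fault, and Q-statistic over all
    pairs of members, from an (M, N) boolean correctness matrix."""
    M, N = correct.shape
    dis, df, q = [], [], []
    for i in range(M):
        for j in range(i + 1, M):
            ci, cj = correct[i], correct[j]
            n11 = int(np.sum(ci & cj))    # both correct
            n00 = int(np.sum(~ci & ~cj))  # both wrong
            n10 = int(np.sum(ci & ~cj))   # i correct, j wrong
            n01 = int(np.sum(~ci & cj))   # j correct, i wrong
            dis.append((n10 + n01) / N)   # disagreement
            df.append(n00 / N)            # double-fault
            denom = n11 * n00 + n01 * n10
            if denom > 0:                 # Q is undefined when denom == 0
                q.append((n11 * n00 - n01 * n10) / denom)
    return np.mean(dis), np.mean(df), np.mean(q)

print("disagreement=%.3f double-fault=%.3f Q=%.3f" % pairwise_diversity(correct))
```

Pairs for which the Q-statistic denominator is zero are skipped; any per-node readout (e.g., a fitted logistic regression instead of a stump) could be substituted without changing the diversity computations.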
Serve As Reviewer: ~Randle_Rabe1
Submission Number: 60