Keywords: Regularization, Large Class Size, Computational Learning Theory
TL;DR: Characteristic function-based regularization
Abstract: Regularization plays a crucial role in neural network training by preventing overfitting and improving generalization. In this paper, we introduce a novel regularization technique grounded in the properties of characteristic functions, leveraging assumptions from decomposable distributions and the central limit theorem. Rather than replacing traditional regularization methods such as L2 or dropout, our approach is designed to supplement them, providing an additional, context-dependent improvement in generalization. We demonstrate that integrating this method into standard architectures improves performance on benchmark datasets by preserving essential distributional properties and mitigating the risk of overfitting. This characteristic function-based regularization offers a new perspective on distribution-aware learning in machine learning models.
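As a rough illustration of the general idea (not the paper's actual method, which the abstract does not specify in detail), a characteristic-function-based penalty can be sketched as the distance between the empirical characteristic function of a layer's activations and the characteristic function of a reference Gaussian, motivated by the central limit theorem. All function names and the choice of frequency grid below are hypothetical, assuming NumPy:

```python
import numpy as np

def empirical_cf(x, t):
    # Empirical characteristic function of samples x at frequencies t:
    # phi_hat(t) = (1/n) * sum_j exp(i * t * x_j)
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

def gaussian_cf(t, mu=0.0, sigma=1.0):
    # Characteristic function of N(mu, sigma^2): exp(i*t*mu - sigma^2*t^2/2)
    return np.exp(1j * t * mu - 0.5 * (sigma * t) ** 2)

def cf_regularizer(activations, freqs, mu=0.0, sigma=1.0):
    # Penalty: mean squared modulus of the gap between the empirical CF
    # of the activations and the CF of a reference Gaussian (CLT assumption).
    phi_emp = empirical_cf(activations, freqs)
    phi_ref = gaussian_cf(freqs, mu, sigma)
    return float(np.mean(np.abs(phi_emp - phi_ref) ** 2))
```

In this sketch the penalty is near zero when the activations are approximately standard normal and grows as their distribution departs from the reference, so it could be added to a training loss with a weighting coefficient like any other regularization term.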
Primary Area: learning theory
Submission Number: 3433