Consistent Nonparametric Density Estimation with Neural Networks: A Unary Classification Approach

Published: 15 Mar 2026, Last Modified: 15 Mar 2026, Oral, CC BY 4.0
Keywords: density estimation, unary classification, neural networks, multilayer perceptron
Abstract: A consistent nonparametric method for probability density estimation using neural networks is presented. The core idea is to train a multilayer perceptron (MLP) with piecewise linear hidden activations as a unary classifier, distinguishing true data from uniform background noise. The learned classifier's output probability is analytically transformed into a density estimate, with statistical reliability guaranteed by the theory of unary classification. This framework provides a flexible, neural network-based alternative to classical nonparametric estimators such as histograms, kernel density methods, and k-nearest neighbors (k-NN). Once trained, the estimator has constant time complexity O(1) per evaluation, independent of the training set size. The proposed estimator is quantitatively evaluated on synthetic distributions with known analytical forms, using Kullback-Leibler (KL) divergence as the principal performance metric. Experimental results demonstrate that the neural density estimator not only enjoys a solid theoretical foundation but also achieves lower KL divergence than standard baselines, establishing it as a powerful and trustworthy tool for nonparametric density estimation.
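The classifier-to-density transform described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard relation that a classifier trained to separate data (label 1) from equally many uniform background samples (label 0) outputs p(x) ≈ f(x)/(f(x) + u), where u is the uniform density on the support, so f(x) can be recovered as u · p/(1 − p). The support bounds, network size, and use of scikit-learn's `MLPClassifier` (whose ReLU activation is piecewise linear) are all assumptions for the sketch.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# True data: standard normal samples, kept inside an assumed support [-4, 4]
data = rng.normal(0.0, 1.0, size=2000)
data = data[np.abs(data) < 4.0][:1500].reshape(-1, 1)
n = len(data)

# Background: uniform noise on the same support, equal sample size
lo, hi = -4.0, 4.0
noise = rng.uniform(lo, hi, size=(n, 1))
u = 1.0 / (hi - lo)  # uniform density on the support

# Unary classification: real data = 1, uniform background = 0
X = np.vstack([data, noise])
y = np.concatenate([np.ones(n), np.zeros(n)])
clf = MLPClassifier(hidden_layer_sizes=(64, 64), activation="relu",
                    max_iter=1000, random_state=0).fit(X, y)

def density(x):
    """Analytic transform of the classifier output into a density estimate."""
    p = clf.predict_proba(np.asarray(x, dtype=float).reshape(-1, 1))[:, 1]
    p = np.clip(p, 1e-6, 1 - 1e-6)  # guard against division by zero
    return u * p / (1.0 - p)

# Each evaluation is a fixed-size forward pass: O(1) in the training set size
print(density(np.linspace(-3, 3, 7)))
```

The printed values should roughly track the standard normal pdf; accuracy improves with more samples and training iterations.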
Submission Number: 11