Risk Bounds for Mixture Density Estimation on Compact Domains via the $h$-Lifted Kullback--Leibler Divergence
Abstract: We consider the problem of estimating probability density functions from sample data using a finite mixture of densities from some component class. To this end, we introduce the $h$-lifted Kullback--Leibler (KL) divergence as a generalization of the standard KL divergence and a criterion for conducting risk minimization. Under a compact support assumption, we prove an $O(1/\sqrt{n})$ bound on the expected estimation error when using the $h$-lifted KL divergence, which extends the results of Rakhlin et al. (2005, ESAIM: Probability and Statistics, Vol. 9) and Li and Barron (1999, Advances in Neural Information Processing Systems, Vol. 12) to permit the risk bounding of density functions that are not strictly positive. We develop a procedure for computing the corresponding maximum $h$-lifted likelihood estimators ($h$-MLLEs) using the Majorization-Maximization framework and provide experimental results in support of our theoretical bounds.
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=KD1WTwEXHy&noteId=BKzBMK9ldu
Changes Since Last Submission: The manuscript has been anonymized and formatted consistently with the TMLR submission format.
Assigned Action Editor: ~Yair_Carmon1
Submission Number: 2752
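To give a concrete feel for the criterion described in the abstract, the following is a minimal numerical sketch, not the submission's implementation. It assumes a lifted divergence of the form $D_h(f \| g) = \int (f + h)\log\bigl((f + h)/(g + h)\bigr)\,dx$ for a fixed reference density $h$ on the compact domain (the precise definition is given in the paper, not in this abstract), and evaluates it on a grid for a target density and a two-component mixture. The names `h_lifted_kl`, `trapezoid`, and `mixture` are illustrative, not from the paper.

```python
import numpy as np

# Illustrative sketch only: the abstract does not spell out the h-lifted KL
# divergence. We ASSUME a lifted form of the type
#   D_h(f || g) = \int (f + h) log((f + h) / (g + h)) dx,
# with h a fixed reference density on the compact domain, which reduces to
# the ordinary KL divergence as h -> 0 and stays finite where g vanishes.

def trapezoid(y, x):
    """Trapezoidal rule on a 1-D grid."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def h_lifted_kl(f, g, h, grid):
    """Numerically approximate the assumed h-lifted KL divergence D_h(f || g)."""
    fx, gx, hx = f(grid), g(grid), h(grid)
    integrand = (fx + hx) * np.log((fx + hx) / (gx + hx))
    return trapezoid(integrand, grid)

if __name__ == "__main__":
    grid = np.linspace(0.0, 1.0, 2001)      # compact support [0, 1]
    h = lambda x: np.ones_like(x)           # uniform reference density on [0, 1]
    f = lambda x: 6.0 * x * (1.0 - x)       # target density: Beta(2, 2)

    def mixture(x, w=0.5, means=(0.3, 0.7), scale=0.15):
        """Two-component Gaussian mixture restricted to [0, 1]."""
        comps = [np.exp(-0.5 * ((x - m) / scale) ** 2) for m in means]
        comps = [c / trapezoid(c, x) for c in comps]  # normalize on the grid
        return w * comps[0] + (1.0 - w) * comps[1]

    print("assumed D_h(f || mixture):", h_lifted_kl(f, mixture, h, grid))
    print("sanity check D_h(f || f):", h_lifted_kl(f, f, h, grid))  # should be 0
```

Because the target density vanishes at the endpoints of $[0, 1]$ while $f + h$ does not, the lifted integrand stays well defined there, which is the kind of non-strictly-positive situation the abstract says the bound is meant to cover.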