An Error Analysis of Deep Density-Ratio Estimation with Bregman Divergence

16 May 2022 (modified: 05 May 2023), NeurIPS 2022 Submission
Keywords: Curse of dimensionality, error analysis, KL divergence, telescoping density-ratio estimator
TL;DR: This paper establishes non-asymptotic error bounds for nonparametric density-ratio estimators using deep neural networks with the Bregman divergence.
Abstract: We establish non-asymptotic error bounds for a nonparametric density-ratio estimator using deep neural networks with the Bregman divergence. We also show that the deep density-ratio estimator can mitigate the curse of dimensionality when the data are supported on an approximate low-dimensional manifold. Our error bounds are optimal in the minimax sense, and their pre-factors depend only polynomially on the dimensionality of the data. We apply our results to investigate the convergence properties of the telescoping density-ratio estimator (Rhodes et al., 2020) and provide sufficient conditions under which it admits a smaller upper error bound than a single-ratio estimator.
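The abstract's two key ingredients can be sketched as follows; the notation below is ours, not necessarily the paper's.

Bregman-divergence density-ratio estimation (a minimal sketch of the standard formulation, assuming a strictly convex, differentiable $f$ and true ratio $r^*(x) = p(x)/q(x)$):

\[
B_f(r^* \,\|\, r) = \mathbb{E}_{x \sim q}\!\left[ f(r^*(x)) - f(r(x)) - f'(r(x))\,\big(r^*(x) - r(x)\big) \right].
\]

Dropping the term that does not depend on the model $r$ and using $q(x)\,r^*(x) = p(x)$, fitting a deep network $r$ amounts to minimizing the empirical version of

\[
\mathbb{E}_{x \sim q}\!\left[ f'(r(x))\,r(x) - f(r(x)) \right] - \mathbb{E}_{x \sim p}\!\left[ f'(r(x)) \right].
\]

For instance, $f(t) = t\log t - t$ yields the KL-type objective $\mathbb{E}_q[r(x)] - \mathbb{E}_p[\log r(x)]$, and $f(t) = (t-1)^2/2$ yields least-squares ratio fitting.

Telescoping density-ratio estimation (Rhodes et al., 2020), again in our notation: choose intermediate densities $p_0 = p, p_1, \ldots, p_m = q$ bridging $p$ and $q$, estimate each neighboring ratio with its own deep estimator $\hat{r}_k$, and multiply:

\[
\frac{p(x)}{q(x)} = \prod_{k=0}^{m-1} \frac{p_k(x)}{p_{k+1}(x)} \approx \prod_{k=0}^{m-1} \hat{r}_k(x).
\]

Each intermediate ratio is typically much closer to 1 than $p/q$ itself, so each stage is easier to estimate; the paper's analysis gives sufficient conditions under which the accumulated per-stage errors still produce a smaller overall upper bound than a single-ratio estimator.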