Keywords: density ratio estimation, variational divergence optimization, Kullback–Leibler divergence, $f$-divergence, $L_p$ error, curse of dimensionality, GAN
TL;DR: We provide both lower and upper bounds on $L_p$ errors in DRE that hold for any member of a class of Lipschitz continuous estimators, regardless of the specific $f$-divergence loss function used.
Abstract: Density ratio estimation (DRE) is a core technique in machine learning used to capture relationships between two probability distributions.
$f$-divergence loss functions, derived from variational representations of $f$-divergences, have become a standard choice in DRE for achieving state-of-the-art performance.
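For reference, a standard variational (Fenchel-dual) representation from which such loss functions are derived is the following; this is the well-known general form, and the paper's exact formulation is assumed to be of this type:
$$D_f(P \,\|\, Q) \;=\; \sup_{T} \Big\{ \mathbb{E}_{x \sim P}\big[T(x)\big] - \mathbb{E}_{x \sim Q}\big[f^{*}(T(x))\big] \Big\},$$
where $f^{*}$ is the convex conjugate of $f$; the supremum is attained at $T^{*}(x) = f'\big(r(x)\big)$ with $r = \mathrm{d}P/\mathrm{d}Q$, which is what makes the maximizing critic a density ratio estimator.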
This study provides novel theoretical insights into DRE by deriving upper and lower bounds on the $L_p$ errors of estimators trained with $f$-divergence loss functions.
These bounds apply to any estimator belonging to a class of Lipschitz continuous estimators, irrespective of the specific $f$-divergence loss function employed.
The derived bounds are expressed as a product involving the data dimensionality and the expected value of the density ratio raised to the $p$-th power.
Notably, the lower bound includes an exponential term that depends on the Kullback–Leibler (KL) divergence, revealing that the $L_p$ error increases significantly as the KL divergence grows when $p > 1$.
This increase becomes even more pronounced as the value of $p$ grows.
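A one-line calculation (a standard identity, not the paper's bound itself) makes this exponential dependence plausible: writing $r = \mathrm{d}P/\mathrm{d}Q$,
$$\mathbb{E}_{x \sim Q}\big[r(x)^{p}\big] = \exp\!\big((p-1)\, R_{p}(P \,\|\, Q)\big) \;\geq\; \exp\!\big((p-1)\, \mathrm{KL}(P \,\|\, Q)\big) \quad \text{for } p > 1,$$
since the Rényi divergence $R_{\alpha}$ is non-decreasing in $\alpha$ with $R_{1} = \mathrm{KL}$. Hence the expected $p$-th power of the density ratio, and any bound built from it, grows exponentially in the KL divergence, and faster for larger $p$.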
The theoretical insights are validated through numerical experiments.
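As a concrete illustration of the recipe the abstract describes, below is a minimal, hypothetical PyTorch sketch of DRE with the variational KL loss on toy 1-D Gaussians. The network, data, and hyperparameters are assumptions chosen for illustration; the paper's actual estimator class, loss family, and experiments are not reproduced here.

```python
# Minimal sketch of density ratio estimation (DRE) with an f-divergence loss.
# Hypothetical illustration only -- not the authors' estimator or experiments.
# Uses the variational KL loss: minimize  -E_P[T(x)] + E_Q[exp(T(x) - 1)],
# whose minimizer satisfies T*(x) = 1 + log r(x), so r_hat(x) = exp(T(x) - 1).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: P = N(1, 1), Q = N(0, 1), so the true ratio is known in closed form.
n = 4096
x_p = torch.randn(n, 1) + 1.0  # samples from P
x_q = torch.randn(n, 1)        # samples from Q

# A small critic network T. Spectral norm caps each layer's Lipschitz constant,
# loosely mimicking a Lipschitz continuous estimator class (assumption, not the
# paper's precise class).
critic = nn.Sequential(
    nn.utils.parametrizations.spectral_norm(nn.Linear(1, 64)),
    nn.ReLU(),
    nn.utils.parametrizations.spectral_norm(nn.Linear(64, 1)),
)
opt = torch.optim.Adam(critic.parameters(), lr=1e-3)

for step in range(2000):
    opt.zero_grad()
    # Empirical variational KL objective (negated, since we minimize).
    loss = -critic(x_p).mean() + torch.exp(critic(x_q) - 1.0).mean()
    loss.backward()
    opt.step()

# Density-ratio estimate versus the analytic ratio at a test point.
x = torch.tensor([[0.5]])
r_hat = torch.exp(critic(x) - 1.0).item()
r_true = torch.exp(x - 0.5).item()  # N(1,1)/N(0,1) = exp(x - 1/2)
print(f"estimated ratio: {r_hat:.3f}, true ratio: {r_true:.3f}")
```

The same skeleton accommodates any $f$-divergence: swapping in a different convex conjugate $f^{*}$ in the loss changes the divergence being optimized, while the bounds discussed in the abstract are stated to hold irrespective of that choice.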
Supplementary Material: zip
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12415