Adaptive Convergence Rates for Log-Concave Maximum Likelihood
TL;DR: We show that for every dimension $d \geq 4$, the log-concave MLE adapts to densities whose negative log-density is a maximum of $k$ affine functions, attaining a rate faster than the minimax rate for the full log-concave class.
Abstract: We study the problem of estimating a log-concave density on $\mathbb{R}^d$ with the maximum likelihood estimator, known as the log-concave MLE. We show that for every $d \geq 4$, the log-concave MLE attains an \emph{adaptive rate} when the negative logarithm of the underlying density is the maximum of $k$ affine functions: for such densities, the estimation error is significantly smaller than the minimax rate over the class of all log-concave densities. Specifically, we prove that for such densities, the risk of the log-concave MLE is of order $c(k) \cdot n^{-\frac{4}{d}}$ in squared Hellinger distance. This result complements the work of Kim et al. (AoS 2018) and Feng et al. (AoS 2021), who addressed the cases $d = 1$ and $d \in \{2,3\}$, respectively. Our proof provides a unified and relatively simple approach for all $d \geq 1$, based on techniques from stochastic convex geometry and empirical process theory that may be of independent interest.
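As a minimal illustration of this density class, in our own notation: the condition means the density has the form
$$f(x) \propto \exp\Big(-\max_{1 \le i \le k} \big(a_i^\top x + b_i\big)\Big), \qquad a_i \in \mathbb{R}^d,\ b_i \in \mathbb{R}.$$
For example, with $d = 1$, $k = 2$, $a_1 = 1$, $a_2 = -1$, and $b_1 = b_2 = \log 2$, one gets $-\log f(x) = |x| + \log 2$, i.e., the standard Laplace density $f(x) = \tfrac{1}{2} e^{-|x|}$.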
Submission Number: 513