TL;DR: This paper characterizes when a convex M-estimation or stochastic optimization problem is solvable without making any assumptions on the underlying data distribution.
Abstract: The basic question of delineating those statistical problems that
are solvable without making any assumptions on the underlying
data distribution has long animated statistics and learning theory.
This paper characterizes when a convex M-estimation or
stochastic optimization problem is solvable in such an assumption-free
setting, providing a precise dividing line between solvable and unsolvable
problems.
The conditions we identify show that Lipschitz
continuity of the loss being minimized is not necessary for distribution-free
minimization, and they are also distinct from classical characterizations
of learnability in machine learning.
Code Dataset Promise: No
Signed Copyright Form: pdf
Format Confirmation: I agree that I have read and followed the formatting instructions for the camera ready version.
Submission Number: 1562