Sparse Contextual CDF Regression

Published: 08 Jul 2024, Last Modified: 08 Jul 2024. Accepted by TMLR.
Abstract: Estimating cumulative distribution functions (CDFs) of context-dependent random variables is a central statistical task underpinning numerous applications in machine learning and economics. In this work, we extend a recent line of theoretical inquiry into this domain by analyzing the problem of \emph{sparse contextual CDF regression}, wherein data points are sampled from a convex combination of $s$ context-dependent CDFs chosen from a set of $d$ basis functions. We show that adaptations of several canonical regression methods serve as tractable estimators in this functional sparse regression setting under standard assumptions on the conditioning of the basis functions. In particular, given $n$ data samples, we prove estimation error upper bounds of $\tilde{O}(\sqrt{s/n})$ for functional versions of the lasso and Dantzig selector estimators, and $\tilde{O}(\sqrt{s}/\sqrt[4]{n})$ for a functional version of the elastic net estimator. Our results match the corresponding error bounds for finite-dimensional regression and improve upon CDF ridge regression, which has $\tilde{O}(\sqrt{d/n})$ estimation error. Finally, we obtain a matching information-theoretic lower bound that establishes the minimax optimality of the lasso and Dantzig selector estimators up to logarithmic factors.
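For a concrete picture of the setup, the sketch below illustrates the model described in the abstract: the conditional CDF is a convex combination $F(t \mid x) = \sum_{j=1}^{d} w_j F_j(t \mid x)$ with an $s$-sparse weight vector $w$. This is a minimal illustration only, not the estimator analyzed in the paper nor the code in the linked repository: it discretizes the CDF loss over a grid of thresholds and fits the weights with an off-the-shelf L1-penalized least-squares solver in place of the functional lasso. The function name `fit_sparse_cdf_weights`, the Gaussian basis CDFs, and the penalty level are all hypothetical choices made for this sketch.

```python
# Minimal sketch (not the paper's estimator) of a discretized functional
# lasso for sparse contextual CDF regression. Assumptions: the d basis CDFs
# are given as callables F_j(t, x), data are (x_i, y_i) pairs, and the
# convex-combination (simplex) constraint is approximated by a nonnegative
# L1 penalty followed by renormalization.
import numpy as np
from sklearn.linear_model import Lasso

def fit_sparse_cdf_weights(basis_cdfs, X, y, t_grid, alpha=0.01):
    """Estimate sparse mixture weights over the basis CDFs.

    basis_cdfs: list of d callables mapping (t, x) -> CDF value in [0, 1]
    X: (n, p) array of contexts; y: (n,) array of responses
    t_grid: (m,) grid of thresholds used to discretize the CDF loss
    """
    n, d = len(y), len(basis_cdfs)
    m = len(t_grid)
    # Design matrix: one row per (sample, threshold) pair, one column per basis CDF.
    A = np.empty((n * m, d))
    for j, F in enumerate(basis_cdfs):
        A[:, j] = np.array([F(t, X[i]) for i in range(n) for t in t_grid])
    # Targets: empirical CDF indicators 1{y_i <= t}.
    b = np.array([float(y[i] <= t) for i in range(n) for t in t_grid])
    # Nonnegative L1-penalized least squares stands in for the functional lasso objective.
    model = Lasso(alpha=alpha, fit_intercept=False, positive=True)
    model.fit(A, b)
    w = model.coef_
    return w / w.sum() if w.sum() > 0 else w  # renormalize toward a convex combination

# Toy usage with hypothetical Gaussian basis CDFs whose means depend on the context.
if __name__ == "__main__":
    from scipy.stats import norm
    rng = np.random.default_rng(0)
    d, n = 5, 200
    basis = [lambda t, x, j=j: norm.cdf(t, loc=j * x[0], scale=1.0) for j in range(d)]
    X = rng.uniform(0.5, 1.5, size=(n, 1))
    # True mixture uses only basis functions 1 and 3 (so s = 2).
    comps = rng.choice([1, 3], size=n)
    y = rng.normal(loc=comps * X[:, 0], scale=1.0)
    t_grid = np.linspace(-3.0, 8.0, 25)
    print(fit_sparse_cdf_weights(basis, X, y, t_grid, alpha=0.005))
```

In this sketch the recovered weight vector should concentrate most of its mass on the two active basis functions; the discretization grid and penalty level trade off fidelity to the functional objective against computation.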
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/enchainingrealm/SparseContextualCDFRegression
Assigned Action Editor: ~Lihong_Li1
Submission Number: 2336