Concentration inequalities under sub-Gaussian and sub-exponential conditions

21 May 2021, 20:44 (modified: 22 Jan 2022, 13:31), NeurIPS 2021 Poster
Keywords: Statistical Learning Theory, Concentration Inequalities
Abstract: We prove analogues of the popular bounded difference inequality (also called McDiarmid's inequality) for functions of independent random variables under sub-Gaussian and sub-exponential conditions. Applied to vector-valued concentration and the method of Rademacher complexities, these inequalities allow an easy extension of uniform convergence results for PCA and linear regression to the case of potentially unbounded input and output variables.
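For reference, the classical bounded difference (McDiarmid) inequality that the paper generalizes can be stated as follows; this is the standard textbook form, not the paper's extended version:

```latex
% McDiarmid's bounded difference inequality (classical form).
% Let X_1, \dots, X_n be independent random variables and let
% f : \mathcal{X}^n \to \mathbb{R} satisfy, for all i and all
% x_1, \dots, x_n, x_i' \in \mathcal{X},
%   |f(x_1,\dots,x_i,\dots,x_n) - f(x_1,\dots,x_i',\dots,x_n)| \le c_i.
% Then for every t > 0,
\Pr\bigl( f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \ge t \bigr)
  \le \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^n c_i^2} \right).
```

The abstract's contribution is to relax the uniform bound $c_i$ on the coordinate-wise differences to sub-Gaussian or sub-exponential tail conditions, which is what permits unbounded input and output variables in the PCA and linear regression applications.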
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.