Data Sampling Affects the Complexity of Online SGD over Dependent Data

Published: 20 May 2022, Last Modified: 22 Oct 2023, UAI 2022 Poster
Keywords: Dependent data, SGD, sample complexity, optimization
TL;DR: This paper shows that both SGD with periodic data-subsampling and mini-batch SGD can improve the sample complexity of vanilla SGD under dependent data.
Abstract: Conventional machine learning applications typically assume that data samples are independently and identically distributed (i.i.d.). However, practical scenarios often involve a data-generating process that produces highly dependent samples, which are known to heavily bias the stochastic optimization process and slow down the convergence of learning. In this paper, we conduct a fundamental study of how different stochastic data sampling schemes affect the sample complexity of online stochastic gradient descent (SGD) over highly dependent data. Specifically, under a ϕ-mixing model of data dependence, we show that online SGD with proper periodic data-subsampling achieves an improved sample complexity over standard online SGD across the full spectrum of data dependence levels. Interestingly, even discarding all but a subset of the data samples can accelerate the convergence of online SGD over highly dependent data. Moreover, we show that online SGD with mini-batch sampling can further substantially improve the sample complexity of online SGD with periodic data-subsampling on highly dependent data. Numerical experiments validate our theoretical results.
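The following minimal sketch illustrates the three sampling schemes compared in the abstract (vanilla online SGD, periodic data-subsampling, and mini-batch sampling) on a dependent data stream. The AR(1)-correlated stream, the least-squares loss, and all hyperparameters (step size, subsampling period, batch size) are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, lr = 5, 20_000, 0.01
w_star = rng.normal(size=d)  # ground-truth parameter (assumed for illustration)

def dependent_stream(T, rho=0.95):
    """AR(1)-correlated features: consecutive samples are highly dependent."""
    x = rng.normal(size=d)
    for _ in range(T):
        x = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=d)
        y = x @ w_star + 0.1 * rng.normal()
        yield x, y

def grad(w, x, y):
    # Gradient of the per-sample loss 0.5 * (x^T w - y)^2.
    return (x @ w - y) * x

def sgd_vanilla(stream):
    """Standard online SGD: one update per incoming sample."""
    w = np.zeros(d)
    for x, y in stream:
        w -= lr * grad(w, x, y)
    return w

def sgd_subsampled(stream, period=20):
    """Periodic data-subsampling: update only on every `period`-th sample."""
    w = np.zeros(d)
    for t, (x, y) in enumerate(stream):
        if t % period == 0:
            w -= lr * grad(w, x, y)
    return w

def sgd_minibatch(stream, batch=20):
    """Mini-batch sampling: average gradients over `batch` consecutive samples."""
    w, g, n = np.zeros(d), np.zeros(d), 0
    for x, y in stream:
        g += grad(w, x, y)
        n += 1
        if n == batch:
            w -= lr * g / n
            g, n = np.zeros(d), 0
    return w

for name, algo in [("vanilla", sgd_vanilla),
                   ("subsampled", sgd_subsampled),
                   ("mini-batch", sgd_minibatch)]:
    w = algo(dependent_stream(T))
    print(f"{name:11s} ||w - w*|| = {np.linalg.norm(w - w_star):.4f}")
```

Both subsampling and mini-batching reduce the effective correlation between the samples that drive consecutive updates, which is the mechanism behind the improved sample complexity the abstract describes.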
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2204.00006/code)