Near-Optimal Linear Regression under Distribution Shift

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Blind Submission
Keywords: minimax estimator, covariate shift, model shift
Abstract: Transfer learning is an essential technique when ample data is available from the source domain but labeled data from the target domain is scarce or absent. We develop estimators that achieve minimax linear risk for linear regression problems under distribution shift. Our algorithms cover a range of settings with covariate shift or model shift. We also consider data generated from either linear or general nonlinear models. We show that affine minimax rules are within an absolute constant of the minimax risk, even among nonlinear rules, for a variety of source/target distributions.
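The covariate-shift setting described in the abstract can be illustrated with a minimal sketch (not the paper's estimator): labels follow the same linear model in both domains, but source and target covariate distributions differ, so an ordinary least-squares fit on source data is evaluated under a shifted target distribution. All distributions and parameters below are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: shared linear model, shifted covariate distributions.
d, n = 5, 200
beta = rng.normal(size=d)                    # true coefficients, shared across domains
X_src = rng.normal(0.0, 1.0, size=(n, d))    # source covariates
X_tgt = rng.normal(1.0, 2.0, size=(n, d))    # target covariates (covariate shift)
y_src = X_src @ beta + 0.1 * rng.normal(size=n)
y_tgt = X_tgt @ beta + 0.1 * rng.normal(size=n)

# Fit ordinary least squares on labeled source data only.
beta_hat = np.linalg.lstsq(X_src, y_src, rcond=None)[0]

# Evaluate the source-trained estimator under the target distribution.
target_mse = float(np.mean((X_tgt @ beta_hat - y_tgt) ** 2))
print(target_mse)
```

Under a larger shift between the two covariate distributions, the estimation error in `beta_hat` is amplified on target inputs, which is the gap that shift-aware minimax estimators aim to control.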
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We derive near optimal estimators for linear regression under distribution shift.
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=RVC3VzuUaK