Reproducibility in Optimization: Theoretical Framework and Limits

Published: 31 Oct 2022, Last Modified: 15 Dec 2022 · NeurIPS 2022 Accept
Keywords: reproducibility, first-order optimization, convex optimization, inexact gradient oracles
TL;DR: We initiate a formal study of reproducibility in optimization by defining a quantitative measure and characterizing the fundamental limits for various settings.
Abstract: We initiate a formal study of reproducibility in optimization. We define a quantitative measure of reproducibility of optimization procedures in the face of noisy or error-prone operations such as inexact or stochastic gradient computations or inexact initialization. We then analyze several convex optimization settings of interest such as smooth, non-smooth, and strongly-convex objective functions and establish tight bounds on the limits of reproducibility in each setting. Our analysis reveals a fundamental trade-off between computation and reproducibility: more computation is necessary (and sufficient) for better reproducibility.
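The abstract's notion of reproducibility under an inexact gradient oracle can be illustrated with a minimal sketch: run gradient descent twice on the same objective from the same initialization, differing only in the oracle's bounded error, and compare the outputs. The squared distance used here is an illustrative stand-in for the paper's formal measure, and all names and parameters below are hypothetical.

```python
import random

def gd_inexact(x0, grad, steps=100, lr=0.1, noise=0.01, seed=0):
    # Gradient descent with an inexact gradient oracle: each query
    # returns grad(x) plus a bounded error term (here, uniform noise).
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        g = grad(x) + rng.uniform(-noise, noise)
        x -= lr * g
    return x

# Objective f(x) = x^2 (smooth and strongly convex); exact gradient is 2x.
grad = lambda x: 2 * x

# Illustrative reproducibility deviation: squared distance between the
# outputs of two runs that differ only in the oracle's error realization.
xa = gd_inexact(5.0, grad, seed=1)
xb = gd_inexact(5.0, grad, seed=2)
deviation = (xa - xb) ** 2
```

Both runs converge to a small neighborhood of the minimizer, but their outputs differ by an amount driven by the oracle error, which is the quantity the paper's bounds control across the smooth, non-smooth, and strongly convex settings.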
Supplementary Material: pdf