Technical Note - Improved Sample-Complexity Bounds in Stochastic Optimization

Published: 2025 · Last Modified: 25 Jan 2026 · Operations Research, 2025 · CC BY-SA 4.0
Abstract: Real-world network-optimization problems often involve parameters that are uncertain at optimization time. Stochastic optimization, introduced in the 1950s, is a key approach to handling such uncertainty. This paper presents improved upper bounds on the number of samples required by the sample-average approximation method in stochastic optimization. These bounds improve the sample complexity of existing approaches in this setting, yielding faster approximation algorithms for any method that employs this framework. This work is particularly relevant for problems such as the stochastic Steiner tree problem.

Funding: The research of A. Baveja is partially supported by the United States Department of Transportation (via the University Transportation Research Center Program) [Grant 49198-25-26] and the British Council [the UKIERI Research Program]. The work of A. Chavan was done while he was a graduate student at the University of Maryland. The research of A. Srinivasan is partially supported by the National Science Foundation [Awards CNS 1010789, CCF-1422569, CCF-1749864, and CCF-1918749]; Adobe, Inc. [research awards]; Amazon, Inc.; and Google, Inc. The research of P. Xu was supported in part by the National Science Foundation [Awards CNS 1010789 and CCF-1422569 (when he was a graduate student)] and is partially funded by the National Science Foundation [CRII Award IIS-1948157].

Supplemental Material: The online appendix is available at https://doi.org/10.1287/opre.2018.0340.
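The sample-average approximation (SAA) method mentioned in the abstract replaces the true expectation over the uncertain parameters with an empirical average over drawn samples, then optimizes that empirical objective. The following is a minimal illustrative sketch on a toy newsvendor model — not the paper's network setting, and all names, prices, and the demand distribution here are assumptions chosen only for illustration:

```python
import random

def saa_newsvendor(samples, price=5.0, cost=3.0):
    """Pick the order quantity maximizing the sample-average profit.

    SAA idea: instead of maximizing E[price * min(q, D) - cost * q]
    over the true demand distribution D, maximize the average of the
    same expression over a finite list of drawn demand samples.
    """
    def avg_profit(q):
        # Empirical (sample-average) objective for order quantity q.
        return sum(price * min(q, d) - cost * q for d in samples) / len(samples)

    # Exhaustive search over integer order quantities (fine for a toy model).
    candidates = range(0, max(samples) + 1)
    return max(candidates, key=avg_profit)

# Draw demand samples from an assumed uniform distribution on [50, 150].
random.seed(0)
demand_samples = [random.randint(50, 150) for _ in range(2000)]
q_star = saa_newsvendor(demand_samples)
```

With enough samples, the SAA solution concentrates near the true optimum (here the critical-fractile solution of the newsvendor model); the paper's contribution is tighter bounds on how many samples suffice for such guarantees in its setting.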