Keywords: Robust Optimization, Uncertainty Set, Data-Driven, Prescriptive Analytics, Distribution-Free
TL;DR: We turn conformally calibrated uncertainty into decision-ready robust and satisficing policies, improving operational performance while meeting a user-specified reliability level.
Abstract: High-capacity black-box predictors are increasingly used to inform operational business decisions, yet optimizing directly over point forecasts can yield fragile solutions. We develop a conformal framework that converts any predictor into calibrated, context-dependent uncertainty sets for robust decision-making. Our framework supports two paradigms: Conformal Robust Optimization (CRO), which constructs data-driven uncertainty sets calibrated to a coverage level $\alpha$, and Conformal Robust Satisficing (CRS), which minimizes fragility relative to a target $\tau$ over the full support. We prove that CRO and CRS are equivalent under mild conditions, provide finite-sample coverage guarantees concentrating at $O(n^{-1/2})$, and derive suboptimality bounds quantifying the value of prediction accuracy. Experiments on a fractional knapsack problem demonstrate that CRO improves objectives relative to baselines while maintaining calibrated coverage, and CRS improves feasibility over distributionally robust satisficing benchmarks.
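The two-stage pipeline the abstract describes — calibrate an uncertainty set from a predictor's residuals via split conformal prediction, then optimize against the worst case inside that set — can be sketched for the fractional knapsack setting mentioned in the experiments. This is a minimal illustrative sketch under simplifying assumptions (a box-shaped uncertainty set with a single conformal radius, and the standard split-conformal quantile), not the paper's exact CRO construction; function names are hypothetical.

```python
import math

def conformal_radius(residuals, alpha):
    """Split-conformal quantile of absolute residuals.

    `alpha` is the target coverage level (as in the abstract); the
    (n + 1) finite-sample correction is the standard split-conformal
    adjustment. Assumes residuals come from a held-out calibration set.
    """
    n = len(residuals)
    k = min(math.ceil((n + 1) * alpha), n)
    return sorted(residuals)[k - 1]

def robust_fractional_knapsack(v_hat, w, capacity, r):
    """Maximize the worst-case total value when each true item value
    lies in the conformal interval [v_hat[i] - r, v_hat[i] + r].

    For a box uncertainty set the worst case is attained at v_hat - r,
    so the robust problem reduces to an ordinary fractional knapsack on
    the pessimistic values, solved greedily by value-to-weight ratio.
    """
    v_lo = [v - r for v in v_hat]
    order = sorted(range(len(w)), key=lambda i: -v_lo[i] / w[i])
    x = [0.0] * len(w)
    remaining = capacity
    for i in order:
        take = min(1.0, remaining / w[i])
        x[i] = take
        remaining -= take * w[i]
        if remaining <= 1e-12:
            break
    return x

# Example: calibrate a radius, then pick robust knapsack fractions.
residuals = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
r = conformal_radius(residuals, alpha=0.9)
x = robust_fractional_knapsack(v_hat=[10.0, 6.0], w=[5.0, 5.0],
                               capacity=5.0, r=r)
```

Here the conformal radius shrinks as the predictor improves, which is the mechanism behind the suboptimality bounds quantifying the value of prediction accuracy.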
Submission Number: 10