Abstract: We study distributionally robust optimization with the Sinkhorn distance, a variant of the Wasserstein distance based on entropic regularization. We derive a convex programming dual reformulation for general nominal distributions, transport costs, and loss functions. To solve the dual reformulation, we develop a stochastic mirror descent algorithm with biased subgradient estimators and derive its computational complexity guarantees. Finally, we provide numerical examples using synthetic and real data to demonstrate the superior performance of the proposed approach. Funding: The work of Y. Xie is partially supported by the National Science Foundation [Grant DMS-2134037] and the Coca-Cola Foundation. Supplemental Material: All supplemental materials, including the code, data, and files required to reproduce the results, are available at https://doi.org/10.1287/opre.2023.0294.
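The abstract describes stochastic mirror descent applied, with biased subgradient estimators, to a convex dual reformulation of the Sinkhorn DRO problem. The following is a minimal illustrative sketch only, not the paper's implementation: the quadratic toy loss, Gaussian reference kernel, parameter values (rho, eps, sigma), the plug-in Monte Carlo estimate of the dual objective, and the finite-difference subgradients are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative setup (assumed, not from the paper) ---
data = rng.normal(loc=1.0, scale=0.5, size=200)   # nominal 1-D samples
rho, eps = 0.1, 0.05        # assumed robustness radius and entropic regularization
sigma = np.sqrt(eps / 2.0)  # assumed width of the Gaussian reference kernel
m = 32                      # inner Monte Carlo samples per data point

def loss(theta, z):
    # Toy quadratic loss, standing in for a generic loss function.
    return (theta - z) ** 2

def dual_estimate(theta, lam, x_batch, noise):
    """Plug-in (hence biased) Monte Carlo estimate of an assumed dual objective of the form
       lam * rho + lam * eps * E_x[ log E_{z ~ N(x, sigma^2)} exp(loss(theta, z) / (lam * eps)) ]."""
    z = x_batch[:, None] + sigma * noise
    inner = loss(theta, z) / (lam * eps)
    # Stable log-mean-exp; the finite inner sample size is what makes the estimator biased.
    log_mean_exp = np.logaddexp.reduce(inner, axis=1) - np.log(m)
    return lam * rho + lam * eps * log_mean_exp.mean()

# --- Stochastic mirror descent on (theta, lam) ---
theta, lam = 0.0, 1.0
for t in range(1, 501):
    step = 0.5 / np.sqrt(t)
    batch = rng.choice(data, size=16, replace=False)
    noise = rng.standard_normal((batch.size, m))  # common random numbers for both evaluations
    h = 1e-4
    # Crude stochastic subgradients via finite differences of the biased plug-in estimate.
    g_theta = (dual_estimate(theta + h, lam, batch, noise)
               - dual_estimate(theta - h, lam, batch, noise)) / (2 * h)
    g_lam = (dual_estimate(theta, lam + h, batch, noise)
             - dual_estimate(theta, lam - h, batch, noise)) / (2 * h)
    theta -= step * g_theta            # Euclidean mirror map for the decision variable
    lam *= np.exp(-step * g_lam)       # entropic mirror map keeps the dual variable positive

print(f"theta ~= {theta:.3f}, lam ~= {lam:.3f}")
```

The exponentiated update for lam is one standard mirror-descent choice on the positive ray; the actual algorithm, estimators, and complexity analysis are developed in the paper and its supplemental code at the DOI above.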