On a Gradient Approach to Optimal Function Learning via Chebyshev Centers

TMLR Paper 6004 Authors

26 Sept 2025 (modified: 21 Nov 2025) · Under review for TMLR · CC BY 4.0
Abstract: We introduce $\textsf{gradOL}$, the first gradient-based optimization framework for solving Chebyshev center problems, a fundamental challenge in optimal function learning and geometric optimization. By leveraging automatic differentiation for precise (sub-)gradient computation, $\textsf{gradOL}$ achieves numerical stability and scales to large problem instances. Under strong convexity of the ambient norm, our method provably recovers optimal Chebyshev centers while directly computing the associated radius. This addresses a key bottleneck in constructing stable optimal interpolants. Empirically, $\textsf{gradOL}$ achieves significant improvements in accuracy and efficiency on 34 benchmark Chebyshev center problems from the $\textsf{CSIP}$ (convex semi-infinite programming) library. Furthermore, we extend our approach to general CSIPs, attaining up to $4000\times$ speedups over the state-of-the-art $\textsf{SIPAMPL}$ solver across 67 benchmark instances. Our work also provides the first theoretical foundation for applying gradient-based methods to Chebyshev center problems, bridging rigorous analysis with practical algorithms. $\textsf{gradOL}$ thus offers a unified solution framework for Chebyshev centers and broader CSIPs.
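For context, a Chebyshev center of a set $Y$ in a normed space is a point $x^\star \in \arg\min_{x} \sup_{y \in Y} \|x - y\|$, and the optimal value is the Chebyshev radius. The snippet below is a minimal sketch, not the paper's $\textsf{gradOL}$ implementation: it assumes a finite point set, the Euclidean norm, and plain diminishing-step subgradient descent, and only illustrates how automatic differentiation can supply (sub-)gradients of this min-max objective. The point set, step sizes, and function names are invented for the example.

```python
import jax
import jax.numpy as jnp

def chebyshev_objective(x, points):
    # Inner maximization of the min-max problem: the largest Euclidean
    # distance from the candidate center x to any point in the set.
    return jnp.max(jnp.linalg.norm(points - x, axis=1))

# Hypothetical finite point set used purely for illustration; the paper's
# benchmarks come from the CSIP library instead.
points = jnp.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]])

# Automatic differentiation supplies a (sub-)gradient of the max-of-distances objective.
subgrad = jax.grad(chebyshev_objective)

x = points.mean(axis=0)  # start at the centroid, away from any data point
for t in range(500):
    # Diminishing-step subgradient update.
    x = x - 0.05 / jnp.sqrt(t + 1.0) * subgrad(x, points)

print("approximate center:", x)
print("approximate radius:", chebyshev_objective(x, points))
```

For this toy instance the iterates approach the minimum enclosing ball center of the three points; the paper's setting is the far more general (semi-infinite) one summarized in the abstract.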
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Zhiyu_Zhang1
Submission Number: 6004