Keywords: homeomorphism, reparameterization, non-convex constraint, projection, invertible neural network
TL;DR: Hom-PGD+ transforms non-convex constrained optimization into ball-constrained problems via learned homeomorphisms, delivering significant speedups while maintaining theoretical guarantees.
Abstract: Optimization over general non-convex constraint sets poses significant computational challenges due to their inherent complexity.
In this paper, we focus on optimization problems over non-convex constraint sets that are homeomorphic to a ball, a class that includes star-shaped sets, which arise frequently in machine learning and engineering applications.
We propose \textbf{Hom-PGD$^+$}, a fast, \textit{learning-based}, and \textit{projection-efficient} first-order method that solves such optimization problems without requiring expensive projection or optimization oracles.
Our approach leverages an invertible neural network (INN) to learn a homeomorphism between the non-convex constraint set and the unit ball, transforming the original problem into an equivalent ball-constrained optimization problem. This transformation enables fast, projection-efficient optimization while preserving the fundamental structure of the original problem. We establish that Hom-PGD$^+$ achieves an $\mathcal{O}(\epsilon^{-2})$ convergence rate to an $\big(\epsilon + \mathcal{O}(\sqrt{\epsilon_{\rm inn}})\big)$-approximate stationary solution, where $\epsilon_{\rm inn}$ denotes the homeomorphism learning error.
This convergence rate represents a significant improvement over existing methods for optimization over non-convex sets. Moreover, Hom-PGD$^+$ maintains a per-iteration computational complexity of $\mathcal{O}(W)$, where $W$ is the number of INN parameters. Extensive numerical experiments, including chance-constrained optimization problems common in power systems, demonstrate that Hom-PGD$^+$ achieves convergence rates comparable to state-of-the-art methods while delivering speedups of up to one order of magnitude.
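The abstract describes the core mechanism: optimize in the ball domain of a learned homeomorphism, where projection reduces to a simple rescaling, and map iterates back through the INN. Below is a minimal PyTorch sketch of that ball-domain projected-gradient idea; it is not the authors' implementation, and the interface names (`inn.forward`, `inn.inverse`), the step size, and the iteration count are illustrative assumptions.

```python
import torch

def ball_domain_pgd(inn, f, x0, step=1e-2, iters=500):
    """Sketch: projected gradient descent in the unit-ball domain of a learned
    homeomorphism `inn` (assumed to map the ball to the constraint set via
    `forward` and back via `inverse`). `f` is the differentiable objective and
    `x0` a feasible starting point."""
    # Map the initial feasible point into the unit-ball domain.
    z = inn.inverse(x0).detach().requires_grad_(True)
    for _ in range(iters):
        x = inn.forward(z)                     # candidate in the original constraint set
        loss = f(x)
        (gz,) = torch.autograd.grad(loss, z)   # pull the gradient back through the INN
        with torch.no_grad():
            z = z - step * gz                  # gradient step in the ball domain
            norm = z.norm()
            if norm > 1.0:                     # projection onto the unit ball: rescaling
                z = z / norm
        z.requires_grad_(True)
    return inn.forward(z).detach()             # map the final iterate back
```

The sketch illustrates why the reparameterization is projection-efficient: the only projection ever needed is onto the unit ball, which costs a norm computation and a division, while each iteration's dominant cost is one forward and one backward pass through the INN.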
Primary Area: optimization
Submission Number: 9457