A Monte Carlo Approach to Nonsmooth Convex Optimization via Proximal Splitting Algorithms

Published: 22 Sept 2025, Last Modified: 01 Dec 2025. NeurIPS 2025 Workshop. License: CC BY 4.0
Keywords: Proximal, Operator Splitting, Derivative-Free, Zeroth-Order, Optimization, Monte Carlo, Hamilton-Jacobi
TL;DR: We introduce a general framework for operator splitting algorithms in convex optimization that replaces exact proximal operators with zeroth-order Hamilton–Jacobi Monte Carlo approximations while preserving convergence guarantees under mild assumptions.
Abstract: Operator splitting algorithms are a cornerstone of modern first-order optimization, relying critically on proximal operators as their fundamental building blocks. However, explicit formulas for proximal operators are available only for limited classes of functions, restricting the applicability of these methods. Recent work introduced HJ-Prox, a zeroth-order Monte Carlo approximation of the proximal operator derived from Hamilton–Jacobi PDEs, which circumvents the need for closed-form solutions. In this work, we extend the scope of HJ-Prox by establishing that it can be seamlessly incorporated into operator splitting schemes while preserving convergence guarantees. In particular, we show that replacing exact proximal steps with HJ-Prox approximations in algorithms such as proximal gradient descent, Douglas–Rachford splitting, Davis–Yin splitting, and the primal–dual hybrid gradient method still ensures convergence under mild conditions.
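To make the mechanism concrete, below is a minimal NumPy sketch (not from the paper) of the HJ-Prox estimator and its drop-in use inside proximal gradient descent. It follows the standard Hamilton–Jacobi/Cole–Hopf softmax formula, prox_{tf}(x) ≈ Σᵢ yᵢ exp(-f(yᵢ)/δ) / Σᵢ exp(-f(yᵢ)/δ) with yᵢ ~ N(x, δt·I), which requires only function evaluations of f. The function names (hj_prox, prox_gradient) and all parameter values (delta, sample count, step size) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hj_prox(f, x, t=1.0, delta=0.1, n_samples=1000, rng=None):
    """Zeroth-order Monte Carlo approximation of prox_{t f}(x).

    Softmax-weighted average of Gaussian samples around x:
        prox_{t f}(x) ~= sum_i y_i w_i / sum_i w_i,
        w_i = exp(-f(y_i) / delta),  y_i ~ N(x, delta * t * I).
    Only evaluations of f are needed (no gradients, no closed form).
    Parameter defaults here are illustrative, not tuned.
    """
    rng = np.random.default_rng(rng)
    y = x + np.sqrt(delta * t) * rng.standard_normal((n_samples, x.size))
    vals = np.array([f(yi) for yi in y])
    # Shift by the minimum before exponentiating for numerical stability;
    # the shift cancels in the ratio.
    w = np.exp(-(vals - vals.min()) / delta)
    return (w[:, None] * y).sum(axis=0) / w.sum()

def prox_gradient(grad_f, g, x0, step=0.1, iters=200, **prox_kwargs):
    """Proximal gradient descent with the exact prox of g replaced by hj_prox."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = hj_prox(g, x - step * grad_f(x), t=step, **prox_kwargs)
    return x

# Hypothetical example: LASSO-style objective with smooth part
# f(x) = 0.5 * ||Ax - b||^2 and nonsmooth part g(x) = ||x||_1.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: np.abs(x).sum()
x_hat = prox_gradient(grad_f, g, x0=np.zeros(5), step=0.05,
                      delta=0.05, n_samples=2000, rng=rng)
```

The same substitution pattern applies to the other schemes named in the abstract (Douglas–Rachford, Davis–Yin, PDHG): wherever an exact proximal step appears, it is replaced by the hj_prox call, with delta and the sample count controlling the approximation error.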
Submission Number: 76