Keywords: Convex Conjugates, Legendre Transformation, Deep Learning, Input Convex Neural Networks
TL;DR: We introduce a deep learning method for calculating convex conjugates in high-dimensional settings, significantly enhancing computational efficiency and providing L2-convergence certificates.
Abstract: We introduce a novel deep learning algorithm for computing convex conjugates of differentiable convex functions, a fundamental operation in convex analysis with applications in fields such as optimization, control theory, physics, and economics. While traditional numerical methods suffer from the curse of dimensionality and become computationally intractable in high dimensions, more recent neural network-based approaches scale better but have mostly been studied with the aim of solving optimal transport problems, and they require the solution of complicated optimization or max-min problems. Using an implicit Fenchel formulation of convex conjugation, our approach facilitates an efficient gradient-based framework for the minimization of approximation errors and, as a byproduct, also provides a posteriori error estimates for the approximation quality. Numerical experiments demonstrate our method's ability to deliver accurate results across different high-dimensional examples. Moreover, by employing symbolic regression with Kolmogorov–Arnold networks, it is able to obtain the exact convex conjugates of specific convex functions.
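To make the "implicit Fenchel formulation" concrete, here is a minimal illustrative sketch (not the paper's implementation) of the underlying identity: for a differentiable convex function f, the Fenchel–Young equality f(x) + f*(∇f(x)) = ⟨x, ∇f(x)⟩ lets one evaluate the conjugate f* at gradient points y = ∇f(x) without solving a sup problem. The quadratic example below is an assumed choice for demonstration, since its conjugate is known in closed form.

```python
import numpy as np

def f(x):
    # Example convex function: f(x) = 0.5 * ||x||^2, which is self-conjugate
    # (its convex conjugate is f*(y) = 0.5 * ||y||^2).
    return 0.5 * np.dot(x, x)

def grad_f(x):
    # Gradient of the quadratic above: grad f(x) = x.
    return x

def conjugate_at_gradient(x):
    """Evaluate f*(grad f(x)) = <x, grad f(x)> - f(x) (Fenchel-Young equality)."""
    y = grad_f(x)
    return np.dot(x, y) - f(x)

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
y = grad_f(x)

# Check the identity numerically against the known closed-form conjugate.
assert np.isclose(conjugate_at_gradient(x), 0.5 * np.dot(y, y))
```

A learning-based method in this spirit could fit a network g to the targets ⟨x, ∇f(x)⟩ − f(x) at sampled gradient points, turning conjugation into a regression problem; the abstract's gradient-based error-minimization framework plausibly builds on this identity, though the paper's exact formulation may differ.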
Primary Area: Optimization (e.g., convex and non-convex, stochastic, robust)
Submission Number: 12482