Level Set Teleportation: the Good, the Bad, and the Ugly

Published: 26 Oct 2023, Last Modified: 13 Dec 2023 | NeurIPS 2023 Workshop Poster
Keywords: teleportation, initializations, level sets, gradient descent, non-linear optimization
TL;DR: We develop an algorithm for level set teleportation and use it to show that teleportation accelerates gradient-based optimization methods only in limited circumstances.
Abstract: We study level set teleportation, a subroutine which seeks to accelerate gradient methods by maximizing the gradient norm over the set of parameters with the same objective value. Since the descent lemma implies that gradient descent (GD) decreases the objective in proportion to the squared gradient norm, level set teleportation maximizes the guaranteed one-step progress. We prove that level set teleportation neither improves nor worsens the convergence of GD for strongly convex functions, while for convex functions teleportation can arbitrarily increase the distance to the global minimizer. To solve teleportation problems, we develop a projected-gradient-type method requiring only Hessian-vector products; we use this method to show that initializing GD with teleportation slightly underperforms standard initializations on both convex and non-convex optimization problems. As a result, we report a mixed picture: teleportation can be evaluated efficiently, but appears to offer only marginal gains.
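For concreteness, here is a minimal formalization consistent with the abstract (the notation is ours, not necessarily the paper's). Given the current iterate $w$ of an $L$-smooth objective $f$, level set teleportation solves

$$ w^+ \in \operatorname*{arg\,max}_{z} \ \tfrac{1}{2}\,\|\nabla f(z)\|_2^2 \quad \text{subject to} \quad f(z) = f(w), $$

and the descent lemma guarantees that a GD step with step size $1/L$ taken from $w^+$ satisfies

$$ f\!\left(w^+ - \tfrac{1}{L}\nabla f(w^+)\right) \le f(w^+) - \tfrac{1}{2L}\,\|\nabla f(w^+)\|_2^2 = f(w) - \tfrac{1}{2L}\,\|\nabla f(w^+)\|_2^2, $$

so maximizing the gradient norm on the level set maximizes this guaranteed one-step decrease.

The abstract does not spell out the projected-gradient-type solver, so the sketch below is one plausible instantiation, not the paper's exact algorithm: it ascends the squared gradient norm, linearizes the level-set constraint for the projection, and obtains the ascent direction from a single Hessian-vector product via automatic differentiation. The function name teleport_step, the step size eta, and the toy objective are our own illustrative choices.

```python
import jax
import jax.numpy as jnp

def teleport_step(f, z, target, eta=1e-2):
    """One projected-gradient-type step for level set teleportation.

    Illustrative sketch: ascend the squared gradient norm via a
    Hessian-vector product, project the ascent direction onto the
    tangent space of the level set, then take one first-order
    correction step back toward f(z) = target.
    """
    g = jax.grad(f)(z)
    # The gradient of 0.5 * ||grad f(z)||^2 is the Hessian-vector product
    # H(z) g, computed matrix-free by differentiating grad(f) along g.
    _, hg = jax.jvp(jax.grad(f), (z,), (g,))
    # Project onto the tangent space {d : <grad f(z), d> = 0} so the
    # ascent step moves (approximately) along the level set.
    d = hg - (jnp.vdot(g, hg) / jnp.vdot(g, g)) * g
    z = z + eta * d
    # First-order (Newton-like) correction back onto f(z) = target.
    g_new = jax.grad(f)(z)
    z = z - ((f(z) - target) / jnp.vdot(g_new, g_new)) * g_new
    return z

# Hypothetical usage: teleport a GD initialization on a toy objective.
f = lambda w: 0.5 * jnp.sum(w ** 2) + 0.1 * jnp.sum(w ** 4)
w0 = jnp.array([1.0, -2.0, 0.5])
z = w0
for _ in range(100):
    z = teleport_step(f, z, target=f(w0))
# f(z) stays near f(w0) while ||grad f(z)|| is (weakly) larger than
# ||grad f(w0)||, maximizing GD's guaranteed one-step progress.
print(f(w0), f(z))
print(jnp.linalg.norm(jax.grad(f)(w0)), jnp.linalg.norm(jax.grad(f)(z)))
```

The point the sketch illustrates is the cost structure claimed in the abstract: each teleportation step needs only one Hessian-vector product, never an explicit Hessian, which is what makes teleportation cheap enough to evaluate in practice.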
Submission Number: 35