Secant Line Search for Frank-Wolfe Algorithms

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: A new line search for Frank-Wolfe algorithms based on the secant method
Abstract: We present a new step-size strategy for Frank-Wolfe algorithms based on the secant method. The strategy requires only mild assumptions on the objective function and can be applied to any Frank-Wolfe algorithm. It is as effective as full line search and, in particular, adapts to the local smoothness of the function, as in (Pedregosa et al., 2020), but at a significantly reduced computational cost, leading to higher effective rates of convergence. We provide theoretical guarantees and demonstrate the effectiveness of the strategy through numerical experiments.
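To make the idea concrete, here is a minimal sketch of a secant-based line search inside a vanilla Frank-Wolfe loop. This is an illustration, not the paper's implementation (which lives in FrankWolfe.jl): the step size is chosen by running the secant method on the one-dimensional derivative of the objective along the Frank-Wolfe direction, clipped to the feasible interval [0, 1]. The quadratic objective and simplex oracle below are placeholder choices for the demo.

```python
import numpy as np

def secant_step_size(grad_along, g0=0.0, g1=1.0, tol=1e-8, max_iter=20):
    """Approximate argmin of phi(gamma) = f(x + gamma*d) on [0, 1] by
    applying the secant method to phi'(gamma) = d^T grad f(x + gamma*d)."""
    d0, d1 = grad_along(g0), grad_along(g1)
    for _ in range(max_iter):
        denom = d1 - d0
        if abs(denom) < 1e-14:  # derivative flat along d; any step is fine
            break
        g2 = g1 - d1 * (g1 - g0) / denom  # secant update toward phi' = 0
        g2 = min(max(g2, 0.0), 1.0)       # keep the iterate feasible
        if abs(g2 - g1) < tol:
            return g2
        g0, d0 = g1, d1
        g1, d1 = g2, grad_along(g2)
    return g1

def frank_wolfe(grad, lmo, x0, iters=500):
    """Vanilla Frank-Wolfe with the secant line search above."""
    x = x0
    for _ in range(iters):
        v = lmo(grad(x))       # linear minimization oracle
        d = v - x              # Frank-Wolfe direction
        gamma = secant_step_size(lambda g: d @ grad(x + g * d))
        x = x + gamma * d
    return x

# Demo: minimize ||x - b||^2 over the probability simplex.
b = np.array([0.1, 0.5, 0.2])
grad = lambda x: 2 * (x - b)

def simplex_lmo(g):
    """Vertex of the simplex minimizing <g, v>."""
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

x_star = frank_wolfe(grad, simplex_lmo, np.ones(3) / 3)
```

Note that for a quadratic objective the directional derivative is linear in gamma, so the secant iteration recovers the exact line-search step after a single update; for general smooth functions it only approximates it, which is where the reduced cost relative to full line search comes from.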
Lay Summary: We have developed a new way to make a certain type of optimization method, called the Frank-Wolfe algorithm, run faster and more efficiently. Think of it like finding the best path down a mountain: our method helps the computer take better steps without checking every possible direction, which saves a lot of time and computing power. This matters in machine learning because many models rely on solving these kinds of optimization problems to learn from data, so making the process faster means training models more quickly with less computing power. Our method works with many versions of the Frank-Wolfe algorithm and finds solutions just as good as older, more time-consuming approaches, but because it is much quicker, it reaches good solutions faster in practice. We have proven mathematically that the method works, and we have tested it on real examples to show that it performs as promised.
Link To Code: https://github.com/ZIB-IOL/FrankWolfe.jl
Primary Area: Optimization->Convex
Keywords: First-order methods, Frank-Wolfe algorithms, Secant
Submission Number: 7056