Keywords: differentiable optimization
TL;DR: A novel differentiable optimization layer based on algorithm unrolling is designed to speed up both the forward optimization and backpropagation, especially in the presence of norm constraints.
Abstract: Differentiable optimization has received significant attention due to its foundational role in neural-network-based machine learning. This paper proposes a differentiable layer, named the Differentiable Frank-Wolfe Layer (DFWLayer), obtained by unrolling the Frank-Wolfe method, a well-known optimization algorithm that solves constrained problems without projections or Hessian computations. This yields an efficient way of handling large-scale convex optimization problems with norm constraints. Experimental results demonstrate that the DFWLayer not only attains competitive accuracy in solutions and gradients but also consistently adheres to the constraints.
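For intuition, below is a minimal sketch of the classical Frank-Wolfe iteration that such a layer unrolls, shown for an l1-norm ball constraint. The function names and the toy quadratic objective are illustrative assumptions, not the paper's implementation; the point is that each step uses only a gradient and a closed-form linear minimization oracle, with no projection or Hessian.

```python
import numpy as np

def frank_wolfe_l1(grad, tau, dim, n_iters=200):
    """Frank-Wolfe on the l1 ball {x : ||x||_1 <= tau} (illustrative sketch).

    The linear minimization oracle over the l1 ball has a closed form:
    it returns the signed vertex -tau * sign(g_i) * e_i at the coordinate
    with the largest |gradient|, so no projection is ever needed.
    """
    x = np.zeros(dim)  # the origin is feasible for any tau >= 0
    for k in range(n_iters):
        g = grad(x)
        i = np.argmax(np.abs(g))          # coordinate of steepest descent
        s = np.zeros(dim)
        s[i] = -tau * np.sign(g[i])       # LMO solution: a vertex of the ball
        gamma = 2.0 / (k + 2.0)           # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s # convex combination stays feasible
    return x

# Toy example: minimize 0.5 * ||x - b||^2 subject to ||x||_1 <= 1.
b = np.array([2.0, -1.0])
x = frank_wolfe_l1(lambda x: x - b, tau=1.0, dim=2, n_iters=200)
```

Because every iterate is a convex combination of feasible points, the constraint holds at every step, which is what allows an unrolled version to remain feasible throughout backpropagation.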
Supplementary Material: pdf
Submission Number: 99