Learning-Augmented Sketches for Hessians

Published: 28 Jan 2022, Last Modified: 13 Feb 2023 · ICLR 2022 Submitted
Keywords: least squares, convex optimization, iterative Hessian sketch, subspace embedding, learning-augmented sketch
Abstract: Sketching is a dimensionality reduction technique in which one compresses a matrix by taking linear combinations of its rows or columns, typically chosen at random. A line of work has shown how to sketch the Hessian to speed up each iteration in a second-order method, but such sketches usually depend only on the matrix at hand, and in a number of cases are even oblivious to the input matrix. One could instead hope to learn a distribution over sketching matrices that is optimized for the specific distribution of input matrices. We show how to design learned sketches for the Hessian in the context of second-order methods. We prove that a smaller sketching dimension suffices for embedding the column space of a tall matrix, assuming knowledge of the indices of the rows with large leverage scores. This leads to faster convergence of the iterative Hessian sketch procedure. We also design a new objective for learning the sketch, whereby we optimize the subspace embedding property of the sketch. We show empirically that learned sketches, compared with their "non-learned" counterparts, improve the approximation accuracy for important problems, including LASSO and matrix estimation with nuclear norm constraints.
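The two ingredients the abstract refers to, the subspace embedding property of a random sketch and the leverage scores that a learned sketch would exploit, can be illustrated in a few lines of numpy. This is a minimal sketch with an oblivious Gaussian sketching matrix, not the paper's learned construction; the dimensions and seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 10, 200          # tall n x d input matrix, sketch dimension m << n

A = rng.standard_normal((n, d))

# Oblivious Gaussian sketch: each of the m rows of S takes a random
# linear combination of the n rows of A.
S = rng.standard_normal((m, n)) / np.sqrt(m)
SA = S @ A                       # compressed m x d matrix

# Subspace embedding property: ||S A x|| ~ ||A x|| for every x in R^d,
# so SA can stand in for A inside a second-order update.
x = rng.standard_normal(d)
distortion = np.linalg.norm(SA @ x) / np.linalg.norm(A @ x)
print(f"distortion: {distortion:.3f}")   # close to 1

# Leverage scores: squared row norms of an orthonormal basis Q of col(A).
# Rows with large leverage scores are the ones whose indices a learned
# sketch is assumed to know; their sum equals rank(A) = d.
Q, _ = np.linalg.qr(A)
leverage = (Q ** 2).sum(axis=1)
top_rows = np.argsort(leverage)[-10:]    # indices of the 10 highest-leverage rows
```

Because the Gaussian sketch is oblivious, the distortion bound holds for any fixed input; a learned sketch instead tunes how rows, especially the high-leverage ones, are combined for the input distribution at hand.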
One-sentence Summary: We design learned sketches in the context of second order methods and demonstrate their advantages empirically.
Supplementary Material: zip