Learning Invariance with Compact Transforms

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission · Readers: Everyone
Abstract: The problem of building machine learning models that admit efficient representations while also capturing an appropriate inductive bias for the domain has recently attracted significant interest. Existing work on compressing deep learning pipelines has explored classes of structured matrices that exhibit forms of shift-invariance akin to convolutions. We leverage the displacement rank framework to learn the structured class automatically, adapting to the invariances required by a given dataset while preserving asymptotically efficient multiplication and storage. Under a small fixed parameter budget, our broad classes of structured matrices improve final accuracy by 5-7% on standard image classification datasets compared to conventional parameter-constraining methods.
Keywords: structured matrices, low displacement rank, learning invariance, model compression
TL;DR: We leverage the displacement rank framework to automatically learn compact models, adapting to invariances in a given dataset while preserving efficient operations.
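The displacement rank framework referenced in the abstract is not spelled out on this page; as background, a matrix M is said to have low displacement rank r when a Sylvester-type displacement operator, ∇_{A,B}(M) = A·M − M·B for fixed operator matrices A and B, has rank r. The low-rank factors of ∇_{A,B}(M) then give O(rn) storage and fast multiplication for suitable operator choices. The sketch below is an illustration of this general framework under standard shift operators, not the paper's learned operators: it checks numerically that a dense Toeplitz matrix has displacement rank at most 2 under the unit-circulant shifts Z_1 and Z_{-1}.

import numpy as np
from scipy.linalg import toeplitz

def unit_circulant(n, f):
    """Z_f: ones on the subdiagonal, scalar f in the top-right corner."""
    Z = np.diag(np.ones(n - 1), k=-1)
    Z[0, -1] = f
    return Z

n = 8
# Random Toeplitz matrix: constant along each diagonal.
T = toeplitz(np.random.randn(n), np.random.randn(n))

# Sylvester displacement with shift operators: L = Z_1 T - T Z_{-1}.
L = unit_circulant(n, 1) @ T - T @ unit_circulant(n, -1)

# Toeplitz matrices have displacement rank <= 2 under these operators,
# so the rank printed here is (almost surely) 2 even though T is dense.
print(np.linalg.matrix_rank(L))

Other fixed choices of A and B recover Hankel-, Vandermonde-, and Cauchy-like classes; the paper's proposal, as described in the abstract, is to learn the operators rather than fix them in advance.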