Interpolating Compressed Parameter Subspaces

04 Oct 2022, 23:09 (modified: 26 Nov 2022, 09:49) · NeurIPS 2022 Workshop MetaLearn Poster
Abstract: Though distribution shifts have caused growing concern for machine learning scalability, solutions tend to specialize towards a specific type of distribution shift. We find that constructing a Compressed Parameter Subspace (CPS), a geometric structure of distance-regularized parameters mapped to a set of train-time distributions, can maximize average accuracy over a broad range of distribution shifts concurrently. We show that sampling parameters within a CPS can mitigate backdoor, adversarial, permutation, stylization, and rotation perturbations. Regularizing a hypernetwork with a CPS can also reduce task forgetting.
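The abstract does not specify how a CPS is constructed or sampled, so the sketch below is only a rough illustration, not the paper's method. It assumes a CPS can be approximated as the convex hull of per-distribution parameter vectors, draws a point via Dirichlet-weighted interpolation, and adds a hypothetical L2 distance regularizer that keeps endpoints near their mean (the "compressed" property). The names `sample_cps_parameters`, `cps_distance_regularizer`, and the `margin` parameter are illustrative assumptions.

```python
import torch
from torch import nn
from torch.nn.utils import parameters_to_vector, vector_to_parameters


def sample_cps_parameters(endpoints):
    """Draw one flattened parameter vector from the convex hull of the
    subspace endpoints (assumed: one endpoint per train-time distribution)."""
    # Dirichlet weights give a draw over the probability simplex,
    # i.e. a convex combination of the endpoints.
    alphas = torch.distributions.Dirichlet(torch.ones(len(endpoints))).sample()
    stacked = torch.stack(endpoints)                   # (k, n_params)
    return (alphas.unsqueeze(1) * stacked).sum(dim=0)  # (n_params,)


def cps_distance_regularizer(endpoints, margin=1.0):
    """Hypothetical distance regularizer: penalize endpoints that drift
    more than `margin` (L2) from their mean, keeping the subspace small."""
    mean = torch.stack(endpoints).mean(dim=0)
    dists = torch.stack(
        [torch.linalg.vector_norm(w - mean) for w in endpoints]
    )
    return torch.clamp(dists - margin, min=0.0).sum()


# Usage sketch: three copies of a toy net stand in for models trained
# on three different train-time distributions.
models = [nn.Linear(4, 2) for _ in range(3)]
endpoints = [parameters_to_vector(m.parameters()).detach() for m in models]

# Load a sampled subspace point into a fresh model for evaluation.
eval_model = nn.Linear(4, 2)
vector_to_parameters(sample_cps_parameters(endpoints), eval_model.parameters())
print(cps_distance_regularizer(endpoints))
```

In this reading, the regularizer would be added to each endpoint's training loss so the endpoints stay mutually close, and robustness at test time comes from evaluating models sampled from the interior of the subspace rather than any single endpoint.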