Sparse Recovery via Bootstrapping: Collaborative or Independent?

Anonymous

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Desk Rejected Submission
Keywords: LASSO, Bootstrapping, Bagging, sparsity, group sparsity
Abstract: Sparse regression problems have traditionally been solved using all available measurements simultaneously. However, this approach fails in challenging scenarios such as when the noise level is high or there are missing data / adversarial samples. We propose JOBS (Joint-Sparse Optimization via Bootstrap Samples) -- a \emph{collaborative} sparse-regression framework on bootstrapped samples from the pool of available measurements via a joint-sparse constraint to ensure support consistency. In comparison to traditional bagging which solves sub-problems in an \emph{independent} fashion across bootstrapped samples, JOBS achieves state-of-the-art performance with the added advantage of having a sparser solution while requiring a lower number of observation samples. Analysis of theoretical performance limits is employed to determine critical optimal parameters: the number of bootstrap samples $K$ and the number of elements $L$ in each bootstrap sample. Theoretical results indicate a better bound than Bagging (i.e. higher probability of achieving the same or better performance). Simulation results are used to validate this parameter selection. JOBS is robust to adversarial samples that fool the baseline method, as shown by better generalization in an image reconstruction task where the adversary has similar occlusions or alignment as the test sample. Furthermore, JOBS also improves discriminative performance in a facial recognition task in a sparse-representation-based classification setting.
One-sentence Summary: We propose and study the theoretical properties of a collaborative bootstrapping framework named JOBS that outperforms the baseline LASSO and bagging methods in robustness and generalization, especially in challenging settings.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Supplementary Material: zip