Discriminative reconstruction via simultaneous dense and sparse coding

TMLR Paper2406 Authors

21 Mar 2024 (modified: 28 Mar 2024) · Under review for TMLR · CC BY-SA 4.0
Abstract: Discriminative features extracted from the sparse coding model have been shown to perform well for classification. Recent deep learning architectures have further improved reconstruction in inverse problems by considering new dense priors learned from data. We propose a novel dense and sparse coding model that integrates both representation capability and discriminative features. The model studies the problem of recovering a dense vector x and a sparse vector u given measurements of the form y = Ax + Bu. Our first analysis proposes a geometric condition, based on the minimal angle between the spanning subspaces of the matrices A and B, that guarantees a unique solution to the model. The second analysis shows that, under mild assumptions, a convex program recovers the dense and sparse components. We validate the effectiveness of the model on simulated data and propose a dense and sparse autoencoder (DenSaE) tailored to learning the dictionaries from the dense and sparse model. We demonstrate that (i) DenSaE denoises natural images better than architectures derived from the sparse coding model (Bu), (ii) in the presence of noise, training the biases in the latter amounts to implicitly learning the Ax + Bu model, (iii) A and B capture low- and high-frequency content, respectively, and (iv) compared to the sparse coding model, DenSaE offers a balance between discriminative power and representation.
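The measurement model y = Ax + Bu described in the abstract can be sketched numerically. The toy example below is an illustrative assumption, not the paper's algorithm: it recovers a dense x and a sparse u by alternating an exact least-squares update in x with a proximal-gradient (soft-thresholding) step on u for the objective 0.5·||y − Ax − Bu||² + λ·||u||₁. All dimensions, the penalty λ, and the solver itself are hypothetical choices.

```python
import numpy as np

# Illustrative sketch of dense + sparse recovery from y = A x + B u.
# A (dense component) and B (sparse component) are random Gaussian
# dictionaries; the alternating solver is a generic baseline, not the
# convex program analyzed in the paper.
rng = np.random.default_rng(0)
m, n, p = 60, 20, 100
A = rng.standard_normal((m, n)) / np.sqrt(m)
B = rng.standard_normal((m, p)) / np.sqrt(m)

x_true = rng.standard_normal(n)
u_true = np.zeros(p)
u_true[rng.choice(p, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + B @ u_true

lam = 0.05                                  # l1 penalty (assumed)
u = np.zeros(p)
step = 1.0 / np.linalg.norm(B, 2) ** 2      # ISTA step size for the u-block
for _ in range(500):
    # exact least-squares minimization over the dense part x
    x, *_ = np.linalg.lstsq(A, y - B @ u, rcond=None)
    # one proximal-gradient step (soft-thresholding) on the sparse part u
    z = u + step * (B.T @ (y - A @ x - B @ u))
    u = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print("residual:", np.linalg.norm(A @ x + B @ u - y))
print("nonzeros in u:", np.count_nonzero(u))
```

The soft-thresholding step drives most entries of u exactly to zero, so the recovered u is genuinely sparse while x absorbs the dense energy, mirroring the split the model is designed to exploit.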
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=MmfBxdMDoo
Changes Since Last Submission: The previous submission was marked as a regular submission despite containing more than 12 pages of main content, and was rejected as a result. We have now marked the revised submission as a long submission.
Assigned Action Editor: ~Moshe_Eliasof1
Submission Number: 2406