A Novel Kernel Sparse Coding Method with A Two-stage Acceleration Strategy

27 Sept 2024 (modified: 23 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Sparse Coding, Kernel Trick, Acceleration Strategy
Abstract: Sparse coding aims to exploit the latent linear structure of input data, transforming dense representations into sparse ones and thereby improving data processing efficiency. However, many real-world signals cannot be expressed linearly, rendering traditional sparse coding algorithms ineffective. One potential remedy is to expand the dimensionality of the data. In this paper, we verify that the feature mapping of the Radial Basis Function (RBF) kernel carries infinite-dimensional information without significantly increasing computational complexity. Building on this, we propose an $l_1$-norm regularized sparse coding method with the RBF kernel and provide a solution with convergence guarantees based on the principle of coordinate descent. Additionally, to accelerate the optimization process, we introduce a novel two-stage acceleration strategy grounded in theoretical analysis and empirical observations. Experimental results demonstrate that the two-stage acceleration strategy reduces processing time by up to 90\%. Furthermore, when the data are compressed to about 2\% of their original size, the NMAE of the proposed method reaches as low as 0.0824 to 0.2195, an improvement of up to 47\% over traditional linear sparse coding methods and 36\% over other kernel sparse coding techniques.
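To make the setting concrete, the abstract's core idea (l_1-regularized sparse coding in an RBF feature space, solved by coordinate descent via the kernel trick) can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's actual algorithm: the function names, the soft-thresholding update, and the omission of the two-stage acceleration strategy are all ours. The objective is $\tfrac{1}{2}\|\phi(y)-\Phi(D)x\|^2 + \lambda\|x\|_1$, which depends on $\phi$ only through kernel evaluations $k(\cdot,\cdot)$, so the infinite-dimensional mapping never needs to be computed explicitly.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix of k(a, b) = exp(-||a - b||^2 / (2 sigma^2)).
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def kernel_cd_lasso(D, y, lam=0.1, sigma=1.0, n_iter=100):
    """Coordinate descent for min_x 0.5*||phi(y) - Phi(D) x||^2 + lam*||x||_1,
    written entirely in terms of kernel evaluations (kernel trick).
    D: dictionary atoms as rows (n_atoms, dim); y: input signal (dim,)."""
    K = rbf_kernel(D, D, sigma)                      # K_ij = k(d_i, d_j); diag = 1 for RBF
    k_y = rbf_kernel(D, y[None, :], sigma).ravel()   # k_y[j] = k(d_j, y)
    x = np.zeros(D.shape[0])
    for _ in range(n_iter):
        for j in range(len(x)):
            # Correlation of atom j with the residual, excluding atom j itself.
            r_j = k_y[j] - K[j] @ x + K[j, j] * x[j]
            # Closed-form coordinate update: soft-thresholding.
            x[j] = np.sign(r_j) * max(abs(r_j) - lam, 0.0) / K[j, j]
    return x
```

Because $K_{jj} = k(d_j, d_j) = 1$ for the RBF kernel, the per-coordinate division is always well defined, and each sweep monotonically decreases the objective, which is the basis of the convergence guarantee for coordinate descent on this problem class.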
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9006