FORKS: Fast Second-Order Online Kernel Learning using Incremental Sketching

24 Sept 2023 (modified: 11 Feb 2024), Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: metric learning, kernel learning, and sparse coding
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Online Kernel Learning, Second-Order Method, Randomized Sketch
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose FORKS, a fast incremental randomized sketching method for second-order online kernel learning.
Abstract: Online Kernel Learning (OKL) has attracted considerable research interest due to its promising predictive performance. Second-order methods are particularly appealing for OKL as they often offer substantial improvements in regret guarantees. However, existing approaches like PROS-N-KONS suffer from at least quadratic time complexity with respect to the budget, rendering them unsuitable for meeting the real-time demands of large-scale online learning. Additionally, current OKL methods are typically prone to concept drift in data streams, making them vulnerable in adversarial environments. To address these issues, we introduce FORKS, a fast incremental sketching approach for second-order online kernel learning. FORKS maintains an efficient time-varying explicit feature mapping, using incremental sketching techniques to enable rapid updates and decompositions of the underlying sketches. Theoretical analysis demonstrates that FORKS achieves a logarithmic regret guarantee, on par with other second-order approaches, while maintaining linear time complexity with respect to the budget. We validate the performance of FORKS through extensive experiments conducted on real-world datasets, demonstrating its superior scalability and robustness against adversarial attacks.
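To make the setting concrete, the sketch below illustrates second-order online kernel learning with an explicit feature map. It is a minimal illustration only, assuming a random Fourier feature map for an RBF kernel and an online-Newton-style update maintained with Sherman-Morrison; it does not reproduce FORKS' incremental sketching or sketch decomposition, and all names and parameters here are hypothetical.

```python
import numpy as np

class SecondOrderOKL:
    """Illustrative second-order online kernel learner (not FORKS).

    Uses random Fourier features as a fixed explicit feature map for an
    RBF kernel and a Newton-style update whose inverse second-order matrix
    is maintained with rank-one (Sherman-Morrison) updates, giving O(D^2)
    cost per round for a D-dimensional feature map.
    """

    def __init__(self, d, D=200, gamma=1.0, eta=1.0, reg=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Spectral samples approximating the RBF kernel exp(-gamma * ||x - x'||^2).
        self.W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(D, d))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        self.D = D
        self.eta = eta                    # step size
        self.w = np.zeros(D)              # weights in the explicit feature space
        self.A_inv = np.eye(D) / reg      # inverse of the second-order matrix

    def features(self, x):
        # Explicit feature map phi(x) approximating the RBF kernel.
        return np.sqrt(2.0 / self.D) * np.cos(self.W @ x + self.b)

    def predict(self, x):
        return float(self.w @ self.features(x))

    def update(self, x, y):
        """One online round with squared loss 0.5 * (pred - y)^2."""
        phi = self.features(x)
        pred = float(self.w @ phi)
        grad = (pred - y) * phi           # gradient of the loss w.r.t. w

        # Rank-one update of A <- A + grad grad^T via Sherman-Morrison.
        Ag = self.A_inv @ grad
        self.A_inv -= np.outer(Ag, Ag) / (1.0 + grad @ Ag)

        # Second-order step: w <- w - eta * A^{-1} grad.
        self.w -= self.eta * (self.A_inv @ grad)
        return pred


if __name__ == "__main__":
    # Toy stream: learn y = sin(x_0) online.
    rng = np.random.default_rng(1)
    model = SecondOrderOKL(d=2, D=100)
    for _ in range(1000):
        x = rng.normal(size=2)
        model.update(x, np.sin(x[0]))
```

The point of FORKS, per the abstract, is to replace the fixed feature map above with a time-varying one whose sketch is updated and decomposed incrementally, keeping per-round cost linear in the budget while retaining a logarithmic regret guarantee.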
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8706