A Simple Algorithm For Scaling Up Kernel Methods

Published: 01 Jan 2023, Last Modified: 12 May 2023, CoRR 2023
Abstract: The recent discovery of the equivalence between infinitely wide neural networks (NNs) in the lazy training regime and Neural Tangent Kernels (NTKs) (Jacot et al., 2018) has revived interest in kernel methods. However, conventional wisdom suggests kernel methods are unsuitable for large samples due to their computational complexity and memory requirements. We introduce a novel random feature regression algorithm that allows us (when necessary) to scale to virtually infinite numbers of random features. We illustrate the performance of our method on the CIFAR-10 dataset.
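The abstract does not spell out the algorithm itself, so purely as background, below is a minimal sketch of standard random Fourier feature regression, the kind of random feature model the paper aims to scale up. The RBF bandwidth gamma, ridge penalty reg, and feature count n_features are illustrative assumptions, not values from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features, gamma, rng):
    """Map X of shape (n, d) to random Fourier features approximating an RBF kernel
    exp(-gamma * ||x - x'||^2). Assumed hyperparameters, not the paper's choices."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def fit_ridge_on_features(Z, y, reg=1e-3):
    """Closed-form ridge regression in feature space: solve (Z^T Z + reg*I) w = Z^T y."""
    D = Z.shape[1]
    A = Z.T @ Z + reg * np.eye(D)
    return np.linalg.solve(A, Z.T @ y)

# Toy usage on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 32))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=1000)
Z = random_fourier_features(X, n_features=2048, gamma=0.1, rng=rng)
w = fit_ridge_on_features(Z, y)
print("train MSE:", np.mean((y - Z @ w) ** 2))
```

Note that the cost of the closed-form solve grows with the number of random features, which is exactly the bottleneck the paper's method is claimed to address when scaling to very large feature counts.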