Chefs' Random Tables: Non-Trigonometric Random Features

Published: 31 Oct 2022, Last Modified: 04 Oct 2022
NeurIPS 2022 Accept
Keywords: random features, Gaussian kernel, attention, Transformers
TL;DR: We present a new family of random features for the Gaussian kernel, with extensive theoretical and empirical analysis.
Abstract: We introduce chefs' random tables (CRTs), a new class of non-trigonometric random features (RFs) to approximate Gaussian and softmax kernels. CRTs are an alternative to standard random kitchen sink (RKS) methods, which inherently rely on trigonometric maps. We present variants of CRTs where RFs are positive, a key requirement for applications in recent low-rank Transformers. Further variance reduction is possible by leveraging statistics that are simple to compute. One instantiation of CRTs, the optimal positive random features (OPRFs), is, to our knowledge, the first RF method for unbiased softmax kernel estimation with positive and bounded RFs, resulting in exponentially small tails and much lower variance than its counterparts. As we show, orthogonal random features applied in OPRFs provide additional variance reduction for any dimensionality $d$ (not only asymptotically for sufficiently large $d$, as for RKS). We test CRTs on many tasks ranging from non-parametric classification to training Transformers for text, speech and image data, obtaining new state-of-the-art results for low-rank text Transformers, while providing linear space and time complexity.
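For context on the positive random features the abstract refers to, the sketch below implements the standard positive RF estimator of the softmax kernel $\exp(x^\top y) = \mathbb{E}_{w \sim \mathcal{N}(0, I_d)}[\exp(w^\top x - \|x\|^2/2)\exp(w^\top y - \|y\|^2/2)]$, the prior construction that OPRFs refine with lower variance. This is a minimal illustration, not the paper's OPRF mechanism; the function name and the choices of $d$ and $m$ are ours.

```python
import numpy as np

def positive_random_features(x, W):
    """Positive (non-trigonometric) random features for the softmax kernel.

    x: (d,) input vector; W: (m, d) matrix with i.i.d. N(0, 1) rows.
    Returns phi(x) of shape (m,) such that
    E[phi(x) . phi(y)] = exp(x . y)  (the softmax kernel).
    """
    m = W.shape[0]
    # exp(w^T x - ||x||^2 / 2): positive and unbiased by the Gaussian MGF,
    # E[exp(w^T (x + y))] = exp(||x + y||^2 / 2).
    return np.exp(W @ x - np.dot(x, x) / 2.0) / np.sqrt(m)

rng = np.random.default_rng(0)
d, m = 16, 4096                      # illustrative dimensions
x = 0.3 * rng.normal(size=d)
y = 0.3 * rng.normal(size=d)
W = rng.normal(size=(m, d))

exact = np.exp(x @ y)                # softmax kernel value
approx = positive_random_features(x, W) @ positive_random_features(y, W)
print(exact, approx)                 # approx -> exact as m grows
```

Because every feature is positive, the estimated kernel values stay positive, which is the property that makes such RFs usable inside low-rank Transformer attention; the paper's contribution is a bounded variant of this idea with provably lower variance.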