Random Feature Mean-Shift

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Machine Learning, Mode-seeking, Kernel Density Estimation, Random Feature Method, Zeroth-Order Optimization
TL;DR: This paper proposes Random Feature Mean-Shift (RFMS), a novel linear-complexity, density-based mode-seeking algorithm.
Abstract: Locating the modes of a probability density function is a fundamental problem in many areas of machine learning. However, classical mode-seeking algorithms such as mean-shift and its variants exhibit quadratic complexity with respect to the number of data points due to exhaustive pairwise kernel computation, a well-known bottleneck that severely restricts their applicability. In this paper, we propose **Random Feature Mean-Shift (RFMS)**, a novel linear-complexity mode-seeking algorithm. We construct a sampling-based estimator that combines random feature kernel approximation with a zeroth-order gradient method, provably achieving linear runtime per iteration, together with comprehensive theoretical guarantees for mode estimation and convergence behavior. Empirical evaluations on clustering and pixel-level image segmentation tasks show that RFMS is up to 12x faster than other mean-shift variants, offering substantial efficiency gains while producing near-optimal results. Overall, RFMS offers a practical and principled framework for scalable mode-seeking beyond kernel-value approximation, with explicit guarantees on the induced mode landscape and optimization dynamics.
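To make the complexity claim concrete, below is a minimal, illustrative sketch of how a random-feature factorization of the kernel turns the mean-shift update into a linear-time-per-iteration procedure. It is not the paper's RFMS algorithm (which, per the abstract, uses a zeroth-order gradient method and comes with formal guarantees); it uses the classical fixed-point mean-shift update with random Fourier features, and all names and parameters (`rff_mean_shift`, `D`, `sigma`, `n_iter`) are assumptions for illustration only.

```python
import numpy as np

def gaussian_rff(X, D, sigma, rng):
    """Random Fourier features approximating the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2)) (Rahimi & Recht, 2007)."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b), (W, b)

def rff_mean_shift(X, D=512, sigma=1.0, n_iter=50, seed=0):
    """Illustrative mean-shift with a random-feature kernel approximation.
    Precomputation costs O(n D); each iteration over all n points costs
    O(n D d), i.e. linear in n, versus O(n^2 d) for exact mean-shift."""
    rng = np.random.default_rng(seed)
    Phi, (W, b) = gaussian_rff(X, D, sigma, rng)    # (n, D) feature map of the data
    S = Phi.T @ X                                   # (D, d): sum_i phi(x_i) x_i^T
    s = Phi.sum(axis=0)                             # (D,)  : sum_i phi(x_i)
    Y = X.copy()                                    # points being shifted toward modes
    for _ in range(n_iter):
        Phi_y = np.sqrt(2.0 / D) * np.cos(Y @ W + b)     # features of current iterates
        num = Phi_y @ S                                  # approx sum_i k(y, x_i) x_i
        den = Phi_y @ s                                  # approx sum_i k(y, x_i)
        den = np.where(np.abs(den) < 1e-12, 1e-12, den)  # guard: approximate kernel can vanish
        Y = num / den[:, None]
    return Y
```

The key point of the sketch is that once the kernel is written as an inner product of D-dimensional feature maps, the sums over all n data points collapse into the precomputed statistics `S` and `s`, so no pairwise kernel evaluation is ever needed; the paper's contribution is to make this idea work as a provable mode-seeking method via a zeroth-order gradient scheme.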
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 12056