Asymptotic Slowing Down of the Nearest-Neighbor Classifier

NIPS 1990 (modified: 11 Nov 2022)
Abstract: If patterns are drawn from an n-dimensional feature space according to a probability distribution that obeys a weak smoothness criterion, we show that the probability that a random input pattern is misclassified by a nearest-neighbor classifier using M random reference patterns asymptotically satisfies P_M(error) ≈ P_∞(error) + a/M^(2/n) for sufficiently large values of M. Here, P_∞(error) denotes the probability of error in the infinite-sample limit, and is at most twice the error of a Bayes classifier. Although the value of the coefficient a depends upon the underlying probability distributions, the exponent of M is largely distribution-free. We thus obtain a concise relation between a classifier's ability to generalize from a finite reference sample and the dimensionality of the feature space, as well as an analytic validation of Bellman's well-known "curse of dimensionality."
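The scaling claim above can be illustrated empirically. The following sketch (not from the paper; the toy problem and function names are illustrative assumptions) estimates the 1-NN misclassification rate on a synthetic two-class problem for growing reference-set sizes M, so that the excess error P_M(error) − P_∞(error) can be seen shrinking as M grows:

```python
import random

def sample(n_dim, rng):
    # Toy problem (an assumption, not the paper's setup): class 1 iff the
    # coordinate sum exceeds n_dim/2, with labels flipped with probability
    # 0.1, so the Bayes error is 0.1 and P_inf(error) <= 2 * 0.1 * 0.9.
    x = [rng.random() for _ in range(n_dim)]
    y = 1 if sum(x) > n_dim / 2 else 0
    if rng.random() < 0.1:
        y = 1 - y
    return x, y

def nn_error(M, n_dim, trials=2000, seed=0):
    """Monte Carlo estimate of the 1-NN error with M reference patterns."""
    rng = random.Random(seed)
    ref = [sample(n_dim, rng) for _ in range(M)]
    errors = 0
    for _ in range(trials):
        x, y = sample(n_dim, rng)
        # 1-NN rule: predict the label of the closest reference pattern
        # (squared Euclidean distance; the minimizer is the same).
        _, y_hat = min(ref, key=lambda p: sum((a - b) ** 2
                                              for a, b in zip(p[0], x)))
        errors += (y_hat != y)
    return errors / trials

if __name__ == "__main__":
    for M in (10, 100, 1000):
        print(M, nn_error(M, n_dim=2))
```

For fixed dimension n, plotting the estimated error against M^(-2/n) should be roughly linear for large M, which is the content of the asymptotic expansion; raising n_dim slows the convergence, which is the "curse of dimensionality" the abstract refers to.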