Discriminative Densities from Maximum Contrast Estimation

NIPS 2002 (modified: 11 Nov 2022)
Abstract: We propose a framework for classifier design based on discriminative densities, which represent the differences between the class-conditional distributions in a way that is optimal for classification. The densities are selected from a parametrized set by constrained maximization of an objective function which measures the average (bounded) difference, i.e. the contrast, between discriminative densities. We show that maximization of the contrast is equivalent to minimization of an approximation of the Bayes risk. Therefore, using suitable classes of probability density functions, the resulting maximum contrast classifiers (MCCs) can approximate the Bayes rule for the general multiclass case. In particular, for a certain parametrization of the density functions we obtain MCCs which have the same functional form as the well-known Support Vector Machines (SVMs). We show that MCC training in general requires nonlinear optimization, but under certain conditions the problem is concave and can be tackled by a single linear program. We indicate the close relation between SVM and MCC training, and in particular we show that Linear Programming Machines can be viewed as an approximate realization of MCCs. In experiments on benchmark data sets, the MCC shows competitive classification performance.
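The decision rule and the contrast objective described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes per-class densities modeled as kernel expansions with a fixed Gaussian bandwidth, and uses uniform mixing weights where the actual MCC would select the weights by constrained (e.g. linear-programming) maximization of the contrast. The names `mcc_predict` and `empirical_contrast` are hypothetical.

```python
import numpy as np

def gauss_kernel(x, centers, h):
    # Isotropic Gaussian kernel values between point x and each center.
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * h ** 2))

def class_densities(x, data_by_class, weights_by_class, h):
    # p_k(x) = sum_i w_ki * K(x, x_ki), one parametrized density per class.
    return np.array([w @ gauss_kernel(x, X, h)
                     for X, w in zip(data_by_class, weights_by_class)])

def mcc_predict(x, data_by_class, weights_by_class, h):
    # Bayes-style rule: assign x to the class with the largest
    # discriminative density (assumption: equal class priors).
    return int(np.argmax(class_densities(x, data_by_class, weights_by_class, h)))

def empirical_contrast(data_by_class, weights_by_class, h, bound=1.0):
    # Average bounded difference between the true-class density and the
    # strongest rival class; the MCC objective maximizes this over the weights.
    total, n = 0.0, 0
    for k, X in enumerate(data_by_class):
        for x in X:
            p = class_densities(x, data_by_class, weights_by_class, h)
            rival = max(p[j] for j in range(len(p)) if j != k)
            total += min(p[k] - rival, bound)
            n += 1
    return total / n

# Toy usage: two well-separated clusters, uniform weights per class.
rng = np.random.default_rng(0)
X0 = rng.normal(-2.0, 0.3, size=(20, 2))
X1 = rng.normal(2.0, 0.3, size=(20, 2))
data = [X0, X1]
weights = [np.full(20, 1 / 20), np.full(20, 1 / 20)]
```

Maximizing `empirical_contrast` over the per-class weights (under normalization constraints) is what, per the abstract, becomes a concave problem solvable by a single linear program for certain density parametrizations.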

