Bayesian Learning for Classification using a Uniform Dirichlet Prior

Published: 01 Jan 2019, Last Modified: 12 May 2023. GlobalSIP 2019.
Abstract: In Bayesian learning, designs based on noninformative priors are appropriate when the user cannot confidently identify the data-generating distribution. While such learners cannot achieve the performance of those based on a well-matched subjective prior, they impart robustness against poor prior selection. The uniform Dirichlet distribution is the true noninformative prior, as it has full support over the space of candidate distributions; in addition, it yields closed-form posteriors. This work applies such a prior to classification under the 0-1 loss, determines the optimal Bayes classifier and the corresponding minimum probability of error, and analyzes the results.
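The abstract does not spell out the construction, but the closed-form posterior it mentions is the standard Dirichlet-categorical update: a uniform Dirichlet(1, ..., 1) prior over a categorical distribution gives the Laplace-smoothed posterior predictive (n_i + 1)/(n + K), and under the 0-1 loss the Bayes classifier picks the class with the largest posterior predictive score. The following minimal Python sketch illustrates that idea for a single categorical feature; the function names and toy data are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: a discrete Bayes classifier with a uniform Dirichlet prior.
# Feature values and class labels are assumed to be integers in {0, ..., K-1}.

def fit_counts(X, y, n_classes, n_values):
    """Count feature-value occurrences per class (single categorical feature)."""
    counts = np.zeros((n_classes, n_values))
    for xi, yi in zip(X, y):
        counts[yi, xi] += 1
    return counts

def predict(x, counts, class_counts):
    """Pick the class maximizing the posterior predictive score.

    With a Dirichlet(1, ..., 1) prior, the posterior predictive for value x
    given class c is (counts[c, x] + 1) / (class_counts[c] + n_values),
    i.e., Laplace's rule of succession; the class prior is likewise smoothed.
    """
    n_classes, n_values = counts.shape
    n_total = class_counts.sum()
    scores = []
    for c in range(n_classes):
        prior_c = (class_counts[c] + 1) / (n_total + n_classes)
        lik_c = (counts[c, x] + 1) / (class_counts[c] + n_values)
        scores.append(prior_c * lik_c)
    return int(np.argmax(scores))  # 0-1 loss: choose the MAP class

# Toy usage
X = np.array([0, 1, 1, 2, 0, 2])
y = np.array([0, 0, 1, 1, 0, 1])
counts = fit_counts(X, y, n_classes=2, n_values=3)
class_counts = counts.sum(axis=1)
print(predict(0, counts, class_counts))  # prints 0: value 0 is seen mostly in class 0
```

The smoothing guarantees every candidate distribution keeps positive posterior mass, which is the robustness-against-poor-prior-selection property the abstract highlights; the paper itself derives the resulting minimum probability of error, which this sketch does not attempt.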