A Comparison of Objective Bayes Factors for Variable Selection in Linear Regression Models

Published: 2013 · Last Modified: 09 Oct 2024 · Statistical Models for Data Analysis 2013 · License: CC BY-SA 4.0
Abstract: This paper deals with the variable selection problem in linear regression models and its solution by means of Bayes factors. If substantive prior information is lacking or impractical to elicit, which is often the case in applications, objective Bayes factors come into play. These can be obtained by different methods, including those based on Zellner–Siow priors, fractional Bayes factors, and intrinsic priors. The paper reviews such methods and investigates their finite-sample ability to identify the simplest model supported by the data, introducing the notion of full discrimination power. The results obtained are relevant to structural learning of Gaussian DAG models, where large spaces of sets of recursive linear regressions must be explored.
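The paper's own methods (Zellner–Siow, fractional, and intrinsic-prior Bayes factors) are not reproduced here, but the flavour of Bayes-factor variable selection can be sketched with the standard BIC approximation, BF ≈ exp((BIC_large − BIC_small)/2), which favours the simplest adequate model via the log(n) complexity penalty. Everything in the snippet (the `bic_linear` helper, the simulated data) is illustrative, not the paper's procedure:

```python
import numpy as np

def bic_linear(y, X):
    """BIC of an OLS fit: n*log(RSS/n) + k*log(n) (up to an additive constant)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

# Build a spurious predictor orthogonal to the intercept, x1 and y, so it
# carries exactly zero signal (this makes the demo deterministic).
Q, _ = np.linalg.qr(np.column_stack([np.ones(n), x1, y]))
z = rng.normal(size=n)
x2 = z - Q @ (Q.T @ z)

X_true = np.column_stack([np.ones(n), x1])       # the simpler, adequate model
X_full = np.column_stack([np.ones(n), x1, x2])   # adds the useless predictor

# BIC approximation to the Bayes factor of the simpler model against the
# larger one; values > 1 favour the simpler model.
bf = np.exp((bic_linear(y, X_full) - bic_linear(y, X_true)) / 2)
print(bf)  # here exactly sqrt(n) = 10, since x2 reduces the RSS by nothing
```

Because the extra predictor is orthogonal to everything, the two fits have identical residual sums of squares and the Bayes factor reduces to exp(log(n)/2) = √n, illustrating how the penalty alone drives selection toward the simplest supported model.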