Objective Bayesian variable selection in linear regression model

Sang Gil Kang, Dal Ho Kim, Woo Dong Lee, Yongku Kim

Research output: Contribution to journal › Article › peer-review


Abstract

Variable selection in a regression model with k potential explanatory variables requires choosing a model among the 2^k possible submodels, which is a difficult task when the number of explanatory variables is moderately large. In this study, we propose objective Bayesian variable selection procedures in which the encompassing of the underlying nonnested linear models is crucial. Based on the encompassed models, objective priors for the multiple testing problem involved in variable selection can be defined. The proposed approach provides a considerable reduction in the number of compared models by restricting the posterior search for the right models from 2^k to only k + 1, given k explanatory variables. Furthermore, the consistency of the proposed variable selection procedures was checked, and their performance was examined on real examples and in simulation analyses, comparing against classical and Bayesian procedures that search over all possible submodels.
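The combinatorial reduction described above can be illustrated with a toy enumeration. This is only a sketch of the model-count arithmetic: the fixed inclusion order used to build the nested sequence is a placeholder, not the paper's actual encompassing construction.

```python
from itertools import combinations

def all_submodels(k):
    """All 2**k subsets of k candidate explanatory variables
    (the full search space for exhaustive variable selection)."""
    vars_ = list(range(1, k + 1))
    return [s for r in range(k + 1) for s in combinations(vars_, r)]

def nested_submodels(order):
    """The k + 1 nested models obtained by adding variables one at a
    time in a fixed order (a stand-in for the encompassing-based
    sequence; the real procedure defines this ordering differently)."""
    return [tuple(order[:j]) for j in range(len(order) + 1)]

k = 10
print(len(all_submodels(k)))                          # 2**10 = 1024 models
print(len(nested_submodels(list(range(1, k + 1)))))   # k + 1 = 11 models
```

Even for a moderate k = 10, the exhaustive search space has 1024 models, while the restricted search visits only 11.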

Original language: English
Pages (from-to): 1133-1157
Number of pages: 25
Journal: Journal of Statistical Computation and Simulation
Volume: 92
Issue number: 6
DOIs
State: Published - 2022

Keywords

  • Bayes factor
  • consistency
  • encompassing
  • intrinsic prior
  • linear regression model
  • variable selection

