Abstract
Variable selection in a regression model with k potential explanatory variables requires choosing a model among the 2^k possible submodels, which is a difficult task when the number of explanatory variables is moderately large. In this study, we propose objective Bayesian variable selection procedures in which the encompassing of the underlying nonnested linear models is crucial. Based on the encompassed models, objective priors for the multiple testing problem involved in the variable selection problem can be defined. The proposed approach provides a considerable reduction in the number of compared models by restricting the posterior search for the right models from 2^k to only k + 1, given k explanatory variables. Furthermore, the consistency of the proposed variable selection procedures was checked, and their performance was examined using real examples and simulation analyses, comparing them with classical and Bayesian procedures that search over all possible submodels.
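To give a sense of the scale of this reduction, the following minimal Python sketch (an illustration, not code from the paper) compares the number of candidate models in an exhaustive search over all subsets, 2^k, with the k + 1 models examined under the encompassing-based procedure.

```python
# Illustrative comparison (hypothetical example): exhaustive all-subsets
# search versus the k + 1 models compared under the encompassing approach.

def exhaustive_count(k: int) -> int:
    """Number of submodels when every subset of k candidate variables is considered."""
    return 2 ** k

def encompassing_count(k: int) -> int:
    """Number of models compared after the encompassing-based reduction (k + 1)."""
    return k + 1

for k in (5, 10, 20, 30):
    print(f"k = {k:>2}: exhaustive = {exhaustive_count(k):>13,}  "
          f"encompassing = {encompassing_count(k)}")
```

Already for k = 30 the exhaustive search involves over a billion submodels, whereas the encompassing approach compares only 31.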
| Original language | English |
| --- | --- |
| Pages (from-to) | 1133-1157 |
| Number of pages | 25 |
| Journal | Journal of Statistical Computation and Simulation |
| Volume | 92 |
| Issue number | 6 |
| DOIs | |
| State | Published - 2022 |
Keywords
- Bayes factor
- consistency
- encompassing
- intrinsic prior
- linear regression model
- variable selection