Abstract
In Bayesian statistics, a model can be assessed by checking that it fits the data. This check is carried out through the posterior predictive distribution of a discrepancy, an extension of a classical test statistic that is allowed to depend on unknown (nuisance) parameters. Posterior predictive assessment of model fitness permits a more direct evaluation of the discrepancy between the data and the posited model. Sensitivity analysis reveals that the effect of the prior on parameter inferences differs from its effect on the marginal density and the posterior predictive distribution. In this paper, we explore the effect of the prior (or posterior) distribution on the corresponding posterior predictive distribution. The approximate sensitivity of the posterior predictive distribution is studied in terms of information measures, including the Kullback-Leibler divergence. As an illustration, we apply these results to simple spatial model settings.
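The abstract's central idea, that the posterior predictive distribution can be less sensitive to the prior than the posterior itself, can be illustrated in a minimal sketch. The example below is not from the paper: it assumes a conjugate normal model with known variance and two hypothetical priors, and compares the closed-form Kullback-Leibler divergence between the resulting posterior predictive distributions.

```python
import math

def posterior_predictive(y, sigma2, mu0, tau02):
    """Posterior predictive of a new observation for the normal model
    y_i ~ N(theta, sigma2), prior theta ~ N(mu0, tau02).
    Returns the predictive mean and variance (both normal)."""
    n = len(y)
    ybar = sum(y) / n
    tau_n2 = 1.0 / (1.0 / tau02 + n / sigma2)          # posterior variance
    mu_n = tau_n2 * (mu0 / tau02 + n * ybar / sigma2)  # posterior mean
    return mu_n, sigma2 + tau_n2                       # predictive mean, variance

def kl_normal(m1, v1, m2, v2):
    """KL( N(m1, v1) || N(m2, v2) ) in closed form."""
    return 0.5 * (math.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

# Illustrative data and two priors differing in location.
y = [1.2, 0.8, 1.5, 1.1, 0.9]
m_a, v_a = posterior_predictive(y, sigma2=1.0, mu0=0.0, tau02=1.0)  # baseline prior
m_b, v_b = posterior_predictive(y, sigma2=1.0, mu0=2.0, tau02=1.0)  # perturbed prior

# KL divergence between the two posterior predictive distributions:
print(kl_normal(m_a, v_a, m_b, v_b))
```

Because the predictive variance adds the (large) observation variance to the (small) posterior variance, the mean shift induced by the prior perturbation is diluted, so the predictive KL divergence is noticeably smaller than the corresponding divergence between the two posteriors themselves, in line with the abstract's observation.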
| Original language | English |
|---|---|
| Pages (from-to) | 261-270 |
| Number of pages | 10 |
| Journal | Journal of the Korean Statistical Society |
| Volume | 44 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jun 2015 |
Keywords
- Bayesian sensitivity
- Kullback-Leibler divergence
- Laplace approximation
- Posterior predictive distribution