Approximated sensitivity analysis in posterior predictive distribution

Yongku Kim, L. Mark Berliner, Dal Ho Kim

Research output: Contribution to journal › Article › peer-review


Abstract

In Bayesian statistics, a model can be assessed by checking that it fits the data, which is addressed by using the posterior predictive distribution of a discrepancy measure, an extension of the classical test statistic that allows dependence on unknown (nuisance) parameters. Posterior predictive assessment of model fitness allows a more direct evaluation of the discrepancy between the data and the posited model. Sensitivity analysis shows that the effect of the prior on parameter inferences differs from its effect on the marginal density and the posterior predictive distribution. In this paper, we explore the effect of the prior (or posterior) distribution on the corresponding posterior predictive distribution. The approximate sensitivity of the posterior predictive distribution is studied in terms of information measures, including the Kullback-Leibler divergence. As an illustration, we apply these results to a simple spatial model setting.
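To make the objects in the abstract concrete, the following is a minimal sketch using standard definitions; the notation (y for the observed data, y^rep for replicated data, θ for the parameter, π for the prior) is ours, and the paper's Laplace-approximation results are not reproduced here. Under a prior π, the posterior predictive density of replicated data is

    m_{\pi}(y^{\mathrm{rep}} \mid y) = \int p(y^{\mathrm{rep}} \mid \theta)\, \pi(\theta \mid y)\, d\theta,

where \pi(\theta \mid y) \propto p(y \mid \theta)\, \pi(\theta) is the posterior. The sensitivity of this distribution to the choice of prior can then be quantified by the Kullback-Leibler divergence between the posterior predictive densities induced by two candidate priors \pi_1 and \pi_2:

    \mathrm{KL}\bigl(m_{\pi_1} \,\|\, m_{\pi_2}\bigr) = \int m_{\pi_1}(y^{\mathrm{rep}} \mid y)\, \log \frac{m_{\pi_1}(y^{\mathrm{rep}} \mid y)}{m_{\pi_2}(y^{\mathrm{rep}} \mid y)}\, dy^{\mathrm{rep}}.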

Original language: English
Pages (from-to): 261-270
Number of pages: 10
Journal: Journal of the Korean Statistical Society
Volume: 44
Issue number: 2
DOIs
State: Published - Jun 2015

Keywords

  • Bayesian sensitivity
  • Kullback-Leibler divergence
  • Laplace approximation
  • Posterior predictive distribution
