Approximated information analysis in Bayesian inference

Jung In Seo, Yongku Kim

Research output: Contribution to journal › Article › peer-review

Abstract

In models with nuisance parameters, Bayesian procedures based on Markov chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures involve computationally burdensome MCMC runs, the quality of the approximation and the convergence of the sampler are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The sensitivity of the approximated posterior distribution of interest is studied in terms of information measures, including the Kullback-Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
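
The abstract leaves the construction implicit, but the keywords point to a Gibbs sampler in which a Laplace approximation stands in for an exact full conditional of the nuisance parameter, with Kullback-Leibler divergence quantifying the resulting perturbation of the posterior of interest. The Python sketch below illustrates that idea on a toy normal model; it is an assumption-laden illustration, not the paper's method, and the model, priors, hyperparameters (a, b), and data-generating values are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(mu, sigma^2); mu is the parameter of interest,
# sigma^2 the nuisance parameter (flat prior on mu, IG(a, b) on sigma^2).
# All values here are illustrative assumptions, not from the paper.
y = rng.normal(loc=2.0, scale=1.5, size=50)
n, ybar = y.size, y.mean()
a, b = 2.0, 1.0  # assumed inverse-gamma prior hyperparameters


def gibbs(n_iter=20000, approx=False):
    """Gibbs sampler for (mu, sigma^2); if approx=True, the exact
    inverse-gamma full conditional of sigma^2 is replaced by a Laplace
    (normal-on-the-log-scale) approximation."""
    sig2 = y.var()
    draws = np.empty(n_iter)
    for t in range(n_iter):
        # Exact full conditional of mu: N(ybar, sigma^2 / n)
        mu = rng.normal(ybar, np.sqrt(sig2 / n))
        A = a + n / 2.0
        B = b + 0.5 * np.sum((y - mu) ** 2)
        if approx:
            # On phi = log(sigma^2), the full conditional density is
            # proportional to exp(-A*phi - B*exp(-phi)); its mode is
            # log(B/A) and the curvature there is A, giving the Laplace
            # approximation phi ~ N(log(B/A), 1/A).
            phi = rng.normal(np.log(B / A), 1.0 / np.sqrt(A))
            sig2 = np.exp(phi)
        else:
            # Exact inverse-gamma(A, B) draw via 1 / Gamma(A, scale=1/B)
            sig2 = 1.0 / rng.gamma(A, 1.0 / B)
        draws[t] = mu
    return draws[n_iter // 2:]  # discard burn-in


exact_mu = gibbs(approx=False)
approx_mu = gibbs(approx=True)


def kl_normal(m1, s1, m2, s2):
    """Closed-form KL(N(m1, s1^2) || N(m2, s2^2))."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5


# Gaussian summaries of the two marginal posteriors of mu, then the
# KL divergence between them as a crude sensitivity measure.
kl = kl_normal(exact_mu.mean(), exact_mu.std(), approx_mu.mean(), approx_mu.std())
print(f"KL(exact || approximate) = {kl:.2e}")
```

Because the Laplace approximation here is accurate, the printed divergence is small; degrading the approximation (for instance, inflating its variance) makes the divergence grow, which is the kind of Gibbs sensitivity the abstract describes.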

Original language: English
Pages (from-to): 1441-1451
Number of pages: 11
Journal: Entropy
Volume: 17
Issue number: 3
DOIs
State: Published - 2015

Keywords

  • Bayesian sensitivity
  • Gibbs sampler
  • Kullback-Leibler divergence
  • Laplace approximation
