Robust generalised Bayesian inference for intractable likelihoods

T Matsubara, J Knoblauch, FX Briol… - Journal of the Royal Statistical Society Series B: Statistical …, 2022 - academic.oup.com
Abstract
Generalised Bayesian inference updates prior beliefs using a loss function, rather than a likelihood, and can therefore be used to confer robustness against possible mis-specification of the likelihood. Here we consider generalised Bayesian inference with a Stein discrepancy as a loss function, motivated by applications in which the likelihood contains an intractable normalisation constant. In this context, the Stein discrepancy circumvents evaluation of the normalisation constant and produces generalised posteriors that are either available in closed form or accessible using standard Markov chain Monte Carlo. On a theoretical level, we show consistency, asymptotic normality, and bias-robustness of the generalised posterior, highlighting how these properties are affected by the choice of Stein discrepancy. We then present numerical experiments on a range of intractable distributions, including applications to kernel-based exponential family models and non-Gaussian graphical models.
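To make the construction in the abstract concrete, here is a minimal sketch assuming the standard generalised-Bayes form and a kernel Stein discrepancy; the symbols $\beta$, $k_{p_\theta}$ and $\hat{p}_n$ are illustrative notation introduced here, not taken from the source. Writing the model as $p_\theta(x) = \tilde{p}_\theta(x)/Z(\theta)$ with intractable normalising constant $Z(\theta)$, the generalised posterior for data $x_1,\dots,x_n$ has the schematic form
$$
\pi_n(\theta) \;\propto\; \pi(\theta)\,\exp\!\big\{-\beta\, n\, \mathrm{KSD}^2\!\big(p_\theta \,\|\, \hat{p}_n\big)\big\},
\qquad
\mathrm{KSD}^2\!\big(p_\theta \,\|\, \hat{p}_n\big) \;=\; \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n} k_{p_\theta}(x_i, x_j),
$$
where $\hat{p}_n$ is the empirical distribution of the data and $\beta>0$ is a learning rate. The Stein kernel $k_{p_\theta}$ depends on the model only through the score $\nabla_x \log p_\theta(x) = \nabla_x \log \tilde{p}_\theta(x)$, so $Z(\theta)$ is never evaluated; this is the sense in which the Stein discrepancy circumvents the normalisation constant.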
Oxford University Press

[PDF] example.edu/paper.pdf