Principles of Bayesian Inference Using General Divergence Criteria

Bibliographic Details
Main Authors: Jack Jewson, Jim Q. Smith, Chris Holmes
Format: Article
Language: English
Published: MDPI AG, 2018-06-01
Series: Entropy
Subjects: Kullback–Leibler divergence; robustness; Bayesian updating; minimum divergence estimation; M-open inference
Online Access: http://www.mdpi.com/1099-4300/20/6/442
ISSN: 1099-4300
Volume/Issue: Entropy 2018, 20(6), Article 442
DOI: 10.3390/e20060442
Author Affiliations: Jack Jewson and Jim Q. Smith (Department of Statistics, University of Warwick, Coventry CV4 7AL, UK); Chris Holmes (Department of Statistics, University of Oxford, Oxford OX1 3LB, UK)

Description
When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data-generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback–Leibler (KL) divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes & Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker & Vidyashankar, 2014; Ghosh & Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of some alternative divergence measure to the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than have previously been considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide some decision-theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can impact the conclusions of simple inference tasks, and then discuss how our methods might apply to more complicated, high-dimensional models.
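
For orientation, a minimal sketch of the framework the abstract builds on; the notation below is assumed for illustration and is not quoted from the record or the paper. General Bayesian updating (Bissiri, Holmes & Walker, 2016) replaces the log-likelihood with a loss \(\ell\) and a calibration weight \(w\), updating beliefs about the parameter \(\theta\) as

\[
  \pi(\theta \mid x_{1:n}) \;\propto\; \pi(\theta)\, \exp\!\Big\{-w \sum_{i=1}^{n} \ell(\theta, x_i)\Big\}.
\]

Choosing \(\ell(\theta, x_i) = -\log f_\theta(x_i)\) targets the parameter minimising the KL-divergence and recovers standard Bayesian updating. One commonly used robust alternative (given here as an assumed example, not as the particular divergence the paper adopts) is the density power (beta-) divergence loss

\[
  \ell_\beta(\theta, x_i) \;=\; -\frac{1}{\beta}\, f_\theta(x_i)^{\beta} \;+\; \frac{1}{\beta+1} \int f_\theta(z)^{\beta+1}\, dz,
\]

which instead targets the minimiser of that divergence and reduces the influence of observations falling in the tails of the fitted density.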