Robust Bayesian Regression with Synthetic Posterior Distributions

Bibliographic Details
Main Authors: Shintaro Hashimoto, Shonosuke Sugasawa
Format: Article
Language: English
Published: MDPI AG, 2020-06-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/22/6/661
Description
Summary: Although linear regression models are fundamental tools in statistical science, the estimation results can be sensitive to outliers. While several robust methods have been proposed in frequentist frameworks, statistical inference is not necessarily straightforward. We here propose a Bayesian approach to robust inference on linear regression models using synthetic posterior distributions based on γ-divergence, which enables us to naturally assess the uncertainty of the estimation through the posterior distribution. We also consider the use of shrinkage priors for the regression coefficients to carry out robust Bayesian variable selection and estimation simultaneously. We develop an efficient posterior computation algorithm by adopting the Bayesian bootstrap within Gibbs sampling. The performance of the proposed method is illustrated through simulation studies and applications to famous datasets.
ISSN: 1099-4300
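
As a supplement to the summary above, the following is a minimal, illustrative Python sketch of the kind of γ-divergence-based synthetic log-posterior the abstract describes for Gaussian linear regression. The exact form of the empirical γ-cross-entropy, the weak Normal prior on the coefficients, and the value γ = 0.5 are assumptions made here for illustration; this is not the authors' implementation, which additionally uses shrinkage priors and a Bayesian bootstrap within Gibbs sampling for posterior computation.

import numpy as np

def gamma_synthetic_loglik(beta, sigma, X, y, gam=0.5):
    """gamma-divergence-based synthetic log-likelihood for y ~ N(X beta, sigma^2).

    Assumed form: -n times the empirical gamma-cross-entropy,
        (n/gam) * log( (1/n) * sum_i phi(y_i; x_i'beta, sigma)^gam )
        - (n/(1+gam)) * log( integral of phi(t; 0, sigma)^(1+gam) dt ),
    where the integral has the closed form (2*pi*sigma^2)^(-gam/2) / sqrt(1+gam).
    """
    n = y.shape[0]
    resid = y - X @ beta
    log_phi = -0.5 * np.log(2.0 * np.pi * sigma**2) - 0.5 * (resid / sigma) ** 2
    # log of (1/n) * sum_i phi_i^gam, computed stably via the log-sum-exp trick
    m = np.max(gam * log_phi)
    log_mean_pow = m + np.log(np.mean(np.exp(gam * log_phi - m)))
    log_int_pow = -0.5 * gam * np.log(2.0 * np.pi * sigma**2) - 0.5 * np.log(1.0 + gam)
    return (n / gam) * log_mean_pow - (n / (1.0 + gam)) * log_int_pow

def synthetic_logpost(beta, sigma, X, y, gam=0.5, prior_scale=10.0):
    """Synthetic log-posterior: the gamma-divergence term plus a weak N(0, prior_scale^2)
    prior on beta and a flat prior on sigma > 0 (both priors are illustrative
    assumptions, not the shrinkage priors considered in the paper)."""
    if sigma <= 0:
        return -np.inf
    log_prior = -0.5 * np.sum((beta / prior_scale) ** 2)
    return gamma_synthetic_loglik(beta, sigma, X, y, gam) + log_prior

if __name__ == "__main__":
    # Simulated regression data with a few gross outliers in the response
    rng = np.random.default_rng(0)
    n, p = 100, 3
    X = rng.normal(size=(n, p))
    beta_true = np.array([1.0, -2.0, 0.5])
    y = X @ beta_true + rng.normal(scale=0.5, size=n)
    y[:5] += 10.0  # contaminate the first five observations
    # Observations with large residuals contribute phi^gam close to zero, so they
    # have bounded influence on the synthetic log-posterior.
    print(synthetic_logpost(beta_true, 0.5, X, y, gam=0.5))

A synthetic log-posterior of this kind can be plugged into any generic MCMC routine (for example, random-walk Metropolis); the paper itself develops a more efficient sampler by adopting the Bayesian bootstrap within Gibbs sampling.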