Parallel Multiple Proposal MCMC Algorithms

We explore the variance reduction achievable through parallel implementation of multi-proposal MCMC algorithms and the use of control variates. Implemented sequentially, multi-proposal MCMC algorithms are of limited value, but they are very well suited for parallelization. Further, discarding the rejected states in an MCMC sampler can intuitively be interpreted as a waste of information; this becomes even more true for a multi-proposal algorithm, where several states are discarded in each iteration. By constructing an alternative estimator as a linear combination of the traditional sample mean and zero-mean random variables called control variates, we can improve on the traditional estimator. We present a setting for the multi-proposal MCMC algorithm and study it in two examples. The first example considers sampling from a simple Gaussian distribution, while for the second we design the framework for a multi-proposal mode-jumping algorithm for sampling from a distribution with several separated modes. We find that the variance reduction achieved by our control variate estimator in general increases as the number of proposals in the sampler increases. For the Gaussian example we find that the benefit from parallelization is small, and that little is gained from increasing the number of proposals. The mode-jumping example, however, is very well suited for parallelization, and we obtain a relative variance reduction per unit time of roughly 80% with 16 proposals in each iteration.
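
The sketch below is not from the thesis; it is a minimal illustration of the two ingredients the abstract describes, assuming a one-dimensional standard Gaussian target (as in the thesis's first example). It runs a multiple-try Metropolis sampler, one standard multi-proposal MCMC variant with a symmetric random-walk proposal, and then forms a control-variate-adjusted estimate of E[X^2]. The thesis constructs its control variates from the rejected proposals; here the simpler textbook variate Z = X, whose mean under the target is known to be zero, stands in for that construction.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_target(x):
        # Standard normal target density, up to an additive constant.
        return -0.5 * x ** 2

    def mtm_step(x, n_prop=4, step=2.0):
        # Multiple-try Metropolis with a symmetric random-walk proposal and
        # selection weights proportional to pi(y), a valid weight choice when
        # the proposal density is symmetric.
        ys = x + step * rng.standard_normal(n_prop)      # n_prop candidates
        wy = np.exp(log_target(ys))
        y = rng.choice(ys, p=wy / wy.sum())              # pick one by weight
        # Reference set: n_prop - 1 draws around the chosen y, plus current x.
        xs = np.append(y + step * rng.standard_normal(n_prop - 1), x)
        wx = np.exp(log_target(xs))
        accept = rng.random() < min(1.0, wy.sum() / wx.sum())
        return y if accept else x

    n_iter, burn_in = 20000, 2000
    chain = np.empty(n_iter)
    x = 0.0
    for t in range(n_iter):
        x = mtm_step(x)
        chain[t] = x
    chain = chain[burn_in:]

    # Plain MCMC estimate of E[X^2] (true value 1) and its control-variate correction.
    h = chain ** 2    # quantity of interest
    z = chain         # control variate: E[Z] = 0 under the target
    cov = np.cov(h, z)
    b = -cov[0, 1] / cov[1, 1]    # estimated variance-minimizing coefficient
    print("plain estimate:          ", h.mean())
    print("control-variate estimate:", h.mean() + b * z.mean())

On this toy target both estimators are close to 1; the point, as in the abstract, is that the correction term b * mean(z) removes part of the Monte Carlo noise without biasing the estimate, since Z has known mean zero under the target.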

Bibliographic Details
Main Author: Austad, Haakon Michael
Format: Student thesis (PDF, open access)
Language: English
Published: Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2007
Subjects: ntnudaim:3425; SIF3 fysikk og matematikk (Physics and Mathematics); Industriell matematikk (Industrial Mathematics)
Online Access: http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-12857