Deep Gaussian processes and variational propagation of uncertainty
Uncertainty propagation across components of complex probabilistic models is vital for improving regularisation. Unfortunately, for many interesting models based on non-linear Gaussian processes (GPs), straightforward propagation of uncertainty is computationally and mathematically intractable. This thesis is concerned with solving this problem through developing novel variational inference approaches. From a modelling perspective, a key contribution of the thesis is the development of deep Gaussian processes (deep GPs). Deep GPs generalise several interesting GP-based models and, hence, motivate the development of uncertainty propagation techniques. In a deep GP, each layer is modelled as the output of a multivariate GP whose inputs are governed by another GP. The resulting model is no longer a GP but can learn much richer interactions between data. In contrast to other deep models, all the uncertainty in parameters and latent variables is marginalised out, and both supervised and unsupervised learning are handled. Two important special cases of a deep GP can equivalently be seen as its building components and, historically, were developed as such. Firstly, the variational GP-LVM is concerned with propagating uncertainty in Gaussian process latent variable models; any observed inputs (e.g. temporal) can also be used to correlate the latent-space posteriors. Secondly, this thesis develops manifold relevance determination (MRD), which considers a common latent space for multiple views. An adapted variational framework allows for strong model regularisation, so that rich latent-space representations can be learned.

The developed models are also equipped with algorithms that maximise the information communicated between their different stages using uncertainty propagation, achieving improved learning when partially observed values are present. The developed methods are demonstrated in experiments with simulated and real data. The results show that the developed variational methodologies improve practical applicability by enabling automatic capacity control in the models, even when data are scarce.

Main Author: | Damianou, Andreas |
---|---|
Other Authors: | Lawrence, Neil |
Published: | University of Sheffield, 2015 |
Subjects: | 519.2 |
Online Access: | http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.665042 ; http://etheses.whiterose.ac.uk/9968/ |
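The hierarchical construction described in the abstract (each layer is the output of a GP whose inputs are governed by another GP) can be illustrated by drawing a sample from a two-layer deep GP prior. The NumPy sketch below is illustrative only and not code from the thesis; the RBF kernel, unit lengthscale, single hidden layer, and jitter value are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, variance=1.0, lengthscale=1.0):
    # Squared-exponential (RBF) covariance between the rows of X.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sample_gp_layer(X, rng, jitter=1e-6):
    # Draw one function sample f ~ GP(0, k) evaluated at the inputs X.
    # Jitter on the diagonal keeps the Cholesky factorisation stable.
    K = rbf_kernel(X) + jitter * np.eye(X.shape[0])
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((X.shape[0], 1))

rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 100)[:, None]  # observed inputs at the top of the hierarchy
H = sample_gp_layer(X, rng)               # hidden layer: a GP draw indexed by X
Y = sample_gp_layer(H, rng)               # output layer: a GP draw indexed by H
# Y is a draw from a two-layer deep GP prior; as a function of X it is
# no longer Gaussian, reflecting the abstract's point that the composed
# model is not itself a GP.
```

Propagating uncertainty through such a hierarchy at inference time (rather than through single samples as here) is exactly what the thesis's variational machinery addresses.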