Variational Bayesian inference for a non-linear forward model


Disclaimer: The code links provided for this paper are external. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its authors.


Authors M. Chappell, A. Groves, B. Whitcher, M. Woolrich
Journal/Conference Name IEEE Transactions on Signal Processing
Paper Category
Paper Abstract Variational Bayes (VB) has been proposed as a method to facilitate calculation of the posterior distributions for linear models, providing fast Bayesian inference by estimating the parameters of a factorized approximation to the posterior distribution. Here, a VB method for nonlinear forward models with Gaussian additive noise is presented. In the case of noninformative priors, the parameter estimates obtained from this VB approach are identical to those found via nonlinear least squares. However, the advantage of the VB method lies in its Bayesian formulation, which permits prior information to be included in a hierarchical structure and measures of uncertainty for all parameter estimates to be obtained via the posterior distribution. Unlike other Bayesian methods, VB is only approximate in comparison with the sampling method of MCMC. However, the VB method is found to be comparable, and the assumptions made about the form of the posterior distribution are found to be reasonable. Practically, the VB approach is substantially faster than MCMC, as fewer calculations are required. Some of the advantages of the fully Bayesian nature of the method are demonstrated through the extension of the noise model and the inclusion of automatic relevance determination (ARD) within the VB algorithm. (An illustrative sketch of the model setup is given below the listing.)
Date of publication 2009
Code Programming Language Shell
Comment
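As a rough illustration of the setup described in the abstract (not code from the paper's repository, which is in Shell, and not the paper's exact notation), the nonlinear forward model with Gaussian additive noise and the factorized posterior approximation underlying VB can be sketched as below. The symbols y, g, theta (model parameters), phi (noise precision) and the mean-field factorization q(theta)q(phi) are notational choices made here for illustration.

% Hedged sketch of the generative model and the mean-field (factorized)
% posterior approximation that VB optimizes; notation is illustrative only.
\begin{align}
  y &= g(\theta) + e, \qquad e \sim \mathcal{N}\!\left(0,\ \phi^{-1} I\right)
    && \text{nonlinear forward model with noise precision } \phi \\
  p(\theta, \phi \mid y) &\approx q_{\theta}(\theta)\, q_{\phi}(\phi)
    && \text{factorized approximation to the posterior} \\
  F &= \iint q_{\theta}(\theta)\, q_{\phi}(\phi)\,
       \log \frac{p(y \mid \theta, \phi)\, p(\theta)\, p(\phi)}
                 {q_{\theta}(\theta)\, q_{\phi}(\phi)}\, d\theta\, d\phi
    && \text{free energy maximized by the VB update equations}
\end{align}

With a noninformative prior on theta, maximizing F pushes the mean of q_theta toward the nonlinear least-squares solution, which is consistent with the equivalence noted in the abstract.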
