2005 VariationalMessagePassing

From GM-RKB

Subject Headings: Variational Message Passing, Bayesian Inference.

Notes

Cited By

~132 http://scholar.google.com/scholar?cites=2301125952311798852

Quotes

Abstract

  • Bayesian inference is now widely established as one of the principal foundations for machine learning. In practice, exact inference is rarely possible, and so a variety of approximation techniques have been developed, one of the most widely used being a deterministic framework called variational inference. In this paper we introduce Variational Message Passing (VMP), a general purpose algorithm for applying variational inference to Bayesian Networks. Like belief propagation, VMP proceeds by sending messages between nodes in the network and updating posterior beliefs using local operations at each node. Each such update increases a lower bound on the log evidence (unless already at a local maximum). In contrast to belief propagation, VMP can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation. Furthermore, by introducing additional variational parameters, VMP can be applied to models containing non-conjugate distributions. The VMP framework also allows the lower bound to be evaluated, and this can be used both for model comparison and for detection of convergence. Variational message passing has been implemented in the form of a general purpose inference engine called VIBES ('Variational Inference for BayEsian networkS') which allows models to be specified graphically and then solved variationally without recourse to coding.
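The factorised variational scheme the abstract describes can be illustrated with a minimal sketch for a univariate Gaussian with unknown mean and precision under independent conjugate-exponential priors (Normal on the mean, Gamma on the precision). This is not the paper's VIBES engine; the priors, data, and parameter names below are illustrative assumptions, and each loop iteration plays the role of the local message updates to q(mu) and q(tau).

```python
import math
import random

# Illustrative synthetic data from N(mu=2, precision tau=4)
random.seed(0)
true_mu, true_tau = 2.0, 4.0
N = 2000
x = [random.gauss(true_mu, 1.0 / math.sqrt(true_tau)) for _ in range(N)]
sx = sum(x)                      # sufficient statistic: sum of x
sxx = sum(v * v for v in x)      # sufficient statistic: sum of x^2

# Broad (weakly informative) conjugate priors, chosen for illustration:
#   mu  ~ Normal(m0, 1/beta0),   tau ~ Gamma(a0, b0)
m0, beta0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

# Factorised variational posterior q(mu) q(tau) = N(m, 1/beta) Gamma(a, b).
# The shape a is fixed after its first update; initialise E[tau] = 1.
a = a0 + N / 2.0
m, beta, b, E_tau = m0, beta0, b0, 1.0

for _ in range(50):
    # "Message" to q(mu): combines the prior with the expected
    # likelihood natural parameters, which depend only on E[tau].
    beta = beta0 + N * E_tau
    m = (beta0 * m0 + E_tau * sx) / beta
    # "Message" to q(tau): uses the moments E[mu] = m and
    # E[mu^2] = m^2 + 1/beta of the current q(mu).
    b = b0 + 0.5 * (sxx - 2.0 * m * sx + N * (m * m + 1.0 / beta))
    E_tau = a / b

print(m, E_tau)  # posterior mean of mu and posterior mean of tau
```

With weak priors the fixed point recovers the sample mean and (approximately) the inverse sample variance, which is a quick sanity check that the coordinate updates converged.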

6.3 Learning Non-conjugate Priors by Sampling

  • The resulting hybrid variational/sampling framework would, to a certain extent, capture the advantages of both techniques.

7. Discussion

  • The variational message passing algorithm allows approximate inference using a factorised variational distribution in any conjugate-exponential model, and in a range of non-conjugate models. As a demonstration of its utility, this algorithm has already been used to solve problems in the domain of machine vision and bioinformatics (see Winn, 2003; Bishop and Winn, 2000). In general, variational message passing dramatically simplifies the construction and testing of new variational models and readily allows a range of alternative models to be tested on a given problem.

References


  • (Winn & Bishop, 2005) ⇒ John Winn, and Christopher M. Bishop. (2005). "Variational Message Passing." In: Journal of Machine Learning Research, 6. http://www.gatsby.ucl.ac.uk/~turner/workshop/GBPpapers/Winn 2004.pdf