Ole Winther, Associate Professor, Technical University of Denmark
Edgeworth expansions for improved inference in Gaussian latent variable models
Tuesday 7.12.2010 at 14.15
Location: BECS, F-building, 3rd floor coffee room, Rakentajanaukio 2, Otaniemi, Espoo
Ole Winther is Associate Professor in Intelligent Signal Processing, Informatics and Mathematical
Modelling (IMM), Technical University of Denmark
(DTU), and joint group leader of Gene Regulation, Bioinformatics,
University of Copenhagen. His research interests include machine learning,
approximate Bayesian inference with deterministic methods such as Expectation Propagation,
bioinformatics with specific focus on promoter analysis and gene regulation,
classification with Gaussian processes, independent component analysis (ICA),
low rank matrix factorization for collaborative filtering, non-parametric Bayes,
sparse linear latent variable models and non-linear state space models.
Abstract of the talk:
Reliable estimation of marginal likelihoods, predictive or marginal distributions, and moments is crucial for the practical application of Bayesian inference. Expectation propagation (EP) is part of a rich family of variational methods, which approximate the sums and integrals required for exact probabilistic inference by an optimization problem. EP has proven to be especially well-suited for Gaussian latent variable models, as exemplified by the high accuracy obtained in classification with Gaussian processes.
EP can be considered as the zeroth order approximation in a specific cumulant (Edgeworth) expansion, and has the pleasing property amongst variational methods that at the stationary point of the marginal likelihood approximation, it is exact up to the second order cumulants. In the talk I will show how to systematically compute higher order cumulant corrections to EP approximations of marginal likelihoods and moments. The corrections incorporate the remaining non-Gaussian cumulants that are neglected when tractable approximations to latent Gaussian models are made. For two types of Gaussian latent variable models, a Gaussian process classification model and a binary Bayes network with quadratic interactions (an Ising model), I will show how the EP approximation can be significantly improved. Finally, I will discuss how the structure of the variational approximation can be chosen within different tractable families to give factorized and tree-structured approximations.
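To give a feel for the kind of cumulant correction involved (this is a toy illustration of an Edgeworth expansion, not the EP corrections of the talk), the sketch below compares a plain Gaussian approximation with a first-order Edgeworth correction for the standardized sum of n iid Exp(1) variables. The exact distribution is Gamma(n, 1), the Gaussian is the "zeroth order" term, and the third-cumulant (skewness) term supplies the first correction; all function names here are illustrative.

```python
import math

def exact_pdf(s, n):
    """Exact pdf of the standardized sum of n iid Exp(1) variables.

    The sum is Gamma(n, 1); standardize with mean n and std sqrt(n).
    """
    x = n + math.sqrt(n) * s
    if x <= 0:
        return 0.0
    log_pdf = (n - 1) * math.log(x) - x - math.lgamma(n)
    return math.sqrt(n) * math.exp(log_pdf)

def gaussian_pdf(s):
    """Zeroth order (Gaussian) approximation: the standard normal pdf."""
    return math.exp(-0.5 * s * s) / math.sqrt(2.0 * math.pi)

def edgeworth_pdf(s, n):
    """First-order Edgeworth correction: phi(s) * (1 + kappa3/6 * He3(s)).

    kappa3 = 2/sqrt(n) is the third standardized cumulant (skewness) of
    the sum; He3(s) = s**3 - 3*s is the third Hermite polynomial.
    """
    kappa3 = 2.0 / math.sqrt(n)
    he3 = s ** 3 - 3.0 * s
    return gaussian_pdf(s) * (1.0 + kappa3 / 6.0 * he3)
```

For n = 10 at s = 1 the corrected density lands much closer to the exact Gamma value than the Gaussian does, mirroring the talk's theme that retaining higher cumulants systematically tightens a Gaussian-based approximation.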
Joint work with Ulrich Paquet and Manfred Opper.