State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

Jul 12, 2020

About

We formulate expectation propagation (EP), a state-of-the-art method for approximate Bayesian inference, as a nonlinear Kalman smoother, showing that it generalises a wide class of classical smoothing algorithms. Specifically, we show how power EP recovers the Extended and Unscented Kalman smoothers, with the distinction between the two being the choice of method for performing moment matching. EP provides benefits over the traditional methods through the introduction of the so-called cavity distribution and by allowing fractional updates. We combine these benefits with the computational efficiency of Kalman smoothing, and provide extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework. The resulting schemes enable inference in Gaussian process models with time complexity linear in the number of data points, making them ideal for large temporal and spatio-temporal scenarios. Our results show that an extension of the Extended Kalman filter in which the linearisations are iteratively refined via EP-style updates is both efficient and performant, whilst its ease of implementation makes it a convenient plug-and-play approach to many non-conjugate regression and classification problems.
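To make the linear-time machinery mentioned in the abstract more concrete, below is a minimal, self-contained sketch in Python/NumPy of the Kalman filter and Rauch-Tung-Striebel (RTS) smoother that such schemes build on. This is not the authors' implementation: the Gaussian-likelihood toy model, the function names, and all parameter values are illustrative assumptions. In the talk's framework, the Gaussian measurement update would instead be an EP-style moment-matching update against a cavity distribution, iterated to refine the local linearisations for non-Gaussian likelihoods.

# Minimal sketch (not the authors' code): linear-Gaussian Kalman filtering and
# RTS smoothing, the O(n) forward-backward recursion underlying the schemes
# described in the talk. With a Gaussian likelihood a single pass is exact;
# for non-Gaussian likelihoods the update step below would be replaced by
# EP-style moment matching against a cavity distribution and iterated.
import numpy as np

def kalman_filter(y, A, Q, H, R, m0, P0):
    """Forward pass: filtering distributions p(x_t | y_1:t)."""
    n, d = len(y), m0.shape[0]
    ms, Ps = np.zeros((n, d)), np.zeros((n, d, d))
    m, P = m0, P0
    for t in range(n):
        # Predict step
        m = A @ m
        P = A @ P @ A.T + Q
        # Update step (Gaussian likelihood; EP would moment-match a
        # non-Gaussian likelihood against the cavity here)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        m = m + K @ (y[t] - H @ m)
        P = P - K @ S @ K.T
        ms[t], Ps[t] = m, P
    return ms, Ps

def rts_smoother(ms, Ps, A, Q):
    """Backward pass: smoothing distributions p(x_t | y_1:n)."""
    n, d = ms.shape
    sm, sP = ms.copy(), Ps.copy()
    for t in range(n - 2, -1, -1):
        Pp = A @ Ps[t] @ A.T + Q                # predicted covariance
        G = Ps[t] @ A.T @ np.linalg.inv(Pp)     # smoother gain
        sm[t] = ms[t] + G @ (sm[t + 1] - A @ ms[t])
        sP[t] = Ps[t] + G @ (sP[t + 1] - Pp) @ G.T
    return sm, sP

if __name__ == "__main__":
    # Toy 1D random-walk example with illustrative values only.
    rng = np.random.default_rng(0)
    A = np.array([[1.0]]); Q = np.array([[0.1]])
    H = np.array([[1.0]]); R = np.array([[0.5]])
    x = np.cumsum(rng.normal(0.0, np.sqrt(Q[0, 0]), 100))
    y = (x + rng.normal(0.0, np.sqrt(R[0, 0]), 100)).reshape(-1, 1)
    ms, Ps = kalman_filter(y, A, Q, H, R, np.zeros(1), np.eye(1))
    sm, sP = rts_smoother(ms, Ps, A, Q)
    print("smoothed posterior mean at final step:", sm[-1])

Because both passes touch each time step only once, the cost grows linearly with the number of observations, which is the property the abstract highlights for large temporal and spatio-temporal problems.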

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
