Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

Date: June 9, 2019
Authors: Alessandro Davide Ialongo (University of Cambridge), Mark van der Wilk (Imperial College London), James Hensman, Carl Edward Rasmussen (University of Cambridge & Secondmind)

We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has, so far, either employed computationally intensive MCMC methods or relied on factorisations of the variational posterior. As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms. We eliminate this factorisation by explicitly modelling the dependence between the states and the low-rank representation of our Gaussian process posterior. Samples of the latent states can then be tractably generated by conditioning on this representation. The method we obtain gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods.
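
To make the coupling concrete, here is a minimal NumPy sketch (not the authors' code) of the idea described above: rather than factorising the approximate posterior over latent states and transition function, we first sample the low-rank (inducing-point) representation u of the GP transition, then roll out the latent states conditioned on that sample. The kernel choice, inducing inputs Z, and variational parameters q_mu, q_sqrt below are illustrative placeholders, not values from the paper.

```python
# Sketch: sampling a latent trajectory by conditioning on the inducing
# representation of the GP transition function (so states and transition
# remain coupled, unlike a mean-field posterior). Assumed/illustrative
# names: rbf, Z, q_mu, q_sqrt, sample_trajectory.

import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Inducing inputs and a (hypothetical) variational distribution q(u).
M, D = 20, 1                          # number of inducing points, state dimension
Z = np.linspace(-3.0, 3.0, M)[:, None]  # inducing inputs
q_mu = np.sin(Z)                      # variational mean (placeholder values)
q_sqrt = 0.1 * np.eye(M)              # Cholesky factor of the variational covariance

Kzz = rbf(Z, Z) + 1e-6 * np.eye(M)
Lz = np.linalg.cholesky(Kzz)

def sample_trajectory(x0, T, process_noise=0.05):
    """Sample one latent trajectory: draw u ~ q(u) once, then condition the
    GP transition on (Z, u) at every step, so the whole trajectory depends
    on the sampled transition function."""
    u = q_mu + q_sqrt @ rng.standard_normal((M, D))  # one sample of the inducing outputs
    x = np.atleast_2d(x0)
    traj = [x]
    for _ in range(T):
        Kxz = rbf(x, Z)
        # Sparse-GP conditional mean of f(x) given u (conditional covariance
        # omitted here for brevity): Kxz Kzz^{-1} u
        A = np.linalg.solve(Lz.T, np.linalg.solve(Lz, Kxz.T))
        f_mean = A.T @ u
        x = f_mean + process_noise * rng.standard_normal((1, D))
        traj.append(x)
    return np.concatenate(traj, axis=0)

print(sample_trajectory(x0=np.array([0.5]), T=10))
```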

View the paper
