Seminar: Vincent Adam - Secondmind & Aalto University

Date: January 14, 2021
Author: Hrvoje Stojic

Sparse methods for Markovian GPs


Gaussian processes (GPs) provide rich priors for time-series models. Markovian GPs with one-dimensional inputs have an equivalent representation as stochastic differential equations (SDEs), whose structure allows fast approximate inference algorithms to be derived. Their typical computational complexity scales linearly with the number of data points, O(N), but the computations are inherently sequential. Using inducing states of the SDE to support a sparse GP approximation to the posterior process yields further computational savings by making the O(N) computation parallelizable. I will present several approximate inference algorithms based on this sparse approximation, including Laplace, expectation propagation, and variational inference, and discuss their performance guarantees and comparative advantages.
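To make the linear-time claim concrete, here is a minimal sketch (not the speaker's code) of the SDE view for the simplest Markovian kernel, Matérn-1/2, whose state-space form is the Ornstein–Uhlenbeck process. A Kalman filter over this representation computes the GP posterior at the data points in a single O(N) sequential sweep; all parameter names below are assumptions for illustration.

```python
import numpy as np

def kalman_filter_matern12(t, y, lengthscale=1.0, variance=1.0, noise=0.1):
    """O(N) filtering for a GP with a Matern-1/2 (OU) kernel, k(r) =
    variance * exp(-|r| / lengthscale), via its exact SDE discretization.
    Illustrative sketch only; returns filtering means and variances."""
    N = len(t)
    m, P = 0.0, variance              # stationary prior as the initial state
    means, covs = np.empty(N), np.empty(N)
    prev_t = t[0]
    for k in range(N):
        dt = t[k] - prev_t
        A = np.exp(-dt / lengthscale)  # exact discrete-time transition
        Q = variance * (1.0 - A**2)    # matching process-noise variance
        m, P = A * m, A**2 * P + Q     # predict step
        S = P + noise                  # innovation variance
        K = P / S                      # Kalman gain
        m = m + K * (y[k] - m)         # update step
        P = (1.0 - K) * P
        means[k], covs[k] = m, P
        prev_t = t[k]
    return means, covs
```

Because Matérn-1/2 is exactly Markovian, the filtering distribution at the final time point coincides with the full GP posterior there, at O(N) cost instead of the O(N³) of a dense solve; a backward smoothing pass would recover the full posterior marginals at every point.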


  • Vincent Adam is a Senior Machine Learning Researcher at Secondmind and a Postdoctoral Researcher at Aalto University. His website can be found here.

Related articles

Seminar: Arno Solin - Aalto University

Seminar: Laurence Aitchison - University of Bristol

Seminar: Arthur Gretton - University College London

Seminar: Alexandra Gessner - University of Tuebingen