Seminar: Pablo Moreno-Muñoz - Technical University of Denmark

Date: June 23, 2022
Author: Hrvoje Stojic

Model Recycling with Gaussian Processes


Gaussian processes are well known for their modelling flexibility and robustness to overfitting, but also for their prohibitive computational cost in training and prediction. In practice, modern sparse approximations circumvent this problem via inducing inputs and variational inference. However, the utility of inducing points goes beyond scalability. In this talk, I will present a new framework for transfer learning based on the idea of model recycling. Crucially, the use of inducing points and variational posteriors makes it possible to perform two intertwined tasks with pre-trained GP models: i) building meta-models from models and ii) updating models with new data, without revisiting any sample. The framework avoids undesired data centralisation, reduces computational cost and allows the transfer of uncertainty estimates after training. The method exploits the augmentation of high-dimensional integral operators based on the Kullback-Leibler divergence between stochastic processes. This yields efficient lower bounds for all pre-trained sparse variational GPs, even when they differ in complexity and likelihood model. In the talk, I will also show experimental results for two scenarios: building meta-models from a collection of already-fitted GPs, and continual inference on streaming data.
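To make the inducing-point machinery behind the talk concrete, here is a minimal NumPy sketch of a sparse GP predictive mean computed from M inducing inputs instead of the full N x N kernel matrix (a subset-of-regressors style approximation, not the speaker's actual method or code; the names `rbf`, `sparse_gp_mean` and `Z` are illustrative assumptions):

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_mean(Xs, X, y, Z, noise=0.1):
    # Predictive mean at test inputs Xs using M inducing inputs Z.
    # Cost is O(N M^2) rather than the O(N^3) of an exact GP.
    Kzz = rbf(Z, Z)
    Kzx = rbf(Z, X)
    Ksz = rbf(Xs, Z)
    A = noise**2 * Kzz + Kzx @ Kzx.T + 1e-8 * np.eye(len(Z))  # jitter for stability
    return Ksz @ np.linalg.solve(A, Kzx @ y)

# Toy regression problem: noisy samples of sin(x).
rng = np.random.default_rng(0)
X = np.linspace(0.0, 6.0, 200)[:, None]
y = np.sin(X) + 0.05 * rng.standard_normal(X.shape)
Z = np.linspace(0.0, 6.0, 15)[:, None]   # 15 inducing inputs summarise 200 points
Xs = np.linspace(0.0, 6.0, 50)[:, None]
mu = sparse_gp_mean(Xs, X, y, Z, noise=0.05)
```

The 15 inducing inputs act as a compressed summary of the 200 observations; in the talk's framework it is exactly such summaries (together with variational posteriors over them) that are recycled and combined, rather than the raw data.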


  • References:
    • P. Moreno-Muñoz, A. Artés-Rodríguez and M. A. Álvarez. Modular Gaussian Processes for Transfer Learning. In Advances in Neural Information Processing Systems (NeurIPS), 2021.
    • P. Moreno-Muñoz, A. Artés-Rodríguez and M. A. Álvarez. Continual Multi-task Gaussian Processes. arXiv preprint arXiv:1911.00002, 2019.
  • Pablo's personal website can be found here.

