![](https://cdn.sanity.io/images/2n305zeh/production/a80a21af793b158f6ad1f391068189a777b5ac0e-4896x3264.jpg?rect=0,527,4896,2210&w=1440&h=650&q=75&fit=max&auto=format)
Dual parameterization of sparse variational Gaussian processes
Date December 6, 2021
Authors Vincent Adam, Paul Chang, Mohammad Emtiyaz Khan, Arno Solin
Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits. In this paper, we improve their computational efficiency with a dual parameterization in which each data example is assigned a pair of dual parameters, similar to the site parameters used in expectation propagation.
Our dual parameterization speeds up inference using natural gradient descent and provides a tighter evidence lower bound for hyperparameter learning. The approach has the same memory cost as current SVGP methods, but it is faster and more accurate.
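To give a feel for the dual (site) parameterization, here is a minimal NumPy sketch in the simplest setting: a full (non-sparse) GP with a Gaussian likelihood, where the per-example dual parameters are available in closed form and the dual-form posterior mean matches the standard GP posterior mean. The toy data, kernel, and variable names are illustrative only, not the paper's implementation; the paper's contribution is using this per-example parameterization inside SVGP with natural-gradient updates for non-conjugate likelihoods.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix (illustrative choice)."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Toy 1-D regression data (not from the paper)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)
noise_var = 0.1

K = rbf_kernel(x, x)

# Dual parameterization: one pair (lam1_n, lam2_n) per data example.
# For a Gaussian likelihood these "sites" have a closed form.
lam1 = y / noise_var                      # first dual parameter per example
lam2 = np.full_like(y, 1.0 / noise_var)  # second dual parameter per example

# Posterior mean in dual form: m = (K^-1 + diag(lam2))^-1 @ lam1,
# computed via the stable identity (K^-1 + L)^-1 = K (I + L K)^-1.
m_dual = K @ np.linalg.solve(np.eye(len(x)) + np.diag(lam2) @ K, lam1)

# Standard GP posterior mean for comparison: K (K + noise I)^-1 y.
m_std = K @ np.linalg.solve(K + noise_var * np.eye(len(x)), y)

print(np.allclose(m_dual, m_std))  # → True
```

In the conjugate case above the dual form simply recovers the exact posterior; for non-conjugate likelihoods the dual parameters are updated iteratively (e.g. by natural gradient descent), which is where the speed-ups described in the paper arise.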