
Tighter Bounds on the Log Marginal Likelihood of Gaussian Process Regression Using Conjugate Gradients

We propose a lower bound on the log marginal likelihood of Gaussian process regression models that can be computed without matrix factorisation of the full kernel matrix. We show that approximate maximum likelihood learning of model parameters by …
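As a rough illustration of the underlying idea (a minimal sketch, not the paper's exact bound): the quadratic term $y^\top (K + \sigma^2 I)^{-1} y$ of the GP log marginal likelihood can be computed with conjugate gradients using only matrix-vector products, so no Cholesky factorisation of the full kernel matrix is required. The sketch below uses NumPy/SciPy with an assumed RBF kernel and synthetic data; the remaining log-determinant term, which the paper's bound addresses, is not reproduced here.

import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix on inputs X of shape (N, D).
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(X**2, axis=1)[None, :]
        - 2.0 * X @ X.T
    )
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(0)
N = 500
X = rng.normal(size=(N, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)

sigma2 = 0.1  # Gaussian observation noise variance
K = rbf_kernel(X)

# CG only needs the product v -> (K + sigma2 * I) v; at large scale this
# can be evaluated lazily, without ever materialising K in memory.
A = LinearOperator((N, N), matvec=lambda v: K @ v + sigma2 * v)

v, info = cg(A, y)      # approximately solves (K + sigma2 * I) v = y
quad_term = y @ v       # y^T (K + sigma2 * I)^{-1} y
print(quad_term, info)  # info == 0 signals CG convergence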

Deep Neural Networks as Point Estimates for Deep Gaussian Processes

Speedy Performance Estimation for Neural Architecture Search

Bayesian Image Classification with Deep Convolutional Gaussian Processes

In decision-making systems, it is important to have classifiers with calibrated uncertainties and an optimisation objective that can be used for automated model selection and training. Gaussian processes (GPs) provide uncertainty estimates and …

A Bayesian Perspective on Training Speed and Model Selection

Stochastic Segmentation Networks: Modelling Spatially Correlated Aleatoric Uncertainty

Variational Gaussian Process Models without Matrix Inverses

In this work, we provide a variational lower bound that can be computed without expensive matrix operations like inversion. Our bound can be used as a drop-in replacement for the existing variational method of Hensman et al. (2013, 2015), and can …
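One standard device for removing an explicit matrix inverse, shown here as context (the truncated abstract does not spell out whether this is exactly the construction used): for positive-definite $K$ and any matrix $T$ of the same shape,

$$(T - K^{-1})^\top K \, (T - K^{-1}) \succeq 0 \quad\Longrightarrow\quad K^{-1} \succeq T + T^\top - T^\top K T,$$

with equality at $T = K^{-1}$. This means $T$ can be treated as an additional variational parameter and optimised, in place of ever computing $K^{-1}$.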

Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has either employed computationally intensive MCMC methods, or relied on factorisations of the …

Rates of Convergence for Sparse Variational Gaussian Process Regression

Excellent variational approximations to Gaussian process posteriors have been developed which avoid the $\mathcal{O}\left(N^3\right)$ scaling with dataset size $N$. They reduce the computational cost to $\mathcal{O}\left(NM^2\right)$, with $M \ll N$ the number of …
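For context, the collapsed bound of Titsias (2009) that these sparse approximations build on (stated here from the literature, not from the truncated abstract) is

$$\mathcal{L} = \log \mathcal{N}\!\left(y \mid 0,\; Q_{NN} + \sigma^2 I\right) - \frac{1}{2\sigma^2}\operatorname{tr}\!\left(K_{NN} - Q_{NN}\right), \qquad Q_{NN} = K_{NM} K_{MM}^{-1} K_{MN},$$

which requires factorising only the $M \times M$ matrix $K_{MM}$, giving the $\mathcal{O}\left(NM^2\right)$ cost quoted above.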

Bayesian Layers: A Module for Neural Network Uncertainty