
Data augmentation in Bayesian neural networks and the cold posterior effect

Learning invariant weights in neural networks

Assumptions about invariances or symmetries in data can significantly increase the predictive power of statistical models. Many commonly used machine learning models are constrained to respect certain symmetries, such as translation equivariance in …
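
As background on the kind of symmetry constraint mentioned above (an illustration, not something taken from the paper itself), a predictor can be made exactly invariant to a finite group of transformations by averaging its outputs over that group. The sketch below, in plain NumPy with a hypothetical base predictor base_predict and the group of 90-degree image rotations, shows the idea.

    import numpy as np

    def invariant_predict(base_predict, image):
        # Average the base predictor over all four 90-degree rotations of the
        # input; because these rotations form a closed group, the averaged
        # output is identical for the image and any of its rotations.
        rotations = [np.rot90(image, k) for k in range(4)]
        return np.mean([base_predict(r) for r in rotations], axis=0)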

Bayesian Neural Network Priors Revisited

Last Layer Marginal Likelihood for Invariance Learning

Correlated weights in infinite limits of deep convolutional neural networks

Infinite-width limits of deep neural networks often have tractable forms. They have been used to analyse the behaviour of finite networks, and are useful methods in their own right. When investigating infinitely wide convolutional neural …
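
For context (a standard infinite-width result, not a claim specific to this paper), a fully-connected network with i.i.d. Gaussian weights converges in the infinite-width limit to a Gaussian process whose covariance is given layer by layer by the recursion

    K^{(l+1)}(x, x') = \sigma_b^2 + \sigma_w^2 \, \mathbb{E}_{f \sim \mathcal{GP}(0, K^{(l)})}\left[ \phi(f(x)) \, \phi(f(x')) \right],

where \phi is the activation function; it is this closed-form recursion that makes the limits tractable.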

The promises and pitfalls of deep kernel learning

Deep kernel learning and related techniques promise to combine the representational power of neural networks with the reliable uncertainty estimates of Gaussian processes. One crucial aspect of these models is an expectation that, because they are …
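
For reference, the deep-kernel construction these techniques share (standard background rather than a quotation from the abstract) composes a neural feature extractor g_\theta with a base kernel,

    k_\theta(x, x') = k_{\text{base}}\left( g_\theta(x), \, g_\theta(x') \right),

so the Gaussian process computes its uncertainty in the learned feature space rather than on the raw inputs.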

Tighter Bounds on the Log Marginal Likelihood of Gaussian Process Regression Using Conjugate Gradients

We propose a lower bound on the log marginal likelihood of Gaussian process regression models that can be computed without matrix factorisation of the full kernel matrix. We show that approximate maximum likelihood learning of model parameters by …
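
For context, the quantity being bounded is the standard Gaussian process regression log marginal likelihood, with kernel matrix K, noise variance \sigma^2 and n observations y:

    \log p(\mathbf{y}) = -\tfrac{1}{2} \mathbf{y}^\top (K + \sigma^2 I)^{-1} \mathbf{y} \;-\; \tfrac{1}{2} \log\lvert K + \sigma^2 I \rvert \;-\; \tfrac{n}{2} \log 2\pi.

Both the linear solve and the log-determinant are usually obtained from a factorisation (e.g. Cholesky) of the full n-by-n matrix K + \sigma^2 I; the proposed bound avoids this by relying on conjugate-gradient iterations, which only require matrix-vector products.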

Deep Neural Networks as Point Estimates for Deep Gaussian Processes

Speedy Performance Estimation for Neural Architecture Search

Bayesian Image Classification with Deep Convolutional Gaussian Processes

In decision-making systems, it is important to have classifiers with calibrated uncertainties, together with an optimisation objective that can be used for automated model selection and training. Gaussian processes (GPs) provide uncertainty estimates and …