Mark van der Wilk

Associate Professor in Machine Learning

University of Oxford

Research

I am an Associate Professor in the Department of Computer Science at the University of Oxford, researching machine learning, and a Tutorial Fellow at Hertford College.

Together with my research group, I work on three central questions:

  • How do we find general patterns that allow generalisation beyond the training set, without humans manually encoding them? (Equivariance, causality, continual learning…)
  • How can we create neurons that automatically assemble their connectivity structure (architecture), while minimising the computational costs of the network as a whole? (Generalisation bounds, Bayesian model selection, MDL, meta-learning)
  • How do we interact with the environment, avoiding risk while learning as quickly as possible? (Bayesian optimisation, foundation models for industrial applications, e.g. chemistry)

Answers to these questions are relevant to both small-data statistics (Gaussian processes) and large-data machine learning (neural networks). Progress on them draws on a wide range of research topics, including invariance, Bayesian inference, causality, meta-learning, local learning rules and generative modelling. My work has been presented at the leading machine learning conferences (e.g. NeurIPS and ICML), and includes a best paper award.

See my Research Overview page for more details on my research interests.

Figure: Learning rotationally equivariant filter structure starting from a fully-connected structure. Note how the rotation of the filters increases as it learns the equivariant structure [paper].

Working with me

I will have space for a small number (~2) of PhD students per year over the next few years. I am looking for people with a strong academic background (in particular, strong mathematical skills) who are keen to work on topics aligned with my interests. I have written up some tips and guidelines for applying, which I recommend you read before getting in touch or submitting your application.

Academic or Industrial Collaborations

I am also interested in applied problems, and am keen to collaborate. While my research overview gives a more complete picture of topics, I want to give a special mention to problems where 1) signal needs to be distinguished from noise, 2) knowledge needs to be encoded into the model, or 3) data is scarce or needs to be acquired intelligently. Tools like (deep) Gaussian processes can make a difference here, and recent developments have provided new capabilities for dealing with higher-dimensional inputs and large datasets. Ongoing collaborations include tailored Bayesian optimisation models for biomolecular design and the optimisation of chemical processes.
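
As a toy illustration of the kind of model involved (a minimal sketch, not the setup of any specific collaboration), the snippet below fits an exact Gaussian process regression model using the GPflow library (assuming GPflow 2.x); the data, kernel choice and variable names are placeholders.

    import gpflow
    import numpy as np

    # Placeholder data: a handful of noisy measurements of an unknown function,
    # standing in for a scarce, noisy experimental dataset.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(15, 1))
    Y = np.sin(12.0 * X) + 0.2 * rng.standard_normal((15, 1))

    # Exact GP regression with a squared-exponential kernel.
    model = gpflow.models.GPR(data=(X, Y), kernel=gpflow.kernels.SquaredExponential())

    # Tune the kernel and noise hyperparameters by maximising the marginal likelihood.
    gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

    # Predictions come with uncertainty estimates: the variance helps separate signal
    # from noise, and is what a Bayesian optimisation loop would use to decide which
    # experiment to run next.
    X_new = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
    mean, var = model.predict_f(X_new)

The point of such a model is that it quantifies what it does not know, which is exactly what is needed when data is scarce or expensive to acquire.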

If you have a problem that fits these descriptions, please do get in touch. Collaborations can range from publishing case studies or datasets that can serve as community benchmarks, to consulting, to methods research.

About

Before starting at Oxford, I spent several great years at Imperial College London as a senior lecturer (associate professor). Before that, I worked in industry with Dr James Hensman for two years on data-driven decision making. I did my PhD in the Machine Learning Group at the University of Cambridge, working with Prof. Carl Rasmussen, and completing my thesis in 2017. I was funded by the EPSRC and awarded a Qualcomm Innovation Fellowship for my final year. During my PhD, I occasionally worked as a machine learning consultant, and I also spent a few months as a visiting researcher at Google in Mountain View, CA. I moved to the UK from the Netherlands for my undergraduate degree in Engineering.

Recent Publications

(2022). Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations. Advances in Neural Information Processing Systems (NeurIPS).

(2022). Memory Safe Computations with XLA Compiler. Advances in Neural Information Processing Systems (NeurIPS).

(2022). Recommendations for Baselines and Benchmarking Approximate Gaussian Processes. NeurIPS Workshop on Gaussian Processes, Spatiotemporal Modeling, and Decision-making Systems.

(2022). Relaxing Equivariance Constraints with Non-stationary Continuous Filters. Advances in Neural Information Processing Systems (NeurIPS).

(2022). SnAKe: Bayesian Optimization with Pathwise Exploration. Advances in Neural Information Processing Systems (NeurIPS).
