
Closing the gap between machine learning theory and value delivered in the real world.
About Secondmind Labs
Our origins lie in academic research, which remains the foundation on which our products are built. By combining years of machine learning research expertise with hands-on commercial experience, we identify and address our customers' needs, building an Active Learning framework to meet some of the most complex model-based design and development challenges.
The mission of Secondmind Labs is to provide the machine learning expertise and know-how that fuel our product, and to explore innovative ideas that address time pressures, growing numbers of parameters, and the software-defined car. Our main areas of expertise are probabilistic modeling and Bayesian optimization, the two foundational ingredients of Secondmind Active Learning. Our work in these areas spans a broad spectrum, from theoretical contributions to practical tools: we are known both for our strong scientific publication record and for the open-source toolboxes we make available to the world.
Our team is led by our Chief Science Officer, Carl Edward Rasmussen, Professor of Machine Learning at the University of Cambridge. Under his leadership, our team of researchers and engineers uses proven mathematical principles to build scalable tools that solve problems across a range of businesses and sectors.
Publications
To date, we have published over 70 papers in top machine learning journals and conferences. Most recently, all five papers we submitted to NeurIPS 2021 were accepted. More importantly, the quality of our work has been recognised with three best paper awards: ICML in 2019 and AISTATS in 2020 and 2021.
Open source toolboxes
Open-sourcing our own machine learning frameworks is our way of contributing back to the community.
Secondmind has been the home of the GPflow project for several years, and we have also open-sourced our Bayesian optimization framework, Trieste, and our deep Gaussian process library, GPflux.
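To give a flavour of how the Bayesian optimization pieces fit together, here is a hedged sketch of a Trieste loop wrapping a GPflow surrogate model. The toy objective, search space, and evaluation budget are invented for illustration, and module paths may differ between Trieste versions; treat this as a sketch rather than a prescribed workflow.

import tensorflow as tf
import gpflow
import trieste
from trieste.space import Box
from trieste.objectives.utils import mk_observer
from trieste.models.gpflow import GaussianProcessRegression

# Placeholder objective standing in for an expensive black-box function.
def objective(x: tf.Tensor) -> tf.Tensor:
    return tf.reduce_sum(tf.sin(3.0 * x) + x ** 2, axis=-1, keepdims=True)

search_space = Box([-1.0, -1.0], [2.0, 2.0])     # continuous 2D search space
observer = mk_observer(objective)                # wrap the objective as a Trieste observer
initial_data = observer(search_space.sample(5))  # a handful of random initial evaluations

# Wrap a GPflow GPR model so Trieste can use it as the surrogate.
gpr = gpflow.models.GPR(
    (initial_data.query_points, initial_data.observations),
    kernel=gpflow.kernels.Matern52(),
)
model = GaussianProcessRegression(gpr)

bo = trieste.bayesian_optimizer.BayesianOptimizer(observer, search_space)
result = bo.optimize(15, initial_data, model)    # 15 further model-guided evaluations
print(result.try_get_final_dataset().query_points)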
GPflow
GPflow has become the standard library for Gaussian process models in Python and TensorFlow. It covers classic GP regression models as well as modern approaches based on variational inference and MCMC.
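As a hedged sketch (not taken from the GPflow documentation), a minimal exact GP regression workflow with GPflow 2.x looks roughly like the following; the toy data and the squared-exponential kernel are placeholder choices.

import numpy as np
import gpflow

# Toy 1D regression data (placeholder).
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
Y = np.sin(12.0 * X) + 0.1 * np.random.randn(20, 1)

# Exact GP regression with a squared-exponential kernel.
model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())

# Fit kernel hyperparameters and noise variance by maximising the marginal likelihood.
gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

# Predictive mean and variance at new inputs.
Xnew = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
mean, var = model.predict_f(Xnew)

For larger datasets, the same interface extends to sparse variational models such as gpflow.models.SVGP, which is where the variational inference support mentioned above comes in.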


Latest from labs
Seminar: François-Xavier Briol - University College London

Deep neural networks as point estimates for deep Gaussian processes
Scalable Thompson Sampling using Sparse Gaussian Process Models
Marginalised Gaussian Processes with Nested Sampling
Dual parameterization of sparse variational Gaussian processes
Kernel Identification Through Transformers
Seminar: Dino Sejdinovic - University of Oxford

An empirical evaluation of active inference in multi-armed bandits
Seminar: François Bachoc - Toulouse Mathematics Institute
