Seminar: Philipp Hennig - University of Tuebingen

Date: September 24, 2020
Author: Hrvoje Stojic

Computation under Uncertainty

Abstract

Have you ever wondered why you’re not using BFGS to train your deep network, and why you don’t use conjugate gradients to do GP inference? The advent of data as a first-class citizen of computer programs has changed the nature of computation. The computations of machine learning (which are largely numerical operations like optimization, integration and linear algebra) are characterised by stochasticity and imprecision, caused both by the incompleteness of data and by its overabundance. Some of the paradigms of classic numerical computation are invalidated in this regime, along with the algorithms they engendered. In this talk I will argue that, wherever data becomes an integral part of the computational task, probabilistic reasoning has to become a central part of the computational method. This involves not just changing how we interpret and use the numbers computed by GPUs and CPUs, but also re-evaluating *which* numbers they should compute in the first place. While I will advocate for the conceptual framework of probabilistic numerics to answer such questions, I will also offer some concrete software tools that might help you build more reliable algorithms for machine learning regardless of your philosophical convictions.
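
To make the failure mode the abstract alludes to concrete, here is a minimal sketch (our illustration, not material from the talk): conjugate gradients is derived assuming exact matrix-vector products, so once each product carries stochastic error, for instance because the matrix is only observed through mini-batches or subsamples of data, its convergence guarantees no longer apply and the residual stagnates near the noise floor. The problem size and noise level below are arbitrary choices for illustration.

```python
# Toy experiment: conjugate gradients with exact vs. noisy matvecs.
# (Sketch only; sizes and the 1e-3 noise scale are arbitrary assumptions.)
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n = 200

# Symmetric positive-definite system, as in GP inference: (K + sigma^2 I) x = y
Q = rng.standard_normal((n, n))
A = Q @ Q.T / n + np.eye(n)
b = rng.standard_normal(n)

# Classic CG with exact matrix-vector products converges as advertised.
x_exact, info_exact = cg(A, b, maxiter=500)

# The same solver fed stochastically perturbed matvecs (as with mini-batched
# data): the conjugacy recursions behind CG's convergence proof break down.
noisy_A = LinearOperator(
    (n, n), matvec=lambda v: A @ v + 1e-3 * rng.standard_normal(n)
)
x_noisy, info_noisy = cg(noisy_A, b, maxiter=500)

print("residual with exact matvecs:", np.linalg.norm(A @ x_exact - b))
print("residual with noisy matvecs:", np.linalg.norm(A @ x_noisy - b))
```

A probabilistic numerical method would instead treat each noisy matrix-vector product as an observation and return a posterior distribution over the solution, quantifying the error the classic solver silently commits; the open-source ProbNum library from Hennig's group implements solvers in this spirit.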

Notes

  • Philipp Hennig is Professor for the Methods of Machine Learning at the University of Tuebingen and an adjunct scientist at the Max Planck Institute for Intelligent Systems in Tuebingen.

