Seminar: Frank Hutter - University of Freiburg

Date: November 11, 2021
Author: Hrvoje Stojic

Towards Deep Learning 2.0: Going to the Meta-Level

Abstract

Deep Learning has been incredibly successful, due to its ability to automatically learn useful representations from raw data. The next logical steps are to also (1) automatically learn the architectures and hyperparameters that allow deep learning to be successful, and (2) take objectives other than accuracy as input and automatically optimize for these. For (1), the main issue is performance, and in this talk I will discuss several speedup methods for Bayesian optimization (integrating user beliefs and multi-fidelity meta-learning) and their application to joint neural architecture search and hyperparameter optimization. For (2), I will discuss two new methods for improved uncertainty quantification: neural ensemble search and meta-learning Bayesian inference.
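To make the Bayesian optimization loop mentioned above concrete, here is a minimal self-contained sketch (not the speaker's method, and without the speedups the talk covers): a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition function, minimizing a toy 1-D stand-in for a validation loss. The kernel lengthscale, grid, and objective are all illustrative choices.

```python
import numpy as np
from math import erf, sqrt

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between two 1-D point sets.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # GP posterior mean and std at query points Xs given observations (X, y).
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimization; normal CDF computed via the error function.
    z = (best - mu) / sigma
    cdf = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in z])
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return (best - mu) * cdf + sigma * pdf

def objective(x):
    # Toy stand-in for an expensive hyperparameter evaluation.
    return np.sin(3 * x) + x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 3)        # initial random configurations
y = objective(X)
grid = np.linspace(-1, 1, 200)   # candidate configurations

for _ in range(10):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print(float(y.min()))  # best "validation loss" found
```

In a real AutoML setting each `objective` call is a full training run, which is exactly why the speedup methods in the talk (user priors, multi-fidelity evaluation, meta-learning across datasets) matter.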

Notes

  • Frank Hutter is a Professor of Computer Science at the University of Freiburg.
  • References:
    • Auto-PyTorch: Multi-Fidelity MetaLearning for Efficient and Robust AutoDL (https://ieeexplore.ieee.org/document/9382913)
    • Well-tuned Simple Nets Excel on Tabular Datasets (https://openreview.net/pdf?id=d3k38LTDCyO)
    • Neural Ensemble Search for Uncertainty Estimation and Dataset Shift (https://openreview.net/forum?id=HiYDAwAGWud)
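The neural ensemble search paper above uses ensembles for uncertainty estimation. A minimal sketch of the underlying idea (illustrative only, not the NES algorithm): each ensemble member outputs class probabilities, the ensemble averages them, and the entropy of the average serves as the uncertainty score, so disagreement between members under dataset shift raises the reported uncertainty. The member probabilities below are hypothetical.

```python
import numpy as np

def entropy(p):
    # Predictive entropy of a probability vector (natural log).
    return float(-np.sum(p * np.log(p + 1e-12)))

# Three hypothetical ensemble members agreeing on an in-distribution input...
agree = np.array([[0.90, 0.05, 0.05],
                  [0.85, 0.10, 0.05],
                  [0.92, 0.04, 0.04]])
# ...and disagreeing on a distribution-shifted input.
disagree = np.array([[0.90, 0.05, 0.05],
                     [0.10, 0.80, 0.10],
                     [0.20, 0.10, 0.70]])

h_agree = entropy(agree.mean(axis=0))
h_disagree = entropy(disagree.mean(axis=0))
print(h_agree < h_disagree)  # shift yields higher predictive uncertainty
```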
