Unbiased active learning and testing

Date:

2022/09/16

Author:

Hrvoje Stojic



Abstract

Active learning is a powerful tool when labelling data is expensive, but it introduces a bias: the training data no longer follows the population distribution. This bias can be removed with corrective weights derived from importance sampling. Two main consequences follow: first, we show that the bias is actually useful for active learning, especially with overparameterized models such as neural networks; second, the same technique enables active testing, a new way of evaluating models with limited labelled data.
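The corrective-weight idea can be illustrated with a small sketch (not the talk's exact estimator, just a standard importance-sampling correction under assumed toy data): points acquired under a biased acquisition distribution q are reweighted by p(i)/q(i), where p is the uniform pool distribution, so the weighted average of their losses is an unbiased estimate of the population risk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pool of N points with known per-point losses (stand-ins for model errors).
N = 1000
losses = rng.exponential(scale=1.0, size=N)
true_risk = losses.mean()

# Acquisition distribution q, biased towards high-loss points --
# a proxy for the "informative" examples an active learner would pick.
scores = losses + 0.1
q = scores / scores.sum()

# Label M points drawn under q, then correct with importance weights
# w_i = p(i) / q(i) = (1/N) / q_i.
M = 200
idx = rng.choice(N, size=M, replace=True, p=q)
weights = (1.0 / N) / q[idx]

unbiased_estimate = np.mean(weights * losses[idx])  # close to true_risk
naive_estimate = losses[idx].mean()  # biased: over-represents high-loss points
```

Here the naive average over actively acquired points overestimates the risk, while the importance-weighted average is unbiased by construction; the same weighting applied at training time removes the bias in the learned model, and applied at evaluation time it yields the active-testing estimate mentioned in the abstract.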


Related seminars

Linear combinations of latents in generative models: subspaces and beyond

Erik Bodin - University of Cambridge

2025/03/13


Return of the latent space cowboys: rethinking the use of VAEs in Bayesian optimisation over structured spaces

Henry Moss - University of Cambridge, Lancaster University

2025/01/21


Advancing sequential decision-making: efficient querying in clustering and best of both worlds for contextual bandits

Yuko Kuroki - CENTAI Institute

2024/10/10


AI in drug discovery - from model to process, from academic publication to decision-making

Andreas Bender - University of Cambridge

2024/09/19
