IPAB Workshop - 28/11/19

 

Konda Reddy Mopuri

 

Title: Zero-Shot Knowledge Distillation in Deep Neural Networks

 

Abstract: Knowledge distillation deals with the problem of training a smaller model (Student) from a high-capacity source model (Teacher) so as to retain most of its performance. Existing approaches use either the training data or meta-data extracted from it in order to train the Student. However, accessing the dataset on which the Teacher has been trained may not always be feasible, for instance when the dataset is very large or when it poses privacy or safety concerns (e.g., biometric or medical data). Along with other interesting directions to achieve zero-shot knowledge transfer, I will briefly discuss our recent work that uses easy-to-synthesise Data Impressions from the Teacher model.
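
For context, a minimal sketch of the standard knowledge-distillation objective the talk builds on: the Student is trained to match the Teacher's temperature-softened outputs, without ground-truth labels. This illustrates generic distillation only, not the Data Impressions synthesis itself; the function names, the temperature T, and the training-step structure are illustrative assumptions.

```python
# Sketch of a label-free distillation step (generic KD, not the paper's exact method).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between temperature-softened Teacher and Student outputs."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # T**2 keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)

def train_step(student, teacher, x, optimizer, T=4.0):
    """One distillation step: only the Teacher's soft outputs on the
    (possibly synthesised) inputs x are used, no ground-truth labels."""
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    loss = distillation_loss(student_logits, teacher_logits, T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```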

 

 

Kunkun Pang

 

Title: Dynamic Ensemble Active Learning: A Non-Stationary Bandit with Expert Advice

 

Abstract: Active learning aims to reduce annotation cost by predicting which samples are useful for a human teacher to label. However, it has become clear that there is no single best active learning algorithm: inspired by different philosophies about what constitutes a good criterion, different algorithms perform well on different datasets. This has motivated research into ensembles of active learners that learn what constitutes a good criterion in a given scenario, typically via multi-armed bandit algorithms. Although algorithm ensembles can lead to better results, they overlook the fact that algorithm efficacy varies not only across datasets but also during a single active learning session; that is, the best criterion is non-stationary. This breaks existing algorithms' guarantees and hampers their performance in practice. In this paper, we propose dynamic ensemble active learning as a more general and promising research direction. We develop a dynamic ensemble active learner based on a non-stationary multi-armed bandit with expert advice algorithm. Our dynamic ensemble selects the right criterion at each step of active learning. It has theoretical guarantees and shows encouraging results on 13 popular datasets.
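
To make the "bandit with expert advice" setup concrete, below is a generic EXP4-style sketch in which each active-learning criterion acts as an expert and the bandit reweights them from observed rewards, with a simple exponential forgetting term as one way to cope with non-stationarity. This is not the paper's exact algorithm; the reward definition, the forgetting scheme, and all names are illustrative assumptions.

```python
# Sketch: one round of an EXP4-style selector over active-learning criteria.
import numpy as np

def select_and_update(weights, advice, reward_fn, eta=0.1, gamma=0.1,
                      forget=0.99, rng=np.random):
    """weights: (n_experts,) current expert weights
    advice:  (n_experts, n_points) each criterion's query distribution
    reward_fn: maps the queried point index to an observed reward
               (e.g., validation-accuracy gain after labelling it)."""
    probs_experts = weights / weights.sum()
    n_points = advice.shape[1]
    # Mix expert advice into one query distribution, with uniform exploration.
    p = (1 - gamma) * probs_experts @ advice + gamma / n_points
    query = rng.choice(n_points, p=p)
    reward = reward_fn(query)

    # Importance-weighted reward estimate for each expert (criterion).
    est = reward * advice[:, query] / p[query]
    new_weights = weights * np.exp(eta * est)
    # Exponential forgetting: pull weights toward their mean so the
    # ensemble can track a criterion whose usefulness drifts over the session.
    new_weights = forget * new_weights + (1 - forget) * new_weights.mean()
    return query, new_weights
```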


Venue: IF, G.03