IPAB Workshop - 22/10/2020


Title: Dataset Condensation with Gradient Matching

Abstract: As state-of-the-art machine learning methods in many fields come to rely on ever larger datasets, storing these datasets and training models on them becomes increasingly expensive. We propose a training set synthesis technique for data-efficient learning, called Dataset Condensation, that learns to condense a large dataset into a small set of informative samples for training deep neural networks from scratch. We formulate this goal as a gradient matching problem between the gradients of a deep neural network trained on the original data and on our synthetic data. We rigorously evaluate its performance on several computer vision benchmarks and demonstrate that it significantly outperforms the state-of-the-art methods. Finally, we explore the use of our method in continual learning and neural architecture search, and show that it achieves promising gains on a tight budget of memory and computation.
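
For readers unfamiliar with the formulation, below is a minimal sketch of one gradient-matching update, assuming a PyTorch-style API. The function and variable names (gradient_match_step, x_syn, y_syn, syn_opt) are illustrative, and the layer-wise cosine distance is one plausible choice of matching loss, not necessarily the authors' exact implementation.

import torch
import torch.nn.functional as F

def gradient_match_step(model, x_real, y_real, x_syn, y_syn, syn_opt):
    # Gradients of the classification loss on a batch of real data;
    # detached so they act as fixed targets.
    params = [p for p in model.parameters() if p.requires_grad]
    g_real = torch.autograd.grad(
        F.cross_entropy(model(x_real), y_real), params)
    g_real = [g.detach() for g in g_real]

    # Gradients of the same network on the learnable synthetic batch;
    # create_graph=True lets us backpropagate through these gradients.
    g_syn = torch.autograd.grad(
        F.cross_entropy(model(x_syn), y_syn), params, create_graph=True)

    # Matching loss: sum of layer-wise cosine distances between the two
    # gradient sets (an illustrative distance, not the paper's exact one).
    match = sum(1 - F.cosine_similarity(a.flatten(), b.flatten(), dim=0)
                for a, b in zip(g_syn, g_real))

    # Update only the synthetic images so their gradients mimic real ones.
    syn_opt.zero_grad()
    match.backward()
    syn_opt.step()
    return float(match)

In this sketch, x_syn would be created as a learnable tensor, e.g. x_syn = torch.randn(n, 3, 32, 32, requires_grad=True) with syn_opt = torch.optim.SGD([x_syn], lr=0.1), and the full method alternates such matching updates with ordinary training of the network on the synthetic set.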


Bo Zhao

Blackboard Collaborate