A Review of Dataset Distillation for Deep Learning

15 Nov 2022 (modified: 15 Nov 2022)
Abstract: Deep learning has been applied successfully to learning tasks such as image classification, face detection, text classification, and video recognition. However, deep learning models must be trained on vast amounts of data, often with carefully extracted features, to perform well on these tasks, which incurs high computational and resource costs. Dataset distillation is a promising concept for addressing this challenge and has attracted growing research interest in recent years. This paper reviews current approaches to applying dataset distillation in deep learning. In particular, we first describe several dataset distillation methods for producing distilled datasets. We then summarize dataset distillation-based solutions to deep learning tasks presented at major machine learning conferences in recent years, and discuss the strengths of each case. Our review provides insight into the latest uses of dataset distillation across several deep learning tasks.
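
For readers unfamiliar with the technique, the sketch below illustrates the classic bilevel formulation of dataset distillation: a small synthetic set is optimized so that a model trained briefly on it performs well on the real data. This is a minimal, illustrative example; the toy data, the linear model, and all hyperparameters are assumptions made here for clarity and are not drawn from this paper or from any specific method it reviews.

```python
# Minimal sketch of the bilevel dataset-distillation idea:
# learn a tiny synthetic dataset such that a model trained on it
# does well on the real data. Illustrative only; not from the paper.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy "real" data: 1000 points, 20 features, 2 classes.
X_real = torch.randn(1000, 20)
y_real = (X_real[:, 0] > 0).long()

# Distilled dataset: only 10 synthetic examples, optimized directly.
X_syn = torch.randn(10, 20, requires_grad=True)
y_syn = torch.tensor([0, 1] * 5)                 # fixed, balanced labels
syn_lr = torch.tensor(0.1, requires_grad=True)   # learnable inner-loop step size

outer_opt = torch.optim.Adam([X_syn, syn_lr], lr=0.01)

for step in range(200):
    # Inner loop: take one gradient step of a fresh linear model on the synthetic set.
    w = torch.zeros(20, 2, requires_grad=True)
    inner_loss = F.cross_entropy(X_syn @ w, y_syn)
    grad_w, = torch.autograd.grad(inner_loss, w, create_graph=True)
    w_updated = w - syn_lr * grad_w

    # Outer loop: evaluate the updated model on the real data and
    # backpropagate through the inner update into the synthetic examples.
    outer_loss = F.cross_entropy(X_real @ w_updated, y_real)
    outer_opt.zero_grad()
    outer_loss.backward()
    outer_opt.step()

print("real-data loss after training on 10 distilled points:", float(outer_loss))
```

Practical methods replace the toy linear model with neural networks, unroll several inner training steps, or match gradients or feature distributions instead, but the nested optimization structure sketched here is the common starting point.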