Diversity-Driven Synthesis: Enhancing Dataset Distillation through Directed Weight Adjustment

Published: 25 Sept 2024 · Last Modified: 19 Nov 2024 · NeurIPS 2024 (spotlight) · License: CC BY-NC-ND 4.0
Keywords: Dataset Distillation, Synthetic Data, Diversity, Generalization
Abstract: The sharp increase in data-related expenses has motivated research into condensing datasets while retaining their most informative features; dataset distillation has thus recently come to the fore. This paradigm generates synthetic datasets that are representative enough to replace the original dataset when training a neural network. To avoid redundancy in these synthetic datasets, it is crucial that each element contains unique features and remains distinct from the others during the synthesis stage. In this paper, we provide a thorough theoretical and empirical analysis of diversity within synthesized datasets. We argue that enhancing diversity can improve the parallelizable yet isolated approach to synthesis. Specifically, we introduce a novel method that employs dynamic and directed weight adjustment to modulate the synthesis process, thereby maximizing the representativeness and diversity of each synthetic instance. Our method ensures that each batch of synthetic data mirrors the characteristics of a large, varying subset of the original dataset. Extensive experiments across multiple datasets, including CIFAR, Tiny-ImageNet, and ImageNet-1K, demonstrate the superior performance of our method, highlighting its effectiveness in producing diverse and representative synthetic datasets with minimal computational expense. Our code is available at https://github.com/AngusDujw/Diversity-Driven-Synthesis.
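The core idea in the abstract can be illustrated with a toy sketch: perturb the weights of a feature extractor between synthesis rounds so that each synthetic batch is matched against a slightly different "view" of the data, encouraging diversity across batches. This is a minimal illustration only, not the authors' implementation: the perturbation here is plain random noise normalized to a fixed radius, whereas the paper's "directed" weight adjustment chooses the direction more carefully, and the toy objective below is simple mean-feature matching with a linear extractor.

```python
# Toy sketch of diversity via weight perturbation between synthesis rounds.
# Assumptions (not from the paper): a linear feature extractor, random-noise
# perturbation directions, and a mean-feature-matching objective.
import numpy as np

rng = np.random.default_rng(0)

D, K = 8, 4                      # input dim, feature dim
W = rng.standard_normal((K, D))  # toy linear feature extractor

real = rng.standard_normal((256, D))  # stand-in for the real dataset

def synthesize_batch(W_pert, subset, steps=200, lr=0.1):
    """Gradient-match a small synthetic batch to the subset's mean features
    under the perturbed extractor W_pert."""
    target = (subset @ W_pert.T).mean(axis=0)  # mean features of the subset
    syn = rng.standard_normal((4, D))          # synthetic batch initialization
    for _ in range(steps):
        diff = (syn @ W_pert.T).mean(axis=0) - target
        # gradient of 0.5 * ||mean(W_pert @ x) - target||^2 w.r.t. each row
        grad = np.tile(diff @ W_pert, (syn.shape[0], 1)) / syn.shape[0]
        syn -= lr * grad
    return syn

batches = []
for _ in range(3):
    # perturb the extractor weights along a (here: random) direction,
    # so each batch sees a different effective model
    noise = rng.standard_normal(W.shape)
    W_pert = W + 0.1 * noise / np.linalg.norm(noise)
    # each batch also matches a different random subset of the real data
    subset = real[rng.choice(len(real), 64, replace=False)]
    batches.append(synthesize_batch(W_pert, subset))

syn_data = np.concatenate(batches)
print(syn_data.shape)  # (12, 8): 3 batches of 4 synthetic samples
```

Because each round uses both a perturbed extractor and a fresh data subset, the resulting batches are matched to different targets and therefore differ from one another, which is the diversity effect the abstract describes.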
Primary Area: Optimization (convex and non-convex, discrete, stochastic, robust)
Submission Number: 1946