SyNC: Balancing Fidelity and Diversity of Synthetic Data Representations in CLIP-based Few-Shot Learning via Neural Collapse

15 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: few-shot learning, synthetic data, neural collapse
TL;DR: We propose a novel training algorithm for few-shot learning with synthetic data, balancing the model representation fidelity and diversity.
Abstract: In few-shot learning, augmenting real data with images synthesized by text-to-image diffusion models has emerged as a promising direction. Although numerous methods have been proposed to improve the performance of this training framework, they often fail to adequately address the critical trade-off between fidelity and diversity when training with synthetic data. In this work, we propose SyNC, a novel training paradigm that explicitly balances these characteristics in the feature space through two complementary mechanisms. First, we leverage an optimal geometric prototype structure built upon the Neural Collapse phenomenon to increase fidelity, guiding the representations of both real and synthetic data toward their corresponding equiangular tight frame (ETF) prototypes. Second, we introduce a regional contrastive loss designed to enhance diversity by improving the separation of misclassified synthetic data features, thereby encouraging more varied and robust representations. Extensive experiments demonstrate the effectiveness of the proposed method, which outperforms state-of-the-art approaches on average across few-shot image classification benchmarks and yields significant improvements on fine-grained datasets. Further analysis shows that our method achieves a more favorable balance between representation fidelity and diversity, revealing a correlation between these factors and overall model performance.
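The abstract refers to equiangular tight frame (ETF) prototypes from the Neural Collapse literature. As background (not the authors' code), a minimal sketch of the standard simplex ETF construction: for K classes embedded in a d-dimensional feature space (d >= K), the prototypes are unit-norm vectors whose pairwise cosine similarity is exactly -1/(K-1), the maximally separated configuration.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Return a (feat_dim x num_classes) matrix whose columns form a simplex ETF.

    Columns are unit-norm and every pair has inner product -1/(num_classes-1).
    Requires feat_dim >= num_classes. The rotation U is arbitrary; here it is
    drawn at random and orthonormalized via QR.
    """
    K = num_classes
    rng = np.random.default_rng(seed)
    # Orthonormal basis U of shape (feat_dim, K) via reduced QR.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, K)))
    # Classic simplex-ETF formula: sqrt(K/(K-1)) * U @ (I - (1/K) 11^T).
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M

# Example: 5 class prototypes in a 16-dimensional feature space.
M = simplex_etf(num_classes=5, feat_dim=16)
G = M.T @ M  # Gram matrix: 1 on the diagonal, -1/4 off the diagonal.
```

During training, class logits would then be computed against these fixed prototypes (e.g. cosine similarity between a sample's feature and each column of M), so that representations of each class are pulled toward a maximally separated target.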
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 5689