Keywords: Imbalanced Learning, Small-Data Learning, Curriculum Learning, Bilevel Optimization, Regret Minimization, Robust Generalization
Abstract: Imbalanced and small-data regimes are pervasive in domains such as rare disease imaging, genomics, and disaster response, where labeled samples are scarce and naive augmentation often introduces artifacts. Existing solutions, such as over-sampling, focal loss, or meta-weighting, address isolated aspects of this challenge but remain fragile or complex. We introduce FOSSIL (Flexible Optimization via Sample-Sensitive Importance Learning), a unified weighting framework that integrates class-imbalance correction, difficulty-aware curricula, augmentation penalties, and warmup dynamics into a single interpretable formula. Unlike prior heuristics, FOSSIL provides regret-based theoretical guarantees and achieves consistent empirical gains over ERM, curriculum, and meta-weighting baselines on synthetic and real-world datasets, while requiring no architectural changes.
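To make the abstract's claim of "a single interpretable formula" concrete, here is a minimal illustrative sketch of how a per-sample weight could combine the four named factors multiplicatively, with a warmup ramp from uniform weighting. The function name, parameters (`beta`, `gamma`, `lam`), and the specific forms of each term are hypothetical assumptions for illustration, not the paper's actual formula, which is not given in this section.

```python
import numpy as np

def fossil_weight(loss_i, class_count, aug_strength, step, warmup_steps,
                  beta=0.999, gamma=1.0, lam=0.5):
    """Hypothetical per-sample weight combining the four factors the
    abstract names. Illustrative only; not the paper's formula."""
    # Class-imbalance correction via the effective number of samples
    # (Cui et al., 2019): samples from rarer classes get larger weights.
    class_term = (1.0 - beta) / (1.0 - beta ** class_count)
    # Difficulty-aware curriculum: up-weight harder (higher-loss) samples,
    # in the spirit of focal weighting.
    difficulty_term = (1.0 - np.exp(-loss_i)) ** gamma
    # Augmentation penalty: discount heavily augmented samples, which may
    # carry artifacts in small-data regimes.
    aug_term = np.exp(-lam * aug_strength)
    # Warmup dynamics: blend from uniform weighting toward the full
    # weighting over the first warmup_steps optimization steps.
    ramp = min(1.0, step / max(1, warmup_steps))
    return (1.0 - ramp) * 1.0 + ramp * class_term * difficulty_term * aug_term
```

Under this kind of scheme the weight multiplies each sample's loss in an otherwise unchanged training loop, which is consistent with the abstract's claim of requiring no architectural changes.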
Primary Area: optimization
Submission Number: 19777