Reproduction of Baselines on Label-Distribution-Aware Margin Loss and Deferred Reweighting Schedule

29 Dec 2019 (modified: 05 May 2023) · NeurIPS 2019 Reproducibility Challenge Blind Report · Readers: Everyone
Abstract: Most state-of-the-art classifiers assume a relatively balanced class distribution and equal misclassification costs. Training on imbalanced data therefore typically yields poor attainable results. Although much previous work has proposed strategies to tackle this issue, these techniques usually come with drawbacks of their own and the gains remain limited. Cao et al. introduced two new techniques, label-distribution-aware margin loss (LDAM) and a deferred re-weighting (DRW) schedule [1], which are claimed to achieve better performance than existing techniques. In this work, we reproduced the baseline experiments reported in the authors' work on the IMDB and CIFAR-10 benchmarks. We performed extensive hyper-parameter tuning on these models and outperformed the originally reported results. We also propose a general scheme for baseline improvement with learning-rate step decay and the triangular policy [2]. Based on the improved results, we study how different techniques affect performance when learning from imbalanced data (Section 6.3.4), including class-balanced re-weighting [3], class-balanced re-sampling [3], and borderline-SMOTE [4].
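For readers unfamiliar with the two techniques being reproduced, the following is a minimal PyTorch sketch of the LDAM loss and the DRW schedule as described in [1]. Parameter names such as `cls_num_list`, `max_m`, `s`, `defer_epoch`, and `beta` are illustrative assumptions, not the report's exact code.

```python
# A minimal sketch of LDAM and DRW, assuming a PyTorch setup.
import numpy as np
import torch
import torch.nn.functional as F

class LDAMLoss(torch.nn.Module):
    """Label-distribution-aware margin loss [1]: cross-entropy where the
    true-class logit is reduced by a margin proportional to n_j^{-1/4}."""
    def __init__(self, cls_num_list, max_m=0.5, s=30.0):
        super().__init__()
        m = 1.0 / np.sqrt(np.sqrt(np.asarray(cls_num_list, dtype=np.float64)))
        m = m * (max_m / m.max())  # rescale so the largest margin is max_m
        self.register_buffer("margins", torch.tensor(m, dtype=torch.float32))
        self.s = s  # logit scaling factor

    def forward(self, logits, target, weight=None):
        # Subtract the class-dependent margin from the true-class logit only.
        one_hot = F.one_hot(target, num_classes=logits.size(1)).float()
        adjusted = logits - one_hot * self.margins[target].unsqueeze(1)
        return F.cross_entropy(self.s * adjusted, target, weight=weight)

def drw_weights(cls_num_list, epoch, defer_epoch=160, beta=0.9999):
    """Deferred re-weighting (DRW) [1]: train with uniform weights first,
    then switch to the class-balanced weights of Cui et al. [3]."""
    if epoch < defer_epoch:
        return None  # uniform weighting during the first training stage
    n = np.asarray(cls_num_list, dtype=np.float64)
    w = (1.0 - beta) / (1.0 - np.power(beta, n))  # effective-number weights
    w = w / w.sum() * len(n)  # normalize so weights average to 1
    return torch.tensor(w, dtype=torch.float32)
```

In this sketch, the per-epoch weights returned by `drw_weights` would be passed as the `weight` argument of `LDAMLoss.forward` once the deferred stage begins.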
Track: Baseline
NeurIPS Paper Id: https://openreview.net/forum?id=HyMHHVSxLS