Cluster-based Warm-Start Nets

Anonymous

05 Jan 2018 (modified: 25 Jan 2018) · ICLR 2018 Conference Withdrawn Submission · Readers: Everyone
Abstract: Theories in cognitive psychology postulate that humans use similarity as a basis for object categorization. However, work in image classification generally assumes disjoint and equally dissimilar classes to achieve super-human levels of performance on certain datasets. In our work, we adapt notions of similarity using weak labels over multiple hierarchical levels to boost classification performance. Instead of pitting clustering directly against classification, we use a warm-start based evaluation to explicitly value a clustering representation by its ability to aid classification. We evaluate on CIFAR10 and a fine-grained classification dataset, and show improvements in performance from the procedural addition of intermediate losses and weak labels based on multiple hierarchy levels. Furthermore, we show that pretraining AlexNet on hierarchical weak labels in conjunction with intermediate losses outperforms a classification baseline by over 17% on a subset of the Birdsnap dataset. Finally, we show improvement over AlexNet trained using ImageNet pre-trained weights as initialization, which further supports our claim about the importance of similarity.
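The abstract describes the recipe only at a high level. The sketch below is one plausible reading of it, not the paper's implementation: an AlexNet trunk with an intermediate head supervised by coarse hierarchical weak labels, a final fine-grained head, and a weighted joint loss used to warm-start classification. The split point in the trunk, the class counts, and the loss weight alpha are all assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import alexnet

NUM_COARSE = 5    # hypothetical coarse hierarchy level (e.g. bird families)
NUM_FINE = 100    # hypothetical fine-grained classes (e.g. bird species)

class WarmStartAlexNet(nn.Module):
    """AlexNet trunk with an intermediate coarse-label head and a final fine head."""
    def __init__(self, num_coarse=NUM_COARSE, num_fine=NUM_FINE):
        super().__init__()
        base = alexnet(weights=None)
        # Split the conv trunk so an intermediate loss can attach partway up;
        # the split index is an assumption, not the paper's choice.
        self.early = base.features[:6]   # through the second conv block (192 channels)
        self.late = base.features[6:]    # remaining conv layers (256 channels)
        self.coarse_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(192, num_coarse))
        self.pool = nn.AdaptiveAvgPool2d((6, 6))
        self.fine_head = nn.Sequential(
            nn.Flatten(), nn.Linear(256 * 6 * 6, num_fine))

    def forward(self, x):
        h_early = self.early(x)
        coarse_logits = self.coarse_head(h_early)   # intermediate (weak-label) output
        fine_logits = self.fine_head(self.pool(self.late(h_early)))
        return coarse_logits, fine_logits

def joint_loss(coarse_logits, fine_logits, coarse_y, fine_y, alpha=0.5):
    """Weighted sum of the intermediate (coarse, weak) and final (fine) losses.
    The 0.5 weighting is an illustrative default, not the paper's setting."""
    return (alpha * nn.functional.cross_entropy(coarse_logits, coarse_y)
            + (1 - alpha) * nn.functional.cross_entropy(fine_logits, fine_y))

# Warm-start usage: emphasize the coarse weak labels early (alpha near 1),
# then anneal alpha toward 0 to fine-tune the same weights on the fine task.
model = WarmStartAlexNet()
opt = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
x = torch.randn(2, 3, 224, 224)
coarse_y = torch.randint(0, NUM_COARSE, (2,))
fine_y = torch.randint(0, NUM_FINE, (2,))
loss = joint_loss(*model(x), coarse_y, fine_y, alpha=0.9)
loss.backward()
opt.step()
```

The point of the intermediate head is that gradients from the coarse weak labels shape the lower conv layers toward similarity-preserving features before the fine-grained objective takes over, which is the warm-start effect the abstract claims.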
TL;DR: Cluster before you classify: using weak labels to improve classification
Withdrawal: Confirmed
Keywords: hierarchical labels, weak labels, pairwise constraints, clustering, classification