Cluster-based Warm-Start Nets
Nov 07, 2017 (modified: Nov 07, 2017) · ICLR 2018 Conference Blind Submission
Abstract: Theories in cognitive psychology postulate that humans use similarity as a basis for object categorization. Work in image classification, however, generally assumes disjoint and equally dissimilar classes in order to achieve super-human performance on certain datasets. In our work, we adapt notions of similarity, using weak labels over multiple hierarchical levels, to boost classification performance. Instead of pitting clustering directly against classification, we use a warm-start-based evaluation that explicitly values a clustering representation by its ability to aid classification. We evaluate on CIFAR10 and a fine-grained classification dataset, showing performance improvements from the procedural addition of intermediate losses and weak labels drawn from multiple hierarchy levels. Furthermore, we show that pretraining AlexNet on hierarchical weak labels in conjunction with intermediate losses outperforms a classification baseline by over 17% on a subset of the Birdsnap dataset. Finally, we show improvement over AlexNet trained from ImageNet pre-trained weights as initializations, which further supports our claim about the importance of similarity.
TL;DR: Cluster before you classify; using weak labels to improve classification
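The cluster-then-classify warm start described in the abstract can be illustrated with a minimal sketch. This is not the paper's AlexNet pipeline: it substitutes softmax regression on synthetic 2D data, with the coarse hierarchy level standing in for the weak (cluster) labels, purely to show the two-stage warm-start mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 4 fine-grained classes grouped into 2 coarse clusters,
# mimicking a two-level hierarchy (coarse id = weak label).
centers = np.array([[-4.0, -4.0], [-4.0, -2.0], [4.0, 2.0], [4.0, 4.0]])
X = np.concatenate([c + rng.normal(0, 0.5, (50, 2)) for c in centers])
y_fine = np.repeat(np.arange(4), 50)   # fine labels: 0..3
y_coarse = y_fine // 2                 # weak labels: 0..1 (higher hierarchy level)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, n_classes, W=None, b=None, lr=0.1, steps=300):
    """Plain softmax regression via gradient descent; W, b allow warm starts."""
    if W is None:
        W = np.zeros((X.shape[1], n_classes))
        b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]
    for _ in range(steps):
        P = softmax(X @ W + b)
        G = (P - Y) / len(X)
        W -= lr * X.T @ G
        b -= lr * G.sum(axis=0)
    return W, b

# Stage 1: pretrain on the coarse (weak) labels.
Wc, bc = train(X, y_coarse, 2)

# Stage 2: warm-start the fine classifier by duplicating each coarse
# weight column for its two child classes, then fine-tune on fine labels.
W0 = np.repeat(Wc, 2, axis=1)
b0 = np.repeat(bc, 2)
Wf, bf = train(X, y_fine, 4, W=W0, b=b0)

acc = (softmax(X @ Wf + bf).argmax(axis=1) == y_fine).mean()
print(f"fine-grained accuracy after warm start: {acc:.2f}")
```

In the paper's setting the warm start would instead carry over convolutional features learned under the weak-label and intermediate losses; the column-duplication step here is just one simple way to map parent-class parameters onto child classes before fine-tuning.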