Assessing The Importance Of Colours For CNNs In Object Recognition

Published: 03 Nov 2020, Last Modified: 22 Oct 2023
Venue: SVRHM@NeurIPS Poster
Keywords: Image classification, colour bias, incongruent evaluation
TL;DR: Assessing the importance of colours for CNNs through experiments motivated by the Stroop effect.
Abstract: Humans rely heavily on shape as a primary cue for object recognition; colours and textures serve as beneficial secondary cues. Convolutional neural networks (CNNs), loosely inspired by biological neural networks, have been shown to exhibit conflicting properties: some studies indicate that CNNs are biased towards textures, whereas others suggest a shape bias for classification tasks. However, these studies do not discuss the role of colours, implying that colour plays only a humble role in object recognition. In this paper, we empirically investigate the importance of colours in object recognition for CNNs. We demonstrate that CNNs often rely heavily on colour information when making predictions. Our results show that the degree of colour dependency varies from one dataset to another. Moreover, networks rely more on colours when trained from scratch; pre-training can make a model less colour dependent. However, forcing a model to rely less on colours through data augmentation can negatively affect its overall accuracy. To support these findings, we follow the framework often deployed for understanding the role of colours in human object recognition: we evaluate a model trained on congruent images (images in their natural colours, e.g. red strawberries) against congruent, greyscale, and incongruent images (images in unnatural colours, e.g. blue strawberries), and measure and analyse the network's predictive performance (top-1 accuracy) under these different stylisations. Our experiments use standard datasets for supervised image classification (CIFAR-100, STL-10, and Tiny ImageNet) and fine-grained image classification (CUB-200-2011, Oxford-Flowers, and Oxford-IIIT Pets).
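The congruent/greyscale/incongruent evaluation protocol described above can be sketched as simple image transforms. The paper does not specify its exact stylisation pipeline; the snippet below is an illustrative stand-in, assuming images are RGB arrays in [0, 1], using a standard luminance formula for greyscale and a channel swap as one possible "unnatural colour" transform.

```python
import numpy as np

def to_greyscale(img):
    """Replicate the luminance of an RGB image (H, W, 3) across all three
    channels, so the model still receives a 3-channel input."""
    grey = img @ np.array([0.299, 0.587, 0.114])  # ITU-R BT.601 luminance
    return np.repeat(grey[..., None], 3, axis=-1)

def to_incongruent(img):
    """One simple way to produce unnatural colours: swap the red and blue
    channels, so e.g. a red strawberry becomes blue. This is a hypothetical
    stand-in for whatever colour transform the paper actually uses."""
    return img[..., ::-1]
```

A trained model's top-1 accuracy would then be compared across the original images and these two stylised versions; a large drop on greyscale or incongruent inputs indicates colour dependency.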
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2012.06917/code)