Share or Split: which is more efficient?

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission · Readers: Everyone
Abstract: In this paper, we investigate two different feature representations in convolutional neural networks (CNNs): (a) Share - all classes share the same feature space, and a fully connected layer is used to decode class activations, and (b) Split - each class has its own feature space, and a class is activated if its corresponding feature vector has a large norm. This design is inspired by Capsules (\cite{Sabour17capsules}), which split the feature space. We compare the two representations on a transformed MNIST dataset that applies random scaling and translation to the original digits. The experimental results show that Split performs better when data is limited, while Share performs better when data is abundant.
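A minimal sketch of the two heads described in the abstract, sitting on top of some shared convolutional trunk (not shown). This is not the authors' code; names such as `feat_dim`, `n_classes`, and `per_class_dim` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ShareHead(nn.Module):
    """Share: all classes use one feature vector; a fully connected layer decodes class activations."""
    def __init__(self, feat_dim, n_classes):
        super().__init__()
        self.fc = nn.Linear(feat_dim, n_classes)

    def forward(self, features):            # features: (batch, feat_dim)
        return self.fc(features)            # class logits: (batch, n_classes)

class SplitHead(nn.Module):
    """Split: each class gets its own feature vector; a class is active when its vector's norm is large."""
    def __init__(self, feat_dim, n_classes, per_class_dim):
        super().__init__()
        self.proj = nn.Linear(feat_dim, n_classes * per_class_dim)
        self.n_classes = n_classes
        self.per_class_dim = per_class_dim

    def forward(self, features):            # features: (batch, feat_dim)
        v = self.proj(features).view(-1, self.n_classes, self.per_class_dim)
        return v.norm(dim=-1)               # per-class activation = vector norm, (batch, n_classes)
```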