Exploiting model capacity by constraining within-batch features to be orthogonal

10 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission
Abstract: Deep networks have been shown to benefit greatly from large model capacity when trained with recent deep learning techniques. At the same time, however, the features in such large-capacity networks can become redundant. In this work, we propose a new regularization method that exploits the given network capacity more effectively. By simultaneously minimizing the redundancy among in-layer filters and the correlation between in-batch features, we achieve better performance with the same network architecture. Experiments on CIFAR-10/100 show that constraining the in-layer filters to be orthonormal and the in-batch features to be orthogonal at the same time helps utilize the model capacity efficiently.
TL;DR: Reduce redundancy between features by concurrently controlling the correlations of filters and of their features.
Keywords: regularization
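
The abstract does not spell out the exact objective, so the following is only a minimal PyTorch sketch of the two penalties it describes. The function names, the L2 normalization of features, and the reduction choices are assumptions for illustration, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def filter_orthonormality_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Penalize deviation of a layer's filters from an orthonormal set.

    `weight` is a conv weight of shape (out_ch, in_ch, kH, kW) or a
    linear weight of shape (out_features, in_features).
    """
    W = weight.reshape(weight.size(0), -1)          # one filter per row
    gram = W @ W.t()                                # pairwise filter inner products
    eye = torch.eye(gram.size(0), device=W.device)
    return ((gram - eye) ** 2).sum()                # zero iff rows are orthonormal

def batch_feature_orthogonality_penalty(features: torch.Tensor) -> torch.Tensor:
    """Penalize correlation between features of different samples in a batch.

    `features` has shape (batch, dim); rows are L2-normalized so only the
    directions (correlations), not the magnitudes, are constrained.
    """
    Z = F.normalize(features, dim=1)
    gram = Z @ Z.t()                                # pairwise cosine similarities
    off_diag = gram - torch.diag(torch.diag(gram))  # ignore self-similarity
    n = Z.size(0)
    return (off_diag ** 2).sum() / (n * (n - 1))
```

In training, both terms would presumably be added to the task loss with tunable weights, e.g. `loss = ce_loss + lambda_w * filter_orthonormality_penalty(conv.weight) + lambda_f * batch_feature_orthogonality_penalty(feats)`, where the two lambdas are hypothetical hyperparameters.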