Differentiable Channel Selection for Self-Attention

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission
Keywords: Differentiable Channel Selection, Self-Attention, Differentiable Neural Architecture Search, Re-IDentification, Object Detection, Image Classification
TL;DR: We propose Differentiable Channel Selection (DCS) which searches for informative channels so as to compute semantic attention weights in a self-attention module.
Abstract: Self-attention has been widely used in deep learning, and recent efforts have been devoted to incorporating self-attention modules into convolutional neural networks for computer vision. In this paper, we propose a novel attention module termed Differentiable Channel Selection (DCS). In contrast with conventional self-attention, DCS searches for the locations and the number of key channels in a continuous space by a novel differentiable searching method. Our DCS module is compatible with either a fixed neural network backbone or a learnable backbone obtained by Differentiable Neural Architecture Search (DNAS), leading to DCS with Fixed Backbone (DCS-FB) and DCS-DNAS, respectively. We apply DCS-FB and DCS-DNAS to three computer vision tasks, person Re-IDentification (Re-ID), object detection, and image classification, achieving state-of-the-art results on standard benchmarks with more compact architectures than competing methods, revealing the advantage of DCS.
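The abstract describes relaxing discrete channel selection into a continuous, differentiable search so that informative channels can drive attention weights. A minimal sketch of that general idea is below, assuming a simple SE-style channel attention with a learnable soft selection mask; the class name, temperature parameter, and attention form are all hypothetical illustrations, not the paper's actual DCS formulation.

```python
import torch
import torch.nn as nn


class SoftChannelSelect(nn.Module):
    """Illustrative sketch of differentiable channel selection:
    learnable per-channel logits are relaxed into a soft mask in [0, 1],
    so which channels contribute to the attention statistics can be
    learned by gradient descent rather than searched discretely.
    (Hypothetical example, not the paper's DCS module.)"""

    def __init__(self, num_channels: int, temperature: float = 1.0):
        super().__init__()
        # One selection logit per channel; sigmoid(logit) ~ "keep" probability.
        self.logits = nn.Parameter(torch.zeros(num_channels))
        self.temperature = temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        # Soft, differentiable channel-selection mask.
        mask = torch.sigmoid(self.logits / self.temperature)
        selected = x * mask.view(1, -1, 1, 1)
        # Channel attention computed from globally pooled, selected features.
        weights = torch.softmax(selected.mean(dim=(2, 3)), dim=1)
        return x * weights.unsqueeze(-1).unsqueeze(-1)


module = SoftChannelSelect(num_channels=8)
out = module(torch.randn(2, 8, 4, 4))
print(out.shape)  # output keeps the input shape: (2, 8, 4, 4)
```

At low temperature the sigmoid mask saturates toward 0/1, approximating a hard channel subset, which is the usual trick for making a discrete selection trainable end to end.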
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning
5 Replies