Context-Gated Convolution

25 Sept 2019 (modified: 22 Oct 2023) · ICLR 2020 Conference Withdrawn Submission
TL;DR: A novel Context-Gated Convolution that incorporates global context information into CNNs by explicitly modulating convolution kernels, and thus captures more representative local patterns and extracts more discriminative features.
Abstract: As the basic building block of Convolutional Neural Networks (CNNs), the convolutional layer is designed to extract local patterns and by nature lacks the ability to model global context. Many efforts have recently been made to complement CNNs with global modeling ability, especially by a family of works on global feature interaction. In these works, global context information is incorporated into local features before they are fed into convolutional layers. However, research in neuroscience reveals that, besides influences that change the inputs to our neurons, the neurons' ability to modify their functions dynamically according to context is essential for perceptual tasks; this ability has been overlooked in most CNNs. Motivated by this, we propose a novel Context-Gated Convolution (CGC) that explicitly modifies the weights of convolutional layers adaptively under the guidance of global context. As such, being aware of the global context, the modulated convolution kernel of our proposed CGC can better extract representative local patterns and compose discriminative features. Moreover, our proposed CGC is lightweight, compatible with modern CNN architectures, and consistently improves the performance of CNNs in extensive experiments on image classification, action recognition, and machine translation.
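
Below is a minimal sketch of the idea described in the abstract, assuming a PyTorch-style implementation. The module name `SimpleContextGatedConv2d`, the average-pooling choice, the reduction ratio, and the gating MLP are illustrative assumptions and not the paper's exact CGC module (which uses a more lightweight decomposition to generate the gates): global context is pooled from the input and turned into a per-sample gate that modulates the shared convolution kernel before it is applied.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleContextGatedConv2d(nn.Module):
    """Illustrative sketch: gate convolution weights with pooled global context.

    Not the paper's exact CGC (which decomposes the gate generation to stay
    lightweight); this naive version produces a full (C_out x C_in) gate per
    sample to show the core idea of context-dependent kernel modulation.
    """

    def __init__(self, in_channels, out_channels, kernel_size, reduction=16):
        super().__init__()
        assert kernel_size % 2 == 1, "odd kernel size keeps spatial dims"
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2, bias=False)
        hidden = max(in_channels // reduction, 4)
        # Context encoder: pooled input channels -> sigmoid gate over weights.
        self.context = nn.Sequential(
            nn.Linear(in_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, out_channels * in_channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        o, i, kh, kw = self.conv.weight.shape
        ctx = x.mean(dim=(2, 3))                       # (B, C_in): global context
        gate = self.context(ctx).view(b, o, i, 1, 1)   # (B, C_out, C_in, 1, 1)
        # Modulate the shared kernel per sample, guided by global context.
        weight = self.conv.weight.unsqueeze(0) * gate  # (B, C_out, C_in, kh, kw)
        # Grouped-convolution trick: apply a different kernel to each sample.
        out = F.conv2d(x.reshape(1, b * c, h, w),
                       weight.reshape(b * o, i, kh, kw),
                       padding=kh // 2, groups=b)
        return out.reshape(b, o, h, w)
```

For example, `SimpleContextGatedConv2d(64, 128, 3)(torch.randn(2, 64, 32, 32))` returns a tensor of shape `(2, 128, 32, 32)`, with each sample convolved by its own context-modulated kernel.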
Keywords: Convolutional Neural Network, Context-Gated Convolution, Global Context Information
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:1910.05577/code)