Neural Operator Search

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
TL;DR: We propose Neural Operator Search (NOS), which incorporates additional operators into a NAS search space, enabling the discovery of more advanced architectures with feature self-calibration.
Abstract: Existing neural architecture search (NAS) methods explore a limited, feature-transformation-only search space while ignoring other advanced feature operations such as feature self-calibration by attention and dynamic convolutions. This prevents NAS algorithms from discovering more advanced network architectures. We address this limitation by additionally exploiting feature self-calibration operations, resulting in a heterogeneous search space. To solve the challenges of operation heterogeneity and a significantly larger search space, we formulate a neural operator search (NOS) method. NOS presents a novel heterogeneous residual block for integrating the heterogeneous operations in a unified structure, and an attention-guided search strategy for facilitating the search process over a vast space. Extensive experiments show that NOS can search novel cell architectures with highly competitive performance on the CIFAR and ImageNet benchmarks.
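To make the idea of a heterogeneous residual block concrete, below is a minimal PyTorch sketch (not the authors' code): a standard feature-transformation operation combined with a feature self-calibration operation, here an SE-style channel attention, merged through a residual connection. The class name, the choice of attention, and the exact fusion scheme are illustrative assumptions.

```python
# Minimal sketch of a "heterogeneous residual block": a feature
# transformation plus a self-calibration branch. The specific design
# (SE-style attention, multiplicative fusion) is an assumption, not
# the paper's exact block.
import torch
import torch.nn as nn


class HeterogeneousResidualBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Feature-transformation operation, as in standard NAS spaces.
        self.transform = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Feature self-calibration operation: channel attention that
        # rescales the transformed features (squeeze-and-excitation style).
        self.calibrate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.transform(x)
        y = y * self.calibrate(y)  # self-calibration by attention
        return x + y               # residual connection


if __name__ == "__main__":
    block = HeterogeneousResidualBlock(channels=32)
    out = block(torch.randn(2, 32, 16, 16))
    print(out.shape)  # torch.Size([2, 32, 16, 16])
```

In a NOS-style search space, the calibration branch would be one of several candidate operator choices alongside transformation operators, rather than being fixed as here.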
Keywords: deep learning, autoML, neural architecture search, image classification, attention learning, dynamic convolution