Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search

29 Sept 2021 (modified: 22 Oct 2023), ICLR 2022 Conference Withdrawn Submission
Abstract: In this paper, we propose a Shapley-value-based operation contribution evaluation method (Shapley-NAS) for neural architecture search. Differentiable architecture search (DARTS) obtains the desired architectures by optimizing architecture parameters with gradient descent, and is highly efficient because the search cost is significantly reduced. However, DARTS relies on the learnable architecture parameters of the supernet to represent operation importance during the search, which fails to reveal the actual impact of each operation on task performance and therefore harms the quality of the obtained architectures. In contrast, we evaluate the direct influence of operations on accuracy via the Shapley value for both supernet optimization and architecture discretization, so that the optimal architectures are acquired by selecting the operations that contribute most to the task. Specifically, we iteratively employ a Monte-Carlo sampling-based algorithm with early truncation to efficiently approximate the Shapley values of operations, and update the supernet weights while assigning the architecture parameters the operation contributions evaluated by these Shapley values. At the end of the search, the operations with the largest Shapley values are preserved to form the final architecture. Extensive experiments on CIFAR-10 and ImageNet for image classification and on NAS-Bench-201 for optimal architecture search show that our Shapley-NAS outperforms state-of-the-art methods by a sizable margin with a light search cost.
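The abstract describes approximating each operation's contribution with Monte-Carlo sampling over permutations, truncated early once a coalition of operations nearly matches the full supernet's accuracy. Below is a minimal, hypothetical Python sketch of that idea; the names `candidate_ops`, `evaluate_subnet`, and the `truncation_tol` threshold are illustrative assumptions and not the authors' actual implementation.

```python
import random

def shapley_monte_carlo(candidate_ops, evaluate_subnet, num_samples=100,
                        truncation_tol=1e-3):
    """Estimate each operation's Shapley value (its average marginal
    contribution to validation accuracy) via permutation sampling."""
    shapley = {op: 0.0 for op in candidate_ops}
    full_acc = evaluate_subnet(set(candidate_ops))  # accuracy with all ops active

    for _ in range(num_samples):
        perm = random.sample(candidate_ops, len(candidate_ops))
        active = set()
        prev_acc = evaluate_subnet(active)          # accuracy of the empty coalition
        for op in perm:
            # Early truncation: once this coalition is already close to the
            # full supernet's accuracy, remaining marginal gains are treated
            # as zero and the rest of the permutation is skipped.
            if full_acc - prev_acc < truncation_tol:
                break
            active.add(op)
            curr_acc = evaluate_subnet(active)
            shapley[op] += curr_acc - prev_acc      # marginal contribution of `op`
            prev_acc = curr_acc

    return {op: v / num_samples for op, v in shapley.items()}
```

For discretization, the operation with the largest estimated Shapley value would then be kept on each edge, e.g. `max(estimates, key=estimates.get)` per edge, matching the abstract's statement that the highest-contribution operations form the final architecture.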
Supplementary Material: zip
Community Implementations: [2 code implementations](https://www.catalyzex.com/paper/arxiv:2206.09811/code)