StacNAS: Towards Stable and Consistent Optimization for Differentiable Neural Architecture Search

25 Sept 2019 (modified: 05 May 2023) | ICLR 2020 Conference Withdrawn Submission | Readers: Everyone
Abstract: Early methods for Neural Architecture Search (NAS) were computationally expensive. Recently proposed differentiable NAS algorithms such as DARTS substantially reduce the search cost. However, the current formulation relies on a relaxation of the original problem that leads to unstable and suboptimal solutions. We argue that these problems stem from three fundamental causes: (1) the difficulty of bi-level optimization; (2) multicollinearity among correlated operations such as max pooling and average pooling; (3) the discrepancy in optimization complexity between the search stage and the final training. In this paper, we propose a grouped variable pruning algorithm based on one-level optimization, which leads to a more stable and consistent optimization solution for differentiable NAS. Extensive experiments verify the superiority of the proposed method in both accuracy and stability. Our new approach obtains state-of-the-art accuracy on CIFAR-10, CIFAR-100, and ImageNet.
Code: https://github.com/susan0199/stacnas
Keywords: Differentiable Neural Architecture Search
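
To make the one-level optimization idea concrete, below is a minimal, hypothetical PyTorch sketch. It shows a DARTS-style mixed operation whose architecture logits (alpha) are trained jointly with the network weights on the same training batches, rather than through alternating bi-level updates. The candidate operation set, channel sizes, hyperparameters, and loop are illustrative assumptions and do not reproduce the paper's actual search space, operation grouping, or pruning procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Toy mixed operation: softmax-weighted sum of candidate ops.

    The candidate set (identity, 3x3 conv, average pooling) is purely
    illustrative, not the grouping strategy proposed in the paper.
    """
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        # Architecture parameters (alpha): one logit per candidate op.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=-1)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# One-level optimization: alphas and weights share one optimizer and are
# updated together on the same batches (no separate validation-set step
# as in DARTS-style bi-level optimization).
model = nn.Sequential(
    MixedOp(8), nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10)
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.025, momentum=0.9)

for step in range(10):  # stand-in for a real data loader / training loop
    x = torch.randn(4, 8, 32, 32)
    y = torch.randint(0, 10, (4,))
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

After search, the learned alpha values would be used to prune weak candidates; in the paper this pruning is performed over groups of correlated operations, which this sketch does not model.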