Better Optimization for Neural Architecture Search with Mixed-Level Reformulation

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Withdrawn Submission
Abstract: Many recently proposed methods for Neural Architecture Search (NAS) can be formulated as bilevel optimization. For efficient implementation, solving this bilevel problem requires approximating its second-order gradients. In this paper, we demonstrate that the gradient errors caused by such approximations lead to suboptimality, in the sense that the optimization procedure fails to converge to a (locally) optimal solution. To remedy this, we propose MiLeNAS, a mixed-level reformulation of NAS that can be optimized more reliably. We show that even when a simple first-order method is applied to the mixed-level formulation, MiLeNAS achieves lower validation error on NAS problems. Consequently, the architectures obtained by our method achieve consistently higher accuracies than those obtained via bilevel optimization. Moreover, the use of first-order updates also makes training faster. Extensive experiments within a convolutional architecture search space validate the effectiveness of our approach.
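
As a rough sketch of the contrast the abstract draws, the two formulations can be written as follows. The notation here is assumed for illustration and is not defined in the abstract itself: w denotes network weights, α the architecture parameters, L_tr and L_val the training and validation losses, and λ a tradeoff hyperparameter.

% Bilevel formulation (DARTS-style NAS); the inner arg min makes the
% gradient of the outer objective w.r.t. alpha require second-order terms:
\min_{\alpha} \; \mathcal{L}_{\mathrm{val}}\!\left(w^{*}(\alpha), \alpha\right)
\quad \text{s.t.} \quad
w^{*}(\alpha) = \operatorname*{arg\,min}_{w} \; \mathcal{L}_{\mathrm{tr}}(w, \alpha)

% Mixed-level reformulation: w and alpha are optimized jointly over a
% weighted combination of both losses, so plain first-order updates apply:
\min_{w,\,\alpha} \; \mathcal{L}_{\mathrm{tr}}(w, \alpha) + \lambda \, \mathcal{L}_{\mathrm{val}}(w, \alpha)

Because the mixed-level objective treats w and α as joint variables rather than nesting one optimization inside the other, the gradient with respect to α is exact, avoiding the second-order approximation errors the abstract identifies as the source of suboptimality.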
Code: https://tinyurl.com/milenas-code