Why Does DARTS Miss the Target, and How Do We Aim to Fix It?

Anonymous

17 Jan 2022 (modified: 05 May 2023) · Submitted to BT@ICLR2022 · Readers: Everyone
Keywords: NAS, DARTS, neural architecture search, neural networks, deep learning, differentiable NAS, differentiable neural architecture search
Abstract: This blog post is based on 'Rethinking Architecture Selection in Differentiable NAS' (Wang et al., 2021) from ICLR 2021. The post establishes context by explaining DARTS (Liu et al., 2019) and summarizing an analysis of its failure modes (Zela et al., 2020) before turning to the main work. It compares the potential causes of DARTS's failure proposed by Zela et al. (2020) and Wang et al. (2021), then describes the perturbation-based architecture selection algorithm of Wang et al. (2021) intended to remedy those failings. Finally, the post reflects on the perturbation algorithm's downside as a step away from differentiable NAS (and toward discrete NAS) and relates the supernet-based pruning of DARTS to the broader literature on network pruning.
ICLR Paper: https://arxiv.org/abs/2108.04392