Operation-Level Early Stopping for Robustifying Differentiable NAS

Published: 21 Sept 2023, Last Modified: 02 Nov 2023, NeurIPS 2023 poster
Keywords: Differentiable neural architecture search; Image classification; Failure of DARTS
TL;DR: We analyze the domination of skip connections in DARTS from the perspective of operation overfitting in the supernet and propose an operation-level early stopping method to robustify DARTS.
Abstract: Differentiable architecture search (DARTS) is a simple and efficient neural architecture search method that has been extensively adopted in various machine learning tasks. Nevertheless, DARTS still suffers from several robustness issues, mainly the domination of skip connections. The resulting architectures are full of parameter-free operations, leading to performance collapse. Existing methods argue that the skip connection has additional advantages in optimization compared to other parametric operations and propose to alleviate its domination by eliminating these advantages. In this paper, we analyze the issue from a simple and straightforward perspective: the domination of skip connections results from parametric operations overfitting the training data while architecture parameters are trained on the validation data, leading to undesired behaviors. Based on this observation, we propose the operation-level early stopping (OLES) method to overcome this issue and robustify DARTS without introducing any computational overhead. Extensive experimental results verify our hypothesis and the effectiveness of OLES.
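To make the idea concrete, below is a minimal, hypothetical sketch of how operation-level early stopping could be wired into a simplified first-order DARTS-style loop. The names (MixedOp, overfit_signal) and the overfitting criterion are illustrative placeholders, not the paper's implementation: operation weights stop being updated once a per-operation overfitting signal fires, while architecture parameters continue to be optimized on validation data.

```python
# Hypothetical sketch of operation-level early stopping in a simplified
# first-order DARTS loop. All names and the overfitting criterion are
# illustrative placeholders, not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One supernet edge: a softmax-weighted mixture of candidate operations."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),                                # skip connection (parameter-free)
            nn.Conv2d(channels, channels, 3, padding=1),  # parametric operation
            nn.Conv2d(channels, channels, 5, padding=2),  # parametric operation
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # architecture parameters
        self.stopped = [False] * len(self.ops)                 # per-operation early-stop flags

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(w[i] * op(x) for i, op in enumerate(self.ops))

    def freeze_stopped_ops(self):
        # Once an operation is judged to be overfitting, stop updating its weights,
        # while architecture parameters keep being trained on validation data.
        for i, op in enumerate(self.ops):
            if self.stopped[i]:
                for p in op.parameters():
                    p.requires_grad_(False)

def overfit_signal(op):
    # Placeholder criterion; the actual test used by OLES is described in the paper.
    return False

edge = MixedOp(channels=8)
w_opt = torch.optim.SGD([p for op in edge.ops for p in op.parameters()], lr=0.025)
a_opt = torch.optim.Adam([edge.alpha], lr=3e-4)

x_train = torch.randn(4, 8, 16, 16)
x_val = torch.randn(4, 8, 16, 16)

for step in range(10):
    # Update operation weights on training data, skipping stopped operations.
    edge.freeze_stopped_ops()
    w_opt.zero_grad()
    edge(x_train).pow(2).mean().backward()   # stand-in training loss
    w_opt.step()

    # Update architecture parameters on validation data as usual.
    a_opt.zero_grad()
    edge(x_val).pow(2).mean().backward()     # stand-in validation loss
    a_opt.step()

    # Check each operation's overfitting signal and stop it if triggered.
    for i, op in enumerate(edge.ops):
        if not edge.stopped[i] and overfit_signal(op):
            edge.stopped[i] = True
```

In this sketch the early-stopping decision is made per operation on each supernet edge, so no extra forward or backward passes are introduced beyond the usual alternating DARTS updates.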
Submission Number: 11229