Continuous-Discrete Alignment Optimization for efficient differentiable neural architecture search

Published: 2025 · Last Modified: 15 Nov 2025 · Eng. Appl. Artif. Intell. 2025 · CC BY-SA 4.0
Abstract: Differentiable Architecture Search (DARTS) has become a prominent technique for neural architecture search in recent years. Despite its merits, the discretization discrepancy within DARTS still necessitates further exploration, as it can degrade the performance of the final architectures. In this paper, we introduce a novel algorithm termed Continuous–Discrete Alignment Optimization (DARTS-CDAO), designed to address the discretization discrepancy and thereby enhance the robustness and generalization capabilities of the discovered neural architectures. Our proposed DARTS-CDAO algorithm integrates the discretization process into the training phase of the architecture parameters, making the search algorithm aware of, and adaptive to, the subsequent discretization step. Specifically, our methodology begins by formalizing the process of architecture parameter discretization. We then introduce a coarse gradient weighting algorithm to update the architecture parameters, effectively minimizing the divergence between the continuous and discrete parameter representations. Rigorous theoretical analysis, coupled with extensive experimental results, demonstrates that our proposed approach improves the performance of the searched models. Notably, this improvement is achieved without incurring additional search time, rendering DARTS more robust and better able to generalize.
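The exact coarse gradient weighting rule is defined in the full paper; as a rough illustration only, the PyTorch sketch below shows one straight-through-style way to mix the continuous softmax weights with their discretized (argmax) counterpart so that the architecture parameters are trained with the discretization step in view. The helper names `discretize` and `aligned_weights`, the temperature `tau`, and the mixing coefficient `lam` are illustrative assumptions, not the paper's API.

```python
import torch
import torch.nn.functional as F

def discretize(weights):
    """Hard one-hot selection of the strongest candidate operation per edge."""
    index = weights.argmax(dim=-1)
    return F.one_hot(index, num_classes=weights.shape[-1]).to(weights.dtype)

def aligned_weights(alpha, tau=1.0, lam=0.5):
    """Blend continuous softmax weights with their discretized counterpart.

    The forward pass uses a lam-weighted mix of hard and soft weights; the
    backward pass sees only the soft path (the hard term is detached), so the
    architecture parameters receive a coarse, discretization-aware gradient.
    """
    soft = F.softmax(alpha / tau, dim=-1)   # continuous relaxation
    hard = discretize(soft)                 # discrete (one-hot) architecture
    return soft + lam * (hard - soft).detach()

# Toy usage: two edges, three candidate operations.
alpha = torch.randn(2, 3, requires_grad=True)
w = aligned_weights(alpha, tau=1.0, lam=0.8)
loss = (w * torch.randn_like(w)).sum()      # stand-in for the supernet loss
loss.backward()
print(alpha.grad)                           # gradient flows through the soft path
```

In this sketch, setting `lam` to 0 recovers the standard continuous relaxation used in vanilla DARTS, while larger values push the forward computation toward the discrete architecture that will ultimately be extracted, which is the alignment idea the abstract describes at a high level.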