Neural Architecture Search Finds Robust Models by Knowledge Distillation

Published: 26 Apr 2024, Last Modified: 15 Jul 2024, UAI 2024 poster, CC BY 4.0
Keywords: Adversarial Attacks, Neural Architecture Search, Cross-Layer Knowledge Distillation
TL;DR: We propose a novel Neural Architecture Search (NAS) method that improves the adversarial robustness of deep neural networks through cross-layer knowledge distillation from a robust teacher.
Abstract: Despite their superior performance, Deep Neural Networks (DNNs) are often vulnerable to adversarial attacks. Neural Architecture Search (NAS), a method for automatically designing the architectures of DNNs, has shown remarkable performance across various machine learning applications. However, the robustness of architectures learned by NAS against adversarial threats remains under-explored. By integrating a robust teacher, we examine whether NAS can yield a robust neural architecture by inheriting robustness from the teacher. In this paper, we propose Robust Neural Architecture Search by Cross-Layer Knowledge Distillation (RNAS-CL), a novel NAS algorithm that enhances the robustness of architectures learned by NAS by employing cross-layer knowledge distillation from a robust teacher. Distinct from previous knowledge distillation approaches that align student and teacher outputs only at the final layer, RNAS-CL dynamically searches for the optimal teacher layer to guide each student layer. Our experimental findings validate the effectiveness of RNAS-CL, demonstrating that it can generate both compact and adversarially robust neural architectures. Our results pave the way for developing new strategies for compact and robust neural architecture design applicable across various fields. The code of RNAS-CL is available at \url{https://github.com/Statistical-Deep-Learning/RNAS-CL}.
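To illustrate the cross-layer idea described in the abstract, the sketch below shows one way a student layer could learn a soft selection over teacher layers and be pulled toward the selected teacher feature. This is a minimal, hypothetical PyTorch sketch written for this page; the class and variable names (CrossLayerKD, student_dims, teacher_dims) are assumptions and do not come from the authors' released code, which should be consulted for the actual RNAS-CL implementation.

```python
# Minimal sketch of cross-layer knowledge distillation: each student layer
# learns a softmax-weighted selection over teacher layers and is aligned
# with the resulting teacher feature. Names here are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossLayerKD(nn.Module):
    """Align each student layer with a learned combination of teacher layers."""
    def __init__(self, student_dims, teacher_dims):
        super().__init__()
        # one learnable logit vector per student layer -> soft choice of teacher layer
        self.select_logits = nn.Parameter(torch.zeros(len(student_dims), len(teacher_dims)))
        # project every teacher feature to the dimension of each student feature
        self.proj = nn.ModuleList([
            nn.ModuleList([nn.Linear(t, s) for t in teacher_dims]) for s in student_dims
        ])

    def forward(self, student_feats, teacher_feats):
        # student_feats[i]: (B, student_dims[i]); teacher_feats[j]: (B, teacher_dims[j])
        loss = 0.0
        for i, sf in enumerate(student_feats):
            weights = F.softmax(self.select_logits[i], dim=0)  # soft teacher-layer selection
            target = sum(w * self.proj[i][j](tf.detach())
                         for j, (w, tf) in enumerate(zip(weights, teacher_feats)))
            loss = loss + F.mse_loss(sf, target)
        return loss / len(student_feats)

# Toy usage: 3 student layers guided by 4 teacher layers (dimensions are arbitrary).
kd = CrossLayerKD(student_dims=[64, 128, 256], teacher_dims=[96, 192, 384, 768])
student_feats = [torch.randn(8, d) for d in [64, 128, 256]]
teacher_feats = [torch.randn(8, d) for d in [96, 192, 384, 768]]
kd_loss = kd(student_feats, teacher_feats)  # would be added to the task loss during search
```

In such a setup, the distillation loss would be combined with the standard training objective while the architecture search runs, so that the searched student inherits robustness cues from the teacher's intermediate layers rather than only from its final output.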
List Of Authors: Nath, Utkarsh and Wang, Yancheng and Yang, Yingzhen
Latex Source Code: zip
Signed License Agreement: pdf
Code Url: https://github.com/Statistical-Deep-Learning/RNAS-CL
Submission Number: 778