FEATHERS: Federated Architecture and Hyperparameter Search

Published: 12 Jul 2024, Last Modified: 09 Aug 2024, AutoML 2024 Workshop, CC BY 4.0
Keywords: neural architecture search, hyperparameter optimization, federated learning
Abstract: Deep neural architectures have a profound impact on the performance achieved in many of today's AI tasks, yet their design still relies heavily on human prior knowledge and experience. Neural architecture search (NAS) together with hyperparameter optimization (HO) helps to reduce this dependence. However, state-of-the-art NAS and HO rapidly become infeasible as increasing amounts of data are stored in a distributed fashion, mainly because these methods are not designed for distributed environments and typically violate data privacy regulations such as GDPR and CCPA. As a remedy, we introduce FEATHERS - **FE**derated **A**rchi**T**ecture and **H**yp**ER**parameter **S**earch, a method that not only jointly optimizes neural architectures *and* optimization-related hyperparameters in distributed data settings, but also provably preserves data privacy through the use of differential privacy (DP). We show that FEATHERS efficiently optimizes architectural and optimization-related hyperparameters alike while demonstrating convergence on classification tasks with no detriment to model performance when complying with privacy constraints.
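
To make the setting described in the abstract concrete, the following is a minimal, hypothetical Python sketch of one federated round in which each client produces updates for both model weights and architecture/hyperparameter variables (DARTS-style mixture weights), clips and noises them for differential privacy, and the server averages them. All function names, the Gaussian-noise DP mechanism, and the FedAvg-style aggregation are illustrative assumptions for exposition, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_update(weights, alphas, clip_norm=1.0, noise_mult=1.0):
    """Simulate one client's local step: clipped, noised updates (assumed DP mechanism)."""
    # Stand-in for local gradient computation on private client data.
    w_grad = rng.normal(size=weights.shape)
    a_grad = rng.normal(size=alphas.shape)
    # Clip each update to bound its sensitivity, then add Gaussian noise.
    for g in (w_grad, a_grad):
        norm = np.linalg.norm(g)
        if norm > clip_norm:
            g *= clip_norm / norm
    w_grad += rng.normal(scale=noise_mult * clip_norm, size=w_grad.shape)
    a_grad += rng.normal(scale=noise_mult * clip_norm, size=a_grad.shape)
    return w_grad, a_grad

def server_round(weights, alphas, num_clients=8, lr=0.1):
    """Average the noised client updates and apply them (FedAvg-style, assumed)."""
    updates = [client_update(weights, alphas) for _ in range(num_clients)]
    w_mean = np.mean([u[0] for u in updates], axis=0)
    a_mean = np.mean([u[1] for u in updates], axis=0)
    return weights - lr * w_mean, alphas - lr * a_mean

weights = rng.normal(size=(16,))   # stand-in for network weights
alphas = np.zeros(4)               # stand-in for architecture mixture weights
for _ in range(3):
    weights, alphas = server_round(weights, alphas)
print("architecture weights after 3 rounds:", np.round(alphas, 3))
```

The key design point this sketch illustrates is that architecture and hyperparameter variables are treated as additional learnable quantities whose updates flow through the same clip-noise-aggregate pipeline as the model weights, so the privacy guarantee covers the search as well as the training.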
Submission Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Evaluation Metrics: Yes
Submission Number: 8