EcoNAS: Carbon and Cost-Aware Neural Architecture Search for Edge Vision Applications

Published: 30 Apr 2026, Last Modified: 30 Apr 2026 · CVPR-NAS26 Oral · CC BY 4.0
Keywords: Neural Architecture Search, Training-free NAS, Multi-objective optimization, Hardware-aware NAS, Energy-efficient deep learning, Cost-aware AI, Pareto optimization, Edge deployment
TL;DR: We introduce EcoNAS, a training-free, multi-objective neural architecture search framework that jointly optimizes accuracy, energy consumption, and deployment cost to discover efficient models for real-world edge hardware.
Abstract: Neural Architecture Search (NAS) has emerged as a powerful tool for automating the design of high-performance neural networks. While existing methods typically focus on optimizing accuracy or latency, practical considerations such as energy consumption and deployment cost remain underexplored, particularly in resource-constrained edge environments. In this work, we propose EcoNAS, a multi-objective NAS framework that jointly optimizes model accuracy, energy efficiency, and estimated deployment cost. EcoNAS employs a hardware-aware search space alongside training-free performance proxies, enabling rapid evaluation of candidate architectures and efficient exploration of the Pareto front without requiring full model training. We conduct extensive experiments on image classification and segmentation tasks across multiple edge platforms, including NVIDIA Jetson Nano, Raspberry Pi 4, and a simulated Edge TPU. Our results demonstrate that EcoNAS identifies architectures achieving competitive accuracy while substantially reducing energy consumption and deployment cost relative to standard NAS baselines and widely used manually designed networks. We further provide ablation studies examining the impact of proxy selection, search algorithm, and hardware modeling, alongside comprehensive implementation details to facilitate reproducibility.
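The multi-objective selection the abstract describes, retaining only architectures that are Pareto-optimal with respect to accuracy, energy, and cost, can be illustrated with a minimal dominance filter. This is a sketch of the general technique, not the authors' implementation; the candidate tuples and function names are hypothetical.

```python
# Illustrative Pareto-front filter over the three objectives named in the
# abstract: accuracy (maximize), energy (minimize), deployment cost (minimize).
# Not the EcoNAS code; example values below are made up.

def dominates(a, b):
    """True if candidate a is no worse than b on every objective and
    strictly better on at least one. Tuples are (acc, energy, cost)."""
    no_worse = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
    strictly_better = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
    return no_worse and strictly_better

def pareto_front(candidates):
    """Return the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# (accuracy %, energy mJ/inference, cost $) -- hypothetical example values
cands = [(92.1, 48.0, 1.10), (91.8, 30.0, 0.95), (89.5, 55.0, 1.40)]
front = pareto_front(cands)
```

Here the third candidate is dominated by the first (lower accuracy, higher energy, higher cost), while the first two trade accuracy against energy and so both remain on the front; a training-free proxy would supply the accuracy estimate in place of a trained score.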
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 1