Abstract: Neural Architecture Search (NAS) has received much attention from the research community in recent years. However, due to the massive amount of computational resources required, it is still infeasible to employ NAS in research and production at small labs and companies. Recent NAS methods that aim to speed up the evaluation process often limit themselves in other aspects, such as the search space that NAS operates on. In this work, we propose TF-GeneNAS, an evolution-based training-free NAS approach with a dynamic search space and a search strategy based on Gene Expression Programming. We conduct experiments on three tasks in both the Computer Vision and Natural Language Processing domains to demonstrate the effectiveness of our method. With only 3 CPU days of search needed, TF-GeneNAS finds network architectures with better performance than previous evolution-based methods, which can require days of GPU resources, thus significantly lowering the cost of searching. We also perform further studies to show the impact of our training-free estimation strategy on the NAS process. We hope that our promising results can encourage further research into more efficient evolution-based NAS methods.