Towards Fast Adaptation of Neural Architectures with Meta Learning

Published: 20 Dec 2019, Last Modified: 05 May 2023 · ICLR 2020 Conference Blind Submission
Keywords: Fast adaptation, Meta learning, NAS
Abstract: Recently, Neural Architecture Search (NAS) has been successfully applied in multiple areas of artificial intelligence and yields networks that outperform hand-designed ones. However, existing NAS methods target only a specific task: most search an architecture well for a single task but handle multiple datasets or multiple tasks poorly. In general, the architecture for a new task is either searched from scratch, which is neither efficient nor flexible enough for practical application scenarios, or borrowed from one searched on another task, which may be suboptimal. To make NAS transferable and enable fast adaptation of neural architectures, we propose a novel Transferable Neural Architecture Search method based on meta-learning, termed T-NAS. T-NAS learns a meta-architecture that adapts to a new task through only a few gradient steps, making the transferred architecture well suited to that specific task. Extensive experiments show that T-NAS achieves state-of-the-art performance in few-shot learning and comparable performance in supervised learning with 50x lower search cost, demonstrating the effectiveness of our method.
TL;DR: A meta-learning method for fast adaptation of neural architectures.
Code: [![github](/images/github_icon.svg) dongzelian/T-NAS](https://github.com/dongzelian/T-NAS)
Data: [FC100](https://paperswithcode.com/dataset/fc100), [ImageNet](https://paperswithcode.com/dataset/imagenet), [mini-Imagenet](https://paperswithcode.com/dataset/mini-imagenet)
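
To make the adaptation idea in the abstract concrete, here is a minimal, self-contained sketch of a MAML-style bi-level loop over architecture parameters. This is not the authors' implementation: the names (`alpha`, `inner_adapt`), the tensor shape, the learning rates, and the synthetic quadratic task are all assumptions for illustration. In T-NAS, `alpha` would weight candidate operations in a DARTS-style search space rather than fit a random target.

```python
import torch

def inner_adapt(alpha, task_loss, inner_lr=0.01, steps=5):
    """A few differentiable gradient steps on alpha for one task."""
    adapted = alpha
    for _ in range(steps):
        loss = task_loss(adapted)
        (grad,) = torch.autograd.grad(loss, adapted, create_graph=True)
        adapted = adapted - inner_lr * grad  # differentiable inner update
    return adapted

# Meta-architecture parameters, e.g. 14 edges x 8 candidate ops (assumed shape).
alpha = torch.zeros(14, 8, requires_grad=True)
meta_opt = torch.optim.Adam([alpha], lr=1e-3)

for _ in range(100):  # meta-training iterations over sampled tasks
    target = torch.randn(14, 8)                      # synthetic stand-in for a task
    task_loss = lambda a, t=target: ((a - t) ** 2).mean()
    adapted = inner_adapt(alpha, task_loss)          # fast adaptation (inner loop)
    meta_loss = task_loss(adapted)                   # "query" loss after adaptation
    meta_opt.zero_grad()
    meta_loss.backward()                             # second-order grads reach alpha
    meta_opt.step()
```

The key mechanism is `create_graph=True` in the inner loop: it lets the post-adaptation loss backpropagate through the adaptation steps to `alpha`, so the meta-architecture is trained precisely to be adaptable in a few gradient steps.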