Keywords: graph neural network, neural architecture search, automated machine learning
Abstract: Recently, graph neural networks (GNNs) have been demonstrated to be effective in various graph-based tasks.
To obtain state-of-the-art (SOTA) data-specific GNN architectures, researchers turn to the neural architecture search (NAS) methods.
However, conducting efficient architecture search for GNNs remains a challenging problem.
In this work, we present a novel framework for Efficient GrAph Neural architecture search (EGAN).
Building on a novel and expressive search space, we propose an efficient one-shot NAS method based on stochastic relaxation and natural gradient.
Further, we design a transfer learning paradigm to enable architecture search on large graphs.
Extensive experiments on node-level and graph-level tasks show that EGAN obtains SOTA data-specific architectures while reducing the search cost by two orders of magnitude compared with existing NAS baselines.
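The one-shot search strategy named in the abstract (stochastic relaxation of a discrete architecture space plus a natural-gradient update on the sampling distribution) can be illustrated with a toy sketch. This is an assumption-laden simplification, not EGAN's actual algorithm: a single categorical distribution over candidate operations is optimized with a stochastic natural-gradient step on its expectation parameters, and the toy `loss` function stands in for the validation loss of a sampled architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

n_ops = 4                         # number of candidate operations (toy setting)
p = np.full(n_ops, 1.0 / n_ops)   # categorical distribution over operations
lr = 0.05                         # step size for the distribution update

def loss(op):
    # Stand-in for the validation loss of the architecture that uses `op`;
    # here operation index 2 is (by construction) the best choice.
    return (op - 2) ** 2 / 4.0

for step in range(500):
    # Stochastic relaxation: sample a discrete operation from the distribution.
    op = rng.choice(n_ops, p=p)
    onehot = np.eye(n_ops)[op]
    # Stochastic natural-gradient step on the expectation parameters of the
    # categorical family: move probability mass away from high-loss samples.
    # The update sums to zero, so the simplex constraint is preserved.
    p = p - lr * loss(op) * (onehot - p)
    p = np.clip(p, 1e-6, None)
    p = p / p.sum()

best = int(np.argmax(p))          # distribution concentrates on the best op
```

In a real one-shot setting the sampled operations share weights in a supernet and `loss` is measured on held-out data; the sketch only shows how the distribution update drives the search toward low-loss architectures.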
One-sentence Summary: We propose an effective and efficient framework for graph neural architecture search that obtains data-specific GNN architectures at a fraction of the search cost of existing NAS methods.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=idR1i1d10