Keywords: node classification, few-shot learning, graph neural networks
TL;DR: In this paper, we propose Few-shot Node Prompt Tuning as an effective method to tackle general few-shot node classification tasks.
Abstract: Despite the powerful representation ability of graph neural networks (GNNs), recent works have demonstrated that their performance can severely degrade when labeled nodes are scarce in the training data. \textit{Few-shot Node Classification} is one such problem with an extreme shortage of node labels and has drawn growing attention lately. The current modus operandi, i.e., meta-learning, has succeeded in transferring the structural knowledge learned from \textit{base classes} with abundant labeled nodes to few-shot \textit{novel classes}. In real-world scenarios, however, all the classes on the graph often have only limited labeled nodes, so meta-learning cannot be directly deployed. In this work, we generalize few-shot node classification by removing the assumption that abundant labeled nodes exist for the base classes. We further propose a novel \textit{Few-shot Node Prompt Tuning} method to effectively elicit the substantial prior knowledge in the input graph for solving few-shot node classification tasks without labeled base classes. Specifically, we fix a pretrained graph transformer as the encoder and inject virtual nodes as soft prompts in the embedding space to bridge the gap between the training objectives of the pretext tasks and the downstream few-shot node classification task. These prompts are small tensors that can be efficiently optimized together with a simple classifier on the few labeled nodes. Since a single pretrained encoder is shared across different tasks, the proposed method retains efficiency and the potential for model ensembling. Extensive experiments on four prevalent node classification datasets show that the proposed method, FS-NPT, is an efficient and effective way to tackle the general few-shot node classification problem. Our implementation is released\footnote{\url{https://github.com/Anonymous-submit-23/FS-NPT.git}}.
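To make the mechanism concrete, here is a minimal PyTorch sketch of the prompt-tuning setup the abstract describes: a frozen encoder, learnable virtual-node prompts prepended in the embedding space, and a small classifier trained on the few labeled nodes. A plain `nn.TransformerEncoder` stands in for the pretrained graph transformer, and all names (`FewShotNodePromptTuning`, `num_prompts`, the toy data) are illustrative assumptions, not the released FS-NPT implementation.

```python
import torch
import torch.nn as nn

class FewShotNodePromptTuning(nn.Module):
    def __init__(self, encoder, hidden_dim, num_prompts, num_classes):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False  # the pretrained encoder stays frozen
        # Virtual nodes injected as soft prompts in the embedding space.
        self.prompts = nn.Parameter(torch.randn(num_prompts, hidden_dim) * 0.02)
        # Simple classifier optimized together with the prompts.
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, node_feats):
        # node_feats: (num_nodes, hidden_dim) node embeddings of the input graph.
        n = node_feats.size(0)
        x = torch.cat([self.prompts, node_feats], dim=0).unsqueeze(0)  # (1, P+N, D)
        h = self.encoder(x).squeeze(0)   # frozen forward pass
        return self.classifier(h[-n:])   # predictions for the real nodes only

# Stand-in for the pretrained graph transformer (assumption, not the paper's model).
layer = nn.TransformerEncoderLayer(d_model=128, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

model = FewShotNodePromptTuning(encoder, hidden_dim=128, num_prompts=8, num_classes=5)
# Only the small prompt tensor and the classifier receive gradients.
optimizer = torch.optim.Adam([model.prompts, *model.classifier.parameters()], lr=1e-3)

feats = torch.randn(40, 128)          # toy: 40 nodes with 128-dim embeddings
labels = torch.randint(0, 5, (40,))   # toy few-shot labels
loss = nn.functional.cross_entropy(model(feats), labels)
loss.backward()
optimizer.step()
```

Because only the prompts and the classifier are trainable, the same frozen encoder can be shared across tasks, which is what gives the method its efficiency and ensembling potential.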
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning