Pretrained Language Models to Solve Graph Tasks in Natural Language

Published: 19 Jun 2023, Last Modified: 28 Jul 2023
Venue: 1st SPIGM @ ICML Poster
Keywords: Large language models, graph neural networks
TL;DR: We explore whether large language models can learn from graph-structured data when the graphs are described in natural language.
Abstract: Pretrained large language models (LLMs) are powerful learners across a variety of language tasks. We explore whether LLMs can learn from graph-structured data when the graphs are described in natural language. We investigate data augmentation and pretraining strategies specific to the graph domain and show that LLMs such as GPT-2 and GPT-3 are promising alternatives to graph neural networks.
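The abstract hinges on describing graphs as natural-language text before passing them to an LLM. Below is a minimal sketch of what such a verbalization could look like; the sentence template and the `graph_to_text` helper are illustrative assumptions, not the paper's actual encoding.

```python
# A minimal sketch of turning a graph into a natural-language prompt for an LLM.
# The verbalization template below (nodes and edges phrased as sentences) is an
# illustrative assumption; the paper's exact encoding may differ.
import networkx as nx


def graph_to_text(graph: nx.Graph, question: str) -> str:
    """Describe a graph in plain English and append a task question."""
    nodes = ", ".join(str(n) for n in graph.nodes)
    edges = "; ".join(f"node {u} is connected to node {v}" for u, v in graph.edges)
    return (
        f"In an undirected graph, the nodes are {nodes}. "
        f"The edges are as follows: {edges}. "
        f"{question}"
    )


# Example: a 4-node cycle and a connectivity question, ready to send to an LLM.
g = nx.cycle_graph(4)
prompt = graph_to_text(g, "Is there a path between node 0 and node 2?")
print(prompt)
```

A prompt like this can then be fed to GPT-2 or GPT-3 in place of the graph-structured input a graph neural network would consume.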
Submission Number: 108