Graph2Seq: Graph to Sequence Learning with Attention-Based Neural Networks

27 Sept 2018 (modified: 21 Apr 2024) · ICLR 2019 Conference Blind Submission · Readers: Everyone
Abstract: The celebrated Sequence-to-Sequence learning (Seq2Seq) technique and its numerous variants achieve excellent performance on many tasks. However, many machine learning tasks have inputs naturally represented as graphs, and existing Seq2Seq models struggle to accurately convert such graph-structured inputs into the appropriate sequence. To address this challenge, we introduce a general end-to-end graph-to-sequence neural encoder-decoder architecture that maps an input graph to a sequence of vectors and uses an attention-based LSTM to decode the target sequence from these vectors. Our method first generates node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy that incorporates edge-direction information into the node embeddings. We further introduce an attention mechanism that aligns node embeddings with the decoding sequence to better cope with large graphs. Experimental results on bAbI, Shortest Path, and Natural Language Generation tasks demonstrate that our model achieves state-of-the-art performance, significantly outperforming existing graph neural network, Seq2Seq, and Tree2Seq models; with the proposed bi-directional node embedding aggregation strategy, the model converges rapidly to optimal performance.
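The encoder idea described in the abstract, bi-directional node-embedding aggregation, can be illustrated with a short sketch. The code below is a simplified, hypothetical rendering (a mean aggregator, a fixed hop count, plain averaging in place of a learned combine step, and all function and variable names invented here for illustration), not the authors' implementation from the IBM/Graph2Seq repository:

```python
import numpy as np

# Minimal sketch of bi-directional aggregation: each node keeps a "forward"
# and a "backward" embedding, updated from its out-neighbors and in-neighbors
# respectively, and the two halves are concatenated at the end. The mean
# aggregator, hop count, and averaging combine step are assumptions.

def mean_aggregate(vectors, dim):
    """Average a list of neighbor embeddings; zeros if the node has none."""
    return np.mean(vectors, axis=0) if vectors else np.zeros(dim)

def bidirectional_node_embeddings(features, out_edges, in_edges, hops=2):
    """features: dict node -> 1-D np.ndarray of node features.
    out_edges / in_edges: dict node -> list of neighbor node ids."""
    dim = len(next(iter(features.values())))
    fwd = {v: f.astype(float) for v, f in features.items()}
    bwd = {v: f.astype(float) for v, f in features.items()}
    for _ in range(hops):
        new_fwd, new_bwd = {}, {}
        for v in features:
            # Forward embedding aggregates over outgoing neighbors...
            nf = mean_aggregate([fwd[u] for u in out_edges.get(v, [])], dim)
            # ...backward embedding aggregates over incoming neighbors.
            nb = mean_aggregate([bwd[u] for u in in_edges.get(v, [])], dim)
            new_fwd[v] = (fwd[v] + nf) / 2.0  # stand-in for a learned combine
            new_bwd[v] = (bwd[v] + nb) / 2.0
        fwd, bwd = new_fwd, new_bwd
    # Final node embedding: concatenation of both directions, so a decoder
    # attending over these vectors sees edge-direction information.
    return {v: np.concatenate([fwd[v], bwd[v]]) for v in features}

# Toy usage: a directed path a -> b -> c with 4-dimensional random features.
rng = np.random.default_rng(0)
feats = {v: rng.standard_normal(4) for v in "abc"}
out_e = {"a": ["b"], "b": ["c"], "c": []}
in_e = {"a": [], "b": ["a"], "c": ["b"]}
embs = bidirectional_node_embeddings(feats, out_e, in_e)
print(embs["b"].shape)  # (8,): forward and backward halves concatenated
```

Separating the two directions is what lets the resulting node vectors distinguish, say, a node's predecessors from its successors on a shortest path; a single undirected aggregation would blur that distinction.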
Keywords: Graph Encoder, Graph Decoder, Graph2Seq, Graph Attention
TL;DR: Graph to Sequence Learning with Attention-Based Neural Networks
Code: [IBM/Graph2Seq](https://github.com/IBM/Graph2Seq) · [3 community implementations on Papers with Code](https://paperswithcode.com/paper/?openreview=SkeXehR9t7)
Data: [WikiSQL](https://paperswithcode.com/dataset/wikisql)
Community Implementations: [4 code implementations on CatalyzeX](https://www.catalyzex.com/paper/arxiv:1804.00823/code)