Structured Neural Summarization

27 Sept 2018, 22:37 (modified: 04 Mar 2022, 14:26) · ICLR 2019 Conference Blind Submission · Readers: Everyone
Keywords: Summarization, Graphs, Source Code
TL;DR: One simple trick to improve sequence models: Compose them with a graph model
Abstract: Summarization of long sequences into a concise statement is a core problem in natural language processing, requiring non-trivial understanding of the input. Motivated by the promising results of graph neural networks on highly structured data, we develop a framework that extends existing sequence encoders with a graph component able to reason about long-distance relationships in weakly structured data such as text. In an extensive evaluation, we show that the resulting hybrid sequence-graph models outperform both pure sequence models and pure graph models on a range of summarization tasks.
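The hybrid encoder described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: a toy tanh RNN stands in for the sequence encoder, and a single unweighted message-passing step over an adjacency matrix stands in for the GGNN-style graph component; all weights and function names here (`hybrid_encode`, `W_in`, `W_rec`, `W_msg`) are invented for the example.

```python
import numpy as np

def hybrid_encode(embeddings, adjacency, hidden_dim, seed=0):
    """Toy hybrid sequence-graph encoder.

    1) Run an elementary tanh RNN over the token sequence.
    2) Let every node aggregate one round of messages from its
       graph neighbours and fold them into its sequence state.

    Weights are random; the point is the dataflow, not the model.
    """
    rng = np.random.default_rng(seed)
    n, d = embeddings.shape
    W_in = rng.normal(scale=0.1, size=(d, hidden_dim))
    W_rec = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    W_msg = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))

    # Sequence pass: hidden state carried along token positions.
    states = np.zeros((n, hidden_dim))
    h = np.zeros(hidden_dim)
    for t in range(n):
        h = np.tanh(embeddings[t] @ W_in + h @ W_rec)
        states[t] = h

    # Graph pass: adjacency @ X sums each node's neighbour states,
    # which are then transformed and added back (one propagation step).
    messages = adjacency @ (states @ W_msg)
    return np.tanh(states + messages)

# Example: 4 tokens with 3-dim embeddings, chained by next-token edges.
emb = np.ones((4, 3))
adj = np.eye(4, k=1)          # edge from token t to token t+1
out = hybrid_encode(emb, adj, hidden_dim=5)
```

The decoder (omitted here) would then attend over `out` exactly as it would over plain RNN states, which is why the paper can compose the graph component with existing sequence models.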
Code: [CoderPat/structured-neural-summarization](https://github.com/CoderPat/structured-neural-summarization) + [2 community implementations](https://paperswithcode.com/paper/?openreview=H1ersoRqtm)
Data: [CNN/Daily Mail](https://paperswithcode.com/dataset/cnn-daily-mail-1)
