Graph Recurrent Neural Network for Text Classification

Anonymous

08 Mar 2022 (modified: 05 May 2023) NAACL 2022 Conference Blind Submission Readers: Everyone
Paper Link: https://openreview.net/forum?id=hj7IW5KMLgJ
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Applying Graph Neural Networks (GNNs) to text classification is currently an active research area. Most GNN-based models focus only on the interactions between words in a document while ignoring word order, so the associated sequential semantic information is lost. In addition, as graph density increases, word node representations become over-smoothed, which degrades the document's semantic information. In this paper, we propose TextGRNN, a GNN-based text classification method that addresses these problems. First, our model constructs a document-level graph via a visibility graph, which restrains graph density, and updates word representations with a GNN. The model then employs a Bi-LSTM, which captures word order, to learn the document's sequential semantics. Finally, an attention mechanism highlights the essential words. Extensive experiments on three benchmark datasets demonstrate that our model outperforms state-of-the-art text classification methods.
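The abstract outlines a four-stage pipeline: visibility-graph construction, GNN updates of word representations, a Bi-LSTM over the word sequence, and attention pooling for classification. Below is a minimal sketch of that pipeline in PyTorch. The layer sizes, the specific GNN update (a simple mean-aggregation layer here), the scalar series used to build the visibility graph, and the additive attention form are all assumptions for illustration; the paper's exact formulations are not given in the abstract.

```python
# Hypothetical sketch of a TextGRNN-style pipeline; not the authors' implementation.
import torch
import torch.nn as nn


def visibility_graph(values: torch.Tensor) -> torch.Tensor:
    """Build a natural-visibility-graph adjacency matrix over a 1-D series.

    Nodes i and j (i < j) are connected if every intermediate point k lies
    strictly below the line segment joining (i, values[i]) and (j, values[j]).
    The per-token scalar series used by the paper is an assumption here.
    """
    n = values.size(0)
    adj = torch.eye(n)  # self-loops for the GNN aggregation step
    for i in range(n):
        for j in range(i + 1, n):
            visible = True
            for k in range(i + 1, j):
                bound = values[j] + (values[i] - values[j]) * (j - k) / (j - i)
                if values[k] >= bound:
                    visible = False
                    break
            if visible:
                adj[i, j] = adj[j, i] = 1.0
    return adj


class TextGRNN(nn.Module):
    """Graph message passing -> Bi-LSTM -> additive attention -> classifier."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gnn_proj = nn.Linear(embed_dim, embed_dim)  # one GCN-like layer
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)          # attention scores
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # token_ids: (seq_len,), adj: (seq_len, seq_len) visibility-graph adjacency
        x = self.embed(token_ids)                          # (seq_len, embed_dim)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        x = torch.relu(self.gnn_proj(adj @ x / deg))       # mean-aggregate neighbours
        h, _ = self.bilstm(x.unsqueeze(0))                 # (1, seq_len, 2*hidden_dim)
        scores = torch.softmax(self.attn(h), dim=1)        # (1, seq_len, 1)
        doc = (scores * h).sum(dim=1)                      # attention-pooled document vector
        return self.classifier(doc)


# Usage on one toy document: random per-token scores stand in for the scalar
# series from which the visibility graph is built.
token_ids = torch.randint(0, 100, (12,))
adj = visibility_graph(torch.rand(12))
model = TextGRNN(vocab_size=100, embed_dim=32, hidden_dim=64, num_classes=4)
logits = model(token_ids, adj)
print(logits.shape)  # torch.Size([1, 4])
```

Restricting edges to visibility-graph neighbours keeps the adjacency matrix sparse, which is how the described approach limits graph density and, in turn, over-smoothing during message passing.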