Learning to Understand: Incorporating Local Contexts with Global Attention for Sentiment Classification

Submitted to ICLR 2017
Abstract: Recurrent neural networks have shown their ability to construct sentence and paragraph representations. Variants such as the LSTM alleviate the vanishing-gradient problem to some degree and can therefore model long-term dependencies. Still, these recurrence-based models struggle to capture complex semantic compositions. To address this problem, we propose a model that incorporates local contexts under the guidance of global context attention. Both the local and global contexts are obtained through LSTM networks. The model works much as a human reader does: first reading a text, then answering a related question about it. Empirical studies show that the proposed model achieves state-of-the-art results on several benchmark datasets. Attention visualization also confirms our intuition. Moreover, the model does not need pretrained embeddings to achieve good results.
TL;DR: An attention model that mutually learns global and local representations for sentiment analysis.
Conflicts: mails.tsinghua.edu.cn, tsinghua.edu.cn
Keywords: Natural language processing, Deep learning, Applications
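
The abstract gives no implementation details, so the following is only one plausible reading of the architecture: a global LSTM summary is used to compute attention weights over local (per-token) LSTM states, and the attention-weighted context is fed to a classifier. This is a minimal PyTorch sketch under those assumptions, not the authors' implementation; the class name, layer layout, and hyperparameters are all hypothetical.

```python
import torch
import torch.nn as nn

class GlobalLocalAttention(nn.Module):
    """Hypothetical sketch: a global LSTM state guides attention over local LSTM states."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Two LSTMs, as the abstract says both contexts come from LSTM networks.
        self.local_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.global_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # scores each local state against the global one
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        x = self.embed(tokens)
        local_states, _ = self.local_lstm(x)        # (batch, seq_len, hidden)
        _, (global_state, _) = self.global_lstm(x)  # final hidden state: (1, batch, hidden)
        # Broadcast the global summary across all positions and score each local state.
        g = global_state[-1].unsqueeze(1).expand_as(local_states)
        scores = self.attn(torch.cat([local_states, g], dim=-1))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)
        context = (weights * local_states).sum(dim=1)  # attention-weighted sum of local states
        return self.classifier(context)

# Usage with made-up sizes: 4 sentences of 30 token ids, binary sentiment.
model = GlobalLocalAttention(vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=2)
logits = model(torch.randint(0, 10000, (4, 30)))    # -> (4, 2)
```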