RCapsNet: A Recurrent Capsule Network for Text Classification

Published: 01 Jan 2020, Last Modified: 13 May 2023 · IJCNN 2020
Abstract: In this paper, we propose RCapsNet, a recurrent capsule network for text classification. Although a variety of neural networks have been proposed recently, existing models are based mainly on RNNs or CNNs, whose structures are rather limited in encoding temporal features. In addition, most of these models require prior linguistic knowledge to be integrated into them, which is impractical for non-linguists to handcraft. To address these issues of temporal relational variability in text classification, we present RCapsNet, which employs a hierarchy of recurrent structure-based capsules. It consists of two components: a recurrent module that serves as the backbone of RCapsNet, and a reconstruction module designed to enhance the generalization capability of the model. Empirical evaluations on four benchmark datasets demonstrate the competitiveness of RCapsNet. In particular, we show that prior linguistic knowledge is dispensable for training our model.
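
The abstract gives no implementation details, so the following PyTorch sketch is only one plausible realization of the two-module design it describes: a recurrent backbone whose time steps are turned into capsules, and a reconstruction module used as a regularizer. The encoder choice (BiGRU), the routing scheme, all layer sizes, and the reconstruction target are assumptions, not the paper's actual configuration.

```python
# Minimal sketch of a recurrent-capsule text classifier. All names and
# hyperparameters here are illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Capsule squashing nonlinearity: keeps direction, maps norm into [0, 1)."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)


class CapsuleLayer(nn.Module):
    """Routes in_caps input capsules of size in_dim to out_caps output capsules."""

    def __init__(self, in_caps, in_dim, out_caps, out_dim, iters=3):
        super().__init__()
        self.iters = iters
        self.W = nn.Parameter(0.01 * torch.randn(out_caps, in_caps, out_dim, in_dim))

    def forward(self, u):                                   # u: (B, in_caps, in_dim)
        u_hat = torch.einsum('oidh,bih->boid', self.W, u)   # prediction vectors
        b = torch.zeros(u.size(0), self.W.size(0), self.W.size(1), device=u.device)
        for _ in range(self.iters):                         # agreement-based dynamic routing
            c = F.softmax(b, dim=1)                         # coupling coefficients
            v = squash((c.unsqueeze(-1) * u_hat).sum(dim=2))
            b = b + (u_hat * v.unsqueeze(2)).sum(dim=-1)
        return v                                            # (B, out_caps, out_dim)


class RCapsNetSketch(nn.Module):
    def __init__(self, vocab_size, n_classes, emb_dim=128, hid=128,
                 caps_dim=16, max_len=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Recurrent module: encodes temporal features; each time step becomes a primary capsule.
        self.rnn = nn.GRU(emb_dim, hid, batch_first=True, bidirectional=True)
        self.primary = nn.Linear(2 * hid, caps_dim)
        self.class_caps = CapsuleLayer(max_len, caps_dim, n_classes, caps_dim)
        # Reconstruction module: rebuilds the mean token embedding from the winning
        # class capsule, acting as a regularizer to improve generalization.
        self.decoder = nn.Sequential(
            nn.Linear(caps_dim, hid), nn.ReLU(), nn.Linear(hid, emb_dim))

    def forward(self, tokens):                  # tokens: (B, max_len) padded token ids
        e = self.embed(tokens)
        h, _ = self.rnn(e)                      # (B, max_len, 2*hid)
        u = squash(self.primary(h))             # primary capsules, one per time step
        v = self.class_caps(u)                  # class capsules: (B, n_classes, caps_dim)
        logits = v.norm(dim=-1)                 # capsule length ~ class score
        best = v[torch.arange(v.size(0)), logits.argmax(dim=1)]
        recon = self.decoder(best)              # reconstruction from the winning capsule
        return logits, recon, e.mean(dim=1)     # reconstruction target: mean embedding
```

A training loop under these assumptions would combine a classification loss on `logits` (e.g. margin or cross-entropy) with a small reconstruction loss such as `F.mse_loss(recon, target)`.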