Abstract: The recently introduced Paragraph Vector is an efficient method for learning high-quality distributed representations of pieces of text. However, an inherent limitation of Paragraph Vector is its inability to infer distributed representations for texts outside of the training set. To tackle this problem, we introduce the Generative Paragraph Vector, which can be viewed as a probabilistic extension of the Distributed Bag of Words version of Paragraph Vector with a complete generative process. With the ability to infer distributed representations for unseen texts, we can further incorporate text labels into the model and turn it into a supervised version, namely the Supervised Generative Paragraph Vector. In this way, we can leverage the labels paired with the texts to guide representation learning, and employ the learned model directly for prediction tasks. Experiments on five text classification benchmark collections show that both model architectures yield classification performance superior to their state-of-the-art counterparts.
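To make the abstract's "complete generative process" concrete, below is a minimal sketch of a PV-DBOW-style generative story: a paragraph vector is drawn from a prior, and each word of the text is then generated independently from a softmax over the vocabulary. The Gaussian prior, the dimensions, and all variable names here are illustrative assumptions, not the paper's exact notation or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, dim, text_len = 1000, 50, 20

# Word embedding matrix: one row per vocabulary word (assumed initialization).
W = rng.normal(scale=0.1, size=(vocab_size, dim))

# Step 1: draw a paragraph vector d from an isotropic Gaussian prior.
d = rng.normal(scale=1.0, size=dim)

# Step 2: generate each word independently from softmax(W @ d),
# i.e., a bag-of-words emission conditioned on the paragraph vector.
logits = W @ d
probs = np.exp(logits - logits.max())  # subtract max for numerical stability
probs /= probs.sum()
words = rng.choice(vocab_size, size=text_len, p=probs)

print(words)  # indices of the generated words
```

Because the process is fully generative, a representation for an unseen text can in principle be inferred by inverting this story (e.g., maximizing the likelihood of the observed words with respect to d), which is what distinguishes the approach from the original Paragraph Vector.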
TL;DR: With a complete generative process, our models are able to infer vector representations as well as labels for unseen texts.
Conflicts: ict.ac.cn
Keywords: Natural language processing, Deep learning, Unsupervised Learning, Supervised Learning