Multi-source Multi-view Transfer Learning in Neural Topic Modeling with Pretrained Topic and Word Embeddings
TL;DR: Transfer learning in neural topic modeling using pretrained word and topic embeddings, jointly and from one or many sources, to improve the quality of topics and document representations in sparse-data settings
Abstract: Though word embeddings and topics are complementary representations, several
past works have used only pretrained word embeddings in (neural) topic modeling
to address the data sparsity problem in short texts or small collections of documents.
However, no prior work has employed (pretrained latent) topics in a transfer learning
paradigm. In this paper, we propose a framework to perform transfer learning
in neural topic modeling using (1) pretrained (latent) topics obtained from a large
source corpus, and (2) pretrained word and topic embeddings jointly (i.e., multi-view),
in order to improve topic quality and better address polysemy and data sparsity
issues in a target corpus. In doing so, we first accumulate topics and word representations
from one or many source corpora to build pools of pretrained
topics (i.e., TopicPool) and word embeddings (i.e., WordPool). Then, we identify
one or more relevant source domains and take advantage of the corresponding
topics and word features via the respective pools to guide meaningful learning
in the sparse target domain. We quantify the quality of topic and document representations
via generalization (perplexity), interpretability (topic coherence), and
information retrieval (IR) on short-text, long-text, small, and large document
collections from the news and medical domains. We demonstrate state-of-the-art
results in topic modeling with the proposed transfer learning approaches.
Code: https://drive.google.com/drive/folders/1tBFEfEapcCYw6JZ6Gi-KtPrJwH6D2KZK
Keywords: Neural Topic Modeling, Transfer Learning, Unsupervised learning, Natural Language Processing
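The abstract describes the framework only at a high level. Below is a minimal, hypothetical sketch (not the authors' released code, which is linked above) of how pools of pretrained word embeddings (WordPool) and pretrained topic-word matrices (TopicPool) from multiple sources might jointly guide a simple bag-of-words neural topic model: the word view enters the encoder through fixed source embeddings and learned projections, while the topic view regularizes the decoder's topic-word matrix toward the source topics, with per-source relevance weights. Names such as TransferNTM, word_pool, topic_pool, and lambdas are illustrative assumptions, and the sketch assumes, for simplicity, that source and target share a vocabulary and number of topics.

# Hypothetical sketch, not the authors' released implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransferNTM(nn.Module):
    def __init__(self, vocab_size, num_topics, word_pool, topic_pool, lambdas):
        """
        word_pool : list of fixed (vocab_size x emb_dim) pretrained word-embedding
                    tensors, one per source corpus (the "WordPool")
        topic_pool: list of fixed (num_topics x vocab_size) pretrained topic-word
                    matrices, one per source corpus (the "TopicPool")
        lambdas   : per-source relevance weights (assumed given or tuned on held-out data)
        """
        super().__init__()
        self.encoder = nn.Linear(vocab_size, num_topics)      # document -> topic space
        self.decoder = nn.Linear(num_topics, vocab_size)      # topic -> word logits
        self.word_pool = [E.detach() for E in word_pool]      # fixed source word embeddings
        self.topic_pool = [T.detach() for T in topic_pool]    # fixed source topic matrices
        self.lambdas = lambdas
        # Learned projections from each source's embedding space into the topic space
        # (the word view of the transfer: word features enter the encoder).
        self.proj = nn.ModuleList(
            [nn.Linear(E.shape[1], num_topics, bias=False) for E in word_pool]
        )

    def forward(self, bow):
        h = self.encoder(bow)
        for lam, E, proj in zip(self.lambdas, self.word_pool, self.proj):
            h = h + lam * proj(bow @ E)                        # add word-view evidence
        theta = torch.softmax(h, dim=-1)                       # topic proportions
        return self.decoder(theta), theta

    def transfer_penalty(self):
        # Multi-source topic transfer: pull the target topic-word matrix toward each
        # relevant source's pretrained topics, weighted by source relevance.
        topic_matrix = self.decoder.weight.t()                 # (num_topics x vocab_size)
        return sum(lam * F.mse_loss(topic_matrix, T)
                   for lam, T in zip(self.lambdas, self.topic_pool))

def train_step(model, bow_batch, optimizer, gamma=0.1):
    """One step: bag-of-words reconstruction loss plus the transfer penalty."""
    optimizer.zero_grad()
    logits, _ = model(bow_batch)
    recon = -(bow_batch * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
    loss = recon + gamma * model.transfer_penalty()
    loss.backward()
    optimizer.step()
    return loss.item()

In a real pipeline, the per-source weights would be used to select the relevant source domain(s), and vocabulary alignment between source and target corpora would be handled before building the pools; those steps are omitted here for brevity.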