Document Context Language Models

29 Nov 2024 (modified: 18 Feb 2016) · ICLR 2016 · Readers: Everyone
CMT Id: 213
Abstract: Text documents are structured on multiple levels of detail: individual words are related by syntax, and larger units of text are related by discourse structure. Existing language models generally fail to account for discourse structure, but discourse structure is crucial if language models are to reward coherence and generate coherent texts. We present and empirically evaluate a set of multi-level recurrent neural network language models, called Document-Context Language Models (DCLMs), which incorporate contextual information both within and beyond the sentence. In comparison with sentence-level recurrent neural network language models, the DCLMs obtain slightly better predictive likelihoods, and considerably better assessments of document coherence.
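The core idea of the abstract can be sketched in code: a recurrent language model over each sentence that also receives a document-context vector, here taken to be the final hidden state of the previous sentence, so information flows across sentence boundaries. This is a minimal illustrative sketch; the toy dimensions, the specific context-to-hidden term, and all parameter names are assumptions, not the paper's exact parameterization.

```python
import numpy as np

# Toy sketch of a document-context RNN LM: a plain Elman RNN over word
# ids, with an extra term that injects a context vector from the
# preceding sentence into every hidden-state update. Sizes are toy
# values chosen for illustration.

rng = np.random.default_rng(0)
V, H = 10, 8  # vocabulary size, hidden size (illustrative)

W_xh = rng.normal(scale=0.1, size=(H, V))  # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(H, H))  # hidden-to-hidden
W_ch = rng.normal(scale=0.1, size=(H, H))  # context-to-hidden (the cross-sentence link)
W_hy = rng.normal(scale=0.1, size=(V, H))  # hidden-to-output

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def run_sentence(word_ids, context):
    """Score one sentence under the RNN, injecting the document-context
    vector at every step; return total log-likelihood and final state."""
    h = np.zeros(H)
    logp = 0.0
    for prev, nxt in zip(word_ids[:-1], word_ids[1:]):
        h = np.tanh(W_xh @ one_hot(prev) + W_hh @ h + W_ch @ context)
        logits = W_hy @ h
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        logp += np.log(probs[nxt])
    return logp, h

# A document is a list of sentences; each sentence's context is the
# previous sentence's final hidden state, so coherence between adjacent
# sentences can affect the model's likelihood.
doc = [[1, 4, 2, 0], [1, 3, 5, 0]]
context = np.zeros(H)
total_logp = 0.0
for sent in doc:
    lp, context = run_sentence(sent, context)
    total_logp += lp
print(total_logp)
```

A sentence-level baseline would reset `context` to zeros for every sentence; the difference in log-likelihood between the two settings is one way to probe whether cross-sentence context helps, in the spirit of the comparison the abstract describes.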
Conflicts: gatech.edu, unimelb.edu.au, cs.cmu.edu