Abstract: Generating informative and coherent natural language is an important task. Previous studies mainly focus on incorporating commonsense knowledge into generative models, which improves the informativeness of the generated text but pays little attention to discourse coherence. We instead propose to exploit event chains to improve the coherence of text generation. In addition, we devise an inductive encoding module that reduces the sparsity of the introduced event chains and learns useful event-evolution patterns. Specifically, we first extract event chains from the input text and connect them into a graph. An inductive graph encoding module then learns inductive, generalized event embeddings. An event reasoning flow module follows and produces the event sketch, i.e., the plausible events conditioned on the input text. Finally, we generate the output text based on the input context and the event sketch. Experimental results demonstrate the effectiveness of this framework in terms of both the coherence and the informativeness of the generated text.
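The pipeline described in the abstract can be illustrated with a minimal toy sketch. Everything below is an assumption for illustration only: the function names (`extract_event_chains`, `build_event_graph`, etc.) are hypothetical, and each stage is a trivial stand-in for what the paper implements as a learned neural module.

```python
# Hypothetical sketch of the described pipeline: extract event chains,
# connect them as a graph, encode events, reason out an event sketch,
# and generate text. All names and logic here are illustrative stand-ins.

def extract_event_chains(text):
    # Toy stand-in: take the verb (second word) of each sentence as its event.
    return [s.strip().split()[1] for s in text.split(".") if s.strip()]

def build_event_graph(chain):
    # Connect consecutive events with directed edges.
    return list(zip(chain, chain[1:]))

def encode_events(graph):
    # Placeholder for the inductive graph encoding module:
    # map each distinct event to an integer id instead of a learned embedding.
    events = {e for edge in graph for e in edge}
    return {e: i for i, e in enumerate(sorted(events))}

def reason_event_sketch(graph, embeddings):
    # Placeholder for the event reasoning flow module:
    # walk the edges in order to produce the event sketch.
    if not graph:
        return []
    return [src for src, _ in graph] + [graph[-1][1]]

def generate_text(context, sketch):
    # Toy surface realization conditioned on context and event sketch.
    return context + " -> " + " -> ".join(sketch)

story = "She entered the cafe. She ordered coffee. She paid the bill."
chain = extract_event_chains(story)
graph = build_event_graph(chain)
embeddings = encode_events(graph)
sketch = reason_event_sketch(graph, embeddings)
print(generate_text("Input context", sketch))
# -> Input context -> entered -> ordered -> paid
```

The sketch only mirrors the data flow between the modules; the actual framework would replace the integer ids with learned inductive embeddings and the rule-based steps with trained models.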