Logic Pre-Training of Language Models

Published: 28 Jan 2022, Last Modified: 13 Feb 2023
Venue: ICLR 2022 Submitted
Readers: Everyone
Keywords: Language Models, Pre-training, Logical Reasoning, Natural Language Understanding
Abstract: Pre-trained language models (PrLMs) have been shown to be useful for enhancing a broad range of natural language understanding (NLU) tasks. However, capturing logical relations in challenging NLU remains a bottleneck even for state-of-the-art PrLMs, which greatly limits their reasoning abilities. We therefore propose logic pre-training of language models, yielding \textsc{Prophet}, a PrLM equipped with logical reasoning ability. To ground logic pre-training in a clear, accurate, and generalizable knowledge basis, we introduce the \textit{fact} in place of the plain language units used in previous PrLMs. Facts are extracted through syntactic parsing, avoiding unnecessarily complex knowledge injection, and they allow logic-aware models to be trained on more general text. To explicitly guide the PrLM to capture logical relations, we introduce three pre-training objectives: 1) logical connectives masking, to capture sentence-level logic; 2) logical structure completion, to accurately recover facts from the original context; and 3) logical path prediction on a logical graph, to uncover global logical relationships among facts. We evaluate our model on a broad range of NLP and NLU tasks, including natural language inference, relation extraction, and machine reading comprehension with logical reasoning. Results show that the extracted facts and the newly introduced pre-training tasks help \textsc{Prophet} achieve significant improvements on all downstream tasks, especially those involving logical reasoning.
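As a rough illustration of the first objective, the sketch below masks tokens drawn from a hand-picked list of logical connectives to form MLM-style targets. The connective list, masking probability, and helper names are illustrative assumptions for this sketch, not the paper's exact implementation.

```python
# Minimal sketch of a "logical connectives masking" objective, assuming a
# BERT-style [MASK] token and a hand-picked connective list (both are
# illustrative choices, not necessarily the paper's exact setup).
import random

# Hypothetical set of sentence-level logical connectives to mask.
CONNECTIVES = {"because", "therefore", "however", "if", "unless",
               "although", "thus", "hence", "so", "but"}

MASK_TOKEN = "[MASK]"

def mask_logical_connectives(tokens, mask_prob=0.8):
    """Replace logical connectives with [MASK] and record MLM labels.

    Returns (masked_tokens, labels), where labels[i] holds the original
    token at masked positions and None elsewhere, mirroring a standard
    MLM loss restricted to connective positions.
    """
    masked, labels = [], []
    for tok in tokens:
        if tok.lower() in CONNECTIVES and random.random() < mask_prob:
            masked.append(MASK_TOKEN)
            labels.append(tok)
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels

if __name__ == "__main__":
    text = "The roads are wet because it rained , therefore the match was cancelled ."
    masked, labels = mask_logical_connectives(text.split())
    print(" ".join(masked))
    print([l for l in labels if l is not None])
```

In this reading, the model is trained to predict the masked connective from its context, which encourages it to represent the logical relation (cause, contrast, condition) that the connective signals between the surrounding clauses.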
One-sentence Summary: We propose logic pre-training of language models to capture logical relations directly, given the fundamental role PrLMs play in NLP and NLU tasks.
Supplementary Material: zip
5 Replies
