CoSe-Co: Text Conditioned Generative CommonSense Contextualizer

29 Sept 2021 (modified: 22 Oct 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Language Model, Commonsense, Knowledge Graph, Task Agnostic, Novel Sentence-to-Path Dataset
Abstract: Pre-trained Language Models (PTLMs) have been shown to perform well on natural language tasks. Many prior works have attempted to leverage the structured commonsense present in Knowledge Graphs (KGs), where entities are linked through labeled relations, to assist PTLMs. Retrieval approaches use the KG as a separate static module, which limits coverage since KGs contain finite knowledge. Generative methods train PTLMs on KG triples to scale the knowledge. However, training on symbolic KG entities limits their applicability to tasks involving natural language text, where the overall context is ignored. To mitigate this, we propose a task-agnostic CommonSense Contextualizer (CoSe-Co) conditioned on sentences as input, making it generically usable in NLP tasks for generating contextually relevant knowledge in the form of KG paths. To train CoSe-Co, we propose a novel dataset comprising sentence and commonsense path pairs. The knowledge paths inferred by CoSe-Co are diverse, relevant and contain novel entities not present in the underlying KG. Additionally, we show that CoSe-Co can be used for KG completion. Augmenting the generated knowledge in Multi-Choice QA and Open-ended CommonSense Reasoning tasks leads to improvements over current best methods (up to ~3% and ~7% respectively) on the CSQA, ARC, QASC and OBQA datasets. Further, improved performance in low training-data regimes shows that CoSe-Co's knowledge helps in generalising better.
One-sentence Summary: A sentence-conditioned, LM-based generative commonsense contextualiser trained to generate commonsense KG paths for a given input sentence.
Supplementary Material: zip
Community Implementations: [4 code implementations](https://www.catalyzex.com/paper/arxiv:2206.05706/code)