Amplifying commonsense knowledge via bi-directional relation integrated graph-based contrastive pre-training from large language models

Published: 01 Jan 2025 | Last Modified: 19 Feb 2025 | Inf. Process. Manag. 2025 | CC BY-SA 4.0
Abstract: Highlights

- The first work to acquire extensive high-quality commonsense knowledge from LLMs.
- Sentence-level forward and reverse relation enhanced contrastive learning.
- Reverse-relation enhanced bidirectional learning to mitigate the reversal curse.
- Producing extensive high-quality commonsense knowledge to expand current KBs.