Amplifying commonsense knowledge via bi-directional relation integrated graph-based contrastive pre-training from large language models
Abstract: Highlights
• The first work to acquire extensive high-quality commonsense knowledge from LLMs.
• Sentence-level contrastive learning enhanced with forward and reverse relations.
• Reverse-relation-enhanced bidirectional learning to mitigate the reversal curse.
• Produces extensive high-quality commonsense knowledge to expand current KBs.