Abstract: We propose SUbgraph Retrieval-augmented GEneration (SURGE), a framework for generating knowledge-consistent and context-relevant dialogues grounded in a knowledge graph (KG).
First, our method retrieves a context-relevant subgraph from the KG, and then enforces consistency across facts by perturbing their word embeddings conditioned on the retrieved subgraph.
Then, it learns a latent representation space using graph-text multi-modal contrastive learning, which ensures that the generated texts have high similarity to the retrieved subgraphs. We validate our SURGE framework on the OpendialKG dataset and show that it generates high-quality dialogues that faithfully reflect the knowledge from the KG.
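The abstract does not spell out the exact form of the graph-text contrastive objective, but a common instantiation is a symmetric InfoNCE-style loss over pooled subgraph and generated-text embeddings. Below is a minimal, hedged sketch under that assumption; the function name, batch layout, and dimensions are illustrative and not taken from the paper.

```python
import torch
import torch.nn.functional as F


def graph_text_contrastive_loss(graph_emb: torch.Tensor,
                                text_emb: torch.Tensor,
                                temperature: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE-style sketch of a graph-text contrastive loss.

    Each generated response is pulled toward its own retrieved subgraph
    and pushed away from the other subgraphs in the batch.

    graph_emb: (B, d) pooled embeddings of retrieved subgraphs (illustrative).
    text_emb:  (B, d) pooled embeddings of generated responses (illustrative).
    """
    g = F.normalize(graph_emb, dim=-1)
    t = F.normalize(text_emb, dim=-1)
    logits = g @ t.T / temperature                      # (B, B) similarity logits
    targets = torch.arange(g.size(0), device=g.device)  # diagonal pairs are positives
    loss_g2t = F.cross_entropy(logits, targets)
    loss_t2g = F.cross_entropy(logits.T, targets)
    return 0.5 * (loss_g2t + loss_t2g)


# Usage sketch with random features standing in for real encoder outputs.
if __name__ == "__main__":
    graph_emb = torch.randn(8, 256)
    text_emb = torch.randn(8, 256)
    print(graph_text_contrastive_loss(graph_emb, text_emb).item())
```

In practice the graph and text embeddings would come from the model's subgraph encoder and decoder representations; this sketch only shows how such a loss would encourage the generated text to stay close to the retrieved subgraph in the shared latent space.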