K-Edit: Language Model Editing with Contextual Knowledge Awareness

Published: 13 Jan 2025 · Last Modified: 26 Feb 2025 · AAAI 2025 PDLM Oral · CC BY 4.0
Keywords: LLM, Model Editing, Question Answering, Knowledge Graph
TL;DR: We add contextual information from knowledge graphs to improve the quality of model editing techniques
Abstract: As the world changes, we need to be able to update our models and correct false information without costly retraining. Knowledge-based model editing enables precise modifications to the weights of large language models in order to update the information they encode. Recent approaches have succeeded in recalling edited information for thousands of edits at once; however, they fail to produce edits that account for associated contextual information. We present K-Edit, an effective approach to generating contextually consistent knowledge edits. By using knowledge graphs, which maintain contextual consistency when an edge is edited, we generate additional contextual edits that ensure consistency of related information in the language model. Our experiments demonstrate significant improvements in multi-hop question answering while maintaining the general effectiveness and scalability of model edits.
Submission Number: 34
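The abstract describes deriving extra "contextual" edits from a knowledge graph after a primary edge edit. The following is a minimal sketch of that idea, assuming a toy triple-store knowledge graph and a hypothetical `Edit` record; the data structures, relation names, and hop-expansion strategy are illustrative assumptions, not the paper's actual method or API.

```python
# Hypothetical sketch: generating contextual edits from a knowledge graph.
# The KG format, relation names, and the Edit structure are illustrative
# assumptions, not the paper's actual data structures.

from dataclasses import dataclass

@dataclass
class Edit:
    subject: str
    relation: str
    new_object: str

# Toy knowledge graph as a set of (subject, relation, object) triples.
KG = {
    ("LeBron James", "plays_for", "Cleveland Cavaliers"),
    ("Los Angeles Lakers", "located_in", "Los Angeles"),
    ("Los Angeles Lakers", "arena", "Crypto.com Arena"),
}

def contextual_edits(primary: Edit, kg, max_hops: int = 1):
    """Given a primary edit (s, r, o_old -> o_new), gather facts about the
    new object from the knowledge graph and emit them as additional edits,
    so information related to the edited fact stays consistent."""
    edits = [primary]
    frontier = {primary.new_object}
    for _ in range(max_hops):
        next_frontier = set()
        for (s, r, o) in kg:
            if s in frontier:
                # Re-assert neighbouring facts in edit form so the model's
                # knowledge of the new object's context is reinforced.
                edits.append(Edit(subject=s, relation=r, new_object=o))
                next_frontier.add(o)
        frontier = next_frontier
    return edits

if __name__ == "__main__":
    primary = Edit("LeBron James", "plays_for", "Los Angeles Lakers")
    for e in contextual_edits(primary, KG):
        print(f"{e.subject} --{e.relation}--> {e.new_object}")
```

In this reading, the primary edit rewires one edge, and the one-hop neighbourhood of the new object supplies the additional edits that keep related facts (e.g., the new team's city and arena) consistent, which is what the abstract credits for the multi-hop question-answering gains.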