Abstract: Recent knowledge editing methods have predominantly focused on modifying structured triplet knowledge within large language models (LLMs). Compared with triplet-based knowledge, unstructured knowledge contains richer and more interrelated information, which makes editing more difficult. When relying solely on parameter-based editing methods, semantically overlapping pieces of knowledge may interfere with one another. Although previous studies have shown that directly applying in-context editing to unstructured knowledge yields better results than parameter-based approaches, there is still considerable room for improvement.
Previous studies have found that LLMs are highly sensitive to the order of information in long texts; even the core content of a text may be obscured by positional effects. This suggests that if unstructured facts are rewritten, for example so that their core content is stated prominently, LLMs can process and utilize the rewritten facts more effectively than the originals. Inspired by this observation, we propose EIKEA (Enhancing In-Context Knowledge Editing by Agents), a novel method that combines a rewriting agent with IKE (In-Context Knowledge Editing), enabling language models to effectively internalize unstructured factual updates without modifying model parameters. We conduct comprehensive experiments on the WIKIUPDATE subset of the AKEW benchmark, demonstrating that our method significantly improves editing accuracy over baseline IKE and parameter-based editing methods. Our method provides a practical, lightweight, and scalable solution for unstructured knowledge editing.
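To make the two-stage pipeline concrete, the following is a minimal sketch of an EIKEA-style flow. The `llm` stub and both prompt templates are illustrative assumptions, not the paper's actual prompts or agent design:

```python
# Minimal sketch of an EIKEA-style pipeline. All names and prompts here are
# hypothetical placeholders; the paper's actual agent design is not reproduced.

def llm(prompt: str) -> str:
    """Placeholder for any chat-completion call (API or local model)."""
    raise NotImplementedError

REWRITE_PROMPT = (
    "Rewrite the following passage so that its key facts appear first, "
    "stated clearly and self-contained:\n\n{fact}"
)

EDIT_PROMPT = (
    "New fact: {fact}\n\n"
    "Answer the question using the new fact above, even if it conflicts "
    "with what you previously believed.\n\n"
    "Question: {question}\nAnswer:"
)

def eikea_answer(unstructured_fact: str, question: str) -> str:
    # Stage 1: a rewriting agent restructures the unstructured fact so its
    # core content is not buried by its position in a long passage.
    rewritten = llm(REWRITE_PROMPT.format(fact=unstructured_fact))
    # Stage 2: in-context knowledge editing -- the rewritten fact is prepended
    # to the query; no model parameters are modified.
    return llm(EDIT_PROMPT.format(fact=rewritten, question=question))
```

Because both stages operate purely through prompting, the approach requires no gradient updates and can be applied to any instruction-following model.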
Submission Number: 245