LEDGER: Scaling Agentic Document Editing with Dependency-aware Graph Retrieval

ACL ARR 2026 January Submission 6493 Authors

05 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: AI Agent, Context Management, Graph, Document Editing
Abstract: Large language models increasingly power AI agents for tasks requiring iterative refinement: document editing demands targeted revisions while preserving cross-references, code refactoring requires tracking function dependencies, and knowledge base updates cascade through related entities. Iterative editing with AI agents faces a fundamental efficiency-consistency tradeoff: maintaining consistency requires full-context awareness of dependencies, but processing entire documents for each edit incurs prohibitive token costs and latency. Isolated edits improve efficiency but risk breaking cross-references and violating semantic constraints. We introduce LEDGER (scaLing Agentic document editing with Dependency-aware Graph rEtRieval), a framework that constructs lightweight dependency graphs capturing semantic relationships and structural hierarchies across document elements. For each edit, graph traversal identifies the affected elements and retrieves only the necessary context. Experiments across 1,900 test cases spanning six state-of-the-art models show that LEDGER achieves 76% consistency versus 56% for the baseline while reducing token usage by 85%. Critically, LEDGER with low reasoning effort matches baseline performance at high reasoning effort while using 70% fewer tokens, suggesting that explicit dependency representations can substitute for expensive internal reasoning, with implications for agentic systems operating on structured data.
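The abstract's core mechanism (a dependency graph over document elements, traversed per edit to collect only the affected context) can be illustrated with a minimal sketch. This is not the paper's implementation; the graph shape, the `affected_elements` helper, the `max_hops` bound, and the element names are all hypothetical, chosen only to show the retrieval idea.

```python
from collections import deque

# Hypothetical sketch: nodes are document elements; an edge u -> v means
# element v depends on u (e.g. v cross-references u), so editing u may
# require revising v.
def affected_elements(graph, edited, max_hops=2):
    """Return the edited element plus everything reachable from it
    within max_hops dependency edges (breadth-first traversal)."""
    seen = {edited}
    frontier = deque([(edited, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # bound the retrieved context
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen

# Example: editing "sec2" pulls in the table that cites it and the
# summary that cites that table, but leaves the unrelated branch out,
# so the agent's prompt carries only this subset of the document.
doc_graph = {
    "sec2": ["tab1"],       # Table 1 references Section 2
    "tab1": ["summary"],    # the summary cites Table 1
    "sec3": ["appendixA"],  # unrelated branch, never retrieved
}
context = affected_elements(doc_graph, "sec2")
```

Bounding the traversal depth is one plausible way to realize the token savings the abstract reports: context size scales with the local dependency neighborhood rather than the full document.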
Paper Type: Long
Research Area: AI/LLM Agents
Research Area Keywords: Agentic Systems, Document Editing, Graph-Based Memory, Dependency Tracking, Context Management
Contribution Types: NLP engineering experiment, Approaches to low-compute settings / efficiency, Publicly available software and/or pre-trained models, Data analysis
Languages Studied: English
Submission Number: 6493