CtxWF: Context Window Focus with Global Management for LLM Agents in Multi-Document Workspaces

ACL ARR 2025 February Submission 7042 Authors

16 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: LLM-based agent systems have achieved remarkable progress in automatically solving natural language processing tasks. However, real-world tasks often involve working within a multi-file workspace and exploratory implementation of specific objectives, requiring LLMs to acquire, process, and manage substantial information from workspace data sources. Because of the limited attention span of LLMs, excessive or disorganized information can distract reasoning from core objectives and ultimately yield suboptimal outputs. To enhance LLMs' capability to handle complex real-world tasks, and inspired by human problem-solving strategies, we propose CtxWF, a context-window-focused agent that resolves long-term complex tasks through global context management and concentrated execution of short-term sub-tasks. CtxWF features three key innovations: (1) proactive acquisition of essential contextual information before task resolution; (2) single-responsibility specialization of LLM reasoning to reduce context window requirements; (3) refinement of environmental feedback into context updates after each short-term sub-task, improving information quality. We showcase the effectiveness of CtxWF on agent-based data science tasks, where it achieves state-of-the-art accuracy across multiple models. The GPT-4o-powered CtxWF attains an accuracy of 42.26%, a 10.01% improvement over baseline methods.
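The abstract's three mechanisms can be illustrated with a minimal sketch. All class and function names below are hypothetical (the paper's actual implementation is not shown here), and the LLM call is stubbed with a placeholder string:

```python
# Hypothetical sketch of the CtxWF loop from the abstract; names are
# illustrative, not from the paper, and the LLM call is a stub.

class GlobalContext:
    """Holds the long-term task context shared across sub-tasks."""

    def __init__(self, objective):
        self.objective = objective
        self.facts = []  # essential workspace info acquired up front

    def acquire(self, workspace):
        # (1) Proactive acquisition: summarize workspace sources
        # into the global context before solving the task.
        self.facts.extend(f"{name}: {summary}" for name, summary in workspace.items())

    def refine(self, feedback):
        # (3) Refinement: compress environmental feedback before it
        # enters the context, keeping information quality high.
        self.facts.append(feedback.strip()[:200])


def run_subtask(subtask, context):
    # (2) Single-responsibility reasoning: the prompt for one sub-task
    # carries only the objective, curated facts, and that sub-task,
    # keeping the context window small and focused.
    prompt = (
        f"Objective: {context.objective}\n"
        f"Facts: {context.facts}\n"
        f"Do: {subtask}"
    )
    return f"done({subtask})"  # placeholder for an LLM response


def ctxwf(objective, workspace, subtasks):
    ctx = GlobalContext(objective)
    ctx.acquire(workspace)
    results = []
    for st in subtasks:
        out = run_subtask(st, ctx)
        ctx.refine(out)  # feedback flows back into the global context
        results.append(out)
    return results
```

A short-term sub-task thus never sees raw workspace dumps, only the refined global context plus its own instruction, which is the separation the abstract describes.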
Paper Type: Long
Research Area: Language Modeling
Research Area Keywords: LLM-based agent, Machine Learning for NLP, Generation
Contribution Types: NLP engineering experiment, Publicly available software and/or pre-trained models, Data analysis
Languages Studied: N/A
Submission Number: 7042