MAGEO: Memory-Augmented Multi-Agent Generative Engine Optimization

ACL ARR 2026 January Submission5507 Authors

05 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: LLM, GEO, Multi-Agent
Abstract: Generative Engines (GEs) are reshaping digital ecosystems by transitioning from ranked links to citation-grounded generation. Mirroring the evolution of semantic search, this shift motivates us to ask: can creators systematically optimize content influence while ensuring attribution fidelity in black-box engines? In this work, we explore the Generative Engine Optimization (GEO) paradigm and introduce MSME GEO Bench—a comprehensive benchmark grounded in real-world queries. We also propose MAGEO, a memory-augmented optimizer that refines content via collaborative agents and cross-instance memory. To ensure rigorous assessment, we introduce a Twin Branch Evaluation Protocol to isolate causal impacts and a dual-axis metric, DSV-CF, to penalize misattribution. Empirical results show that MAGEO significantly enhances visibility and citation accuracy across mainstream engines. These findings establish a path toward transparent and trustworthy creator-traffic ecosystems through systematic GEO. Our source code is available at https://anonymous.4open.science/r/MAGEO-3B90.
Paper Type: Long
Research Area: Retrieval-Augmented Language Models
Research Area Keywords: retrieval-augmented generation, LLM/AI agents
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Data resources
Languages Studied: English
Submission Number: 5507