CycleGen: Closing the Loop in Long-Form Text Generation via Loop-Aware Refinement

ACL ARR 2026 January Submission 835 Authors

25 Dec 2025 (modified: 20 Mar 2026) · CC BY 4.0
Keywords: Long-Form Text Generation; Loop-Aware Refinement; Cyclic Semantic Dependency; Data Synthesis; Progressive Alignment
Abstract: Long-form text generation is critical for applications ranging from creative writing to technical documentation, yet large language models (LLMs) struggle to maintain structural integrity and logical coherence over extended outputs. Existing linear generation paradigms inadequately capture cyclic semantic dependencies, in which subsequent content retroactively constrains preceding segments. To address this, we propose CycleGen, a framework that reformulates long-form text generation as a graph-based cyclic synthesis process. CycleGen comprises: (1) a data engine that constructs dependency graphs, applies a MinFAS-inspired algorithm to resolve logical cycles, and encodes backward constraints via closed-loop correction; and (2) a progressive alignment strategy that decouples optimization objectives through multi-stage training with Graph-Critical IPO, DPO, and SimPO. We further introduce CycleBench, a benchmark for evaluating long-range cyclic reasoning capabilities. Experiments on WritingBench, LongGenBench, and CycleBench demonstrate that CycleGen outperforms similarly sized models and rivals significantly larger ones. We have released our code and CycleBench at the anonymous repository: https://anonymous.4open.science/r/lucky0106-216F
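The abstract's data engine resolves logical cycles in a dependency graph with a MinFAS-inspired step (minimum feedback arc set: delete a small set of edges so the graph becomes acyclic). The paper does not specify the exact algorithm, so the sketch below uses a simple DFS back-edge heuristic purely as an illustration; the graph representation and function name are assumptions, not the authors' implementation.

```python
# Minimal sketch of feedback-arc-set-style cycle resolution on a
# directed dependency graph. Illustrative only: a DFS back-edge
# heuristic, not the paper's actual MinFAS-inspired algorithm.

def break_cycles(edges):
    """Remove DFS back edges so the remaining graph is acyclic.

    edges: list of (u, v) pairs meaning "segment v depends on u".
    Returns (kept_edges, removed_edges).
    """
    graph = {}
    for u, v in edges:
        graph.setdefault(u, []).append(v)
        graph.setdefault(v, [])

    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / on stack / done
    color = {n: WHITE for n in graph}
    removed = set()

    def dfs(u):
        color[u] = GRAY
        for v in graph[u]:
            if color[v] == GRAY:       # back edge: closes a cycle
                removed.add((u, v))
            elif color[v] == WHITE:
                dfs(v)
        color[u] = BLACK

    for n in graph:
        if color[n] == WHITE:
            dfs(n)

    kept = [e for e in edges if e not in removed]
    return kept, removed


# Example: a -> b -> c -> a forms a cycle; the back edge (c, a)
# is removed, leaving an acyclic ordering a -> b -> c -> d.
kept, removed = break_cycles([("a", "b"), ("b", "c"),
                              ("c", "a"), ("c", "d")])
```

A true MinFAS is NP-hard, so practical pipelines use heuristics like this one (or the Eades–Lin–Smyth greedy ordering) to approximate the minimum set of edges to cut.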
Paper Type: Long
Research Area: AI/LLM Agents
Research Area Keywords: fine-tuning, reinforcement learning, LLM agents
Contribution Types: NLP engineering experiment
Languages Studied: English, Chinese
Submission Number: 835