GenFlow: Constrained Long-Form Text Generation via Adaptive Workflow Optimization

17 Sept 2025 (modified: 12 Feb 2026) · ICLR 2026 Conference Desk Rejected Submission · CC BY 4.0
Keywords: LLM, Constrained Long-Form Text Generation, Agentic Workflow
TL;DR: GenFlow is an adaptive framework for constrained long-form text generation that outperforms GPT-4o-mini and CogWriter in constraint satisfaction, coherence, and quality.
Abstract: Large Language Models (LLMs) exhibit strong abilities in generating coherent human-like text, yet producing long-form content that satisfies complex constraints remains challenging. Existing approaches either extend generation length through large curated datasets, as in LongWriter, or structure outputs via cognitive-inspired hierarchical planning, as in CogWriter, but often struggle to balance coherence, semantic fidelity, and explicit requirements. In this work, we propose GenFlow, an adaptive framework for constrained long-form text generation. It decomposes writing objectives into constraint-aware sub-plans, uses adaptive decision-making and reward filtering to retain high-quality plans, and optimizes both local and global generation. By embedding constraints directly into the workflow, GenFlow ensures consistency while adapting to evolving requirements. Experimental results on the Qwen2.5 series demonstrate that GenFlow outperforms GPT-4o-mini and CogWriter baselines in constraint satisfaction, coherence, and overall quality.
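The abstract's pipeline (decompose into constraint-aware sub-plans, filter by reward, then generate locally and globally) can be sketched in miniature. This is an illustrative sketch only: the function names (`decompose`, `reward`, `genflow`), the toy reward rule, and the threshold are assumptions, not the paper's actual API or scoring method.

```python
# Hypothetical sketch of a GenFlow-style loop: decompose -> reward-filter -> generate.
# All names and the reward heuristic are illustrative assumptions, not the paper's method.

def decompose(objective, constraints):
    # Split a writing objective into one constraint-aware sub-plan per constraint.
    return [{"goal": objective, "constraint": c} for c in constraints]

def reward(plan):
    # Toy stand-in for the paper's reward model: treat longer, more specific
    # constraint strings as higher-value.
    return len(plan["constraint"])

def genflow(objective, constraints, threshold=5):
    plans = decompose(objective, constraints)
    # Reward filtering: retain only sub-plans scoring above the threshold.
    kept = [p for p in plans if reward(p) > threshold]
    # Local step: draft one section per retained sub-plan;
    # global step: stitch the sections into a single document.
    sections = [f"[{p['constraint']}] draft for: {p['goal']}" for p in kept]
    return "\n".join(sections)

out = genflow("write a report", ["min 2000 words", "formal tone", "APA"])
```

In this toy run the under-specified "APA" sub-plan is filtered out, while the two more specific constraints survive into the final draft; a real implementation would replace the stubs with LLM calls and a learned reward.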
Supplementary Material: zip
Primary Area: generative models
Submission Number: 9467