Dynamic Collaboration of Multi-Language Models based on Minimal Complete Semantic Units

ACL ARR 2025 May Submission 3174 Authors

19 May 2025 (modified: 03 Jul 2025)
License: CC BY 4.0
Abstract: This paper investigates the enhancement of reasoning capabilities in language models through token-level multi-model collaboration. Our approach selects the optimal token from the next-token distributions provided by multiple models at each step of autoregressive reasoning. Contrary to the common assumption that more models yield better results, we introduce a distribution distance-based dynamic selection strategy (DDS) to optimize the multi-model collaboration process. To address the critical challenge of vocabulary misalignment in multi-model collaboration, we propose the concept of minimal complete semantic units (MCSU), a simple notion that nevertheless enables multiple language models to align naturally within the linguistic space. Experimental results across various benchmarks demonstrate the superiority of our method. The code will be released soon.
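To make the abstract's token-level collaboration concrete, the following is a minimal sketch of a single decoding step under heavy assumptions: the models' vocabularies are already aligned (the MCSU mechanism is not reproduced here), and the function names (`js_divergence`, `dds_select`), the threshold `tau`, and the agree-then-defer / disagree-then-average rule are hypothetical stand-ins for the paper's actual DDS, which the abstract does not specify.

```python
import numpy as np

VOCAB = ["The", "answer", "is", "42", "."]  # toy vocabulary, assumed pre-aligned

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two next-token distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log(a / b)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def dds_select(dists, tau=0.05):
    """Hypothetical distance-based selection for one autoregressive step.

    If all pairwise divergences fall below `tau`, the models agree, so we
    defer to the single most confident distribution; otherwise we average
    the distributions so that dissenting models still influence the token.
    """
    n = len(dists)
    max_div = max(
        js_divergence(dists[i], dists[j])
        for i in range(n) for j in range(i + 1, n)
    )
    if max_div < tau:
        chosen = max(dists, key=lambda d: float(np.max(d)))
    else:
        chosen = np.mean(np.stack(dists), axis=0)
    return VOCAB[int(np.argmax(chosen))], max_div

# Two models' next-token distributions over the toy vocabulary.
model_a = np.array([0.05, 0.10, 0.15, 0.65, 0.05])
model_b = np.array([0.05, 0.05, 0.10, 0.70, 0.10])
token, div = dds_select([model_a, model_b])
print(f"selected token: {token!r} (max pairwise JS divergence: {div:.4f})")
```

In a full decoder this step would repeat for every generated token, with each model re-scored on the shared context; the sketch only illustrates the selection logic at one position.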
Paper Type: Long
Research Area: Language Modeling
Research Area Keywords: Generation, Language Modeling, Question Answering
Contribution Types: Model analysis & interpretability, NLP engineering experiment
Languages Studied: English, Chinese
Keywords: Large Language Model, Dynamic Collaboration, Minimal Complete Semantic Units
Submission Number: 3174