Entropy Variation and Information Competence: Enhancing Predictive Accuracy of Collaborative Language Models

ACL ARR 2024 June Submission38 Authors

04 Jun 2024 (modified: 07 Aug 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: This paper introduces EVINCE (Entropy Variation and INformation CompetencE), a dialogue framework that orchestrates adversarial debate and collaborative reasoning among multiple large language models (LLMs). By combining conditional statistics, information theory, and in-context learning, EVINCE balances the exploration of diverse perspectives with the exploitation of established priors. Central to the framework is our dual entropy theory, which we validate to show that pairing one high-entropy and one low-entropy LLM improves probabilistic prediction accuracy. We further employ information-theoretic metrics, including mutual information, cross-entropy, Wasserstein distance, and Jensen-Shannon divergence, to quantify communication opportunities, dialogue progress, and convergence. Together, these components yield an interpretable and productive multi-LLM dialogue that leads to more informed and reliable outcomes. We demonstrate EVINCE on healthcare, where it improves disease diagnosis, and discuss its broader implications for decision-making in other domains.
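To make the abstract's metric suite concrete, the sketch below computes the named quantities (entropy, cross-entropy, Jensen-Shannon divergence, Wasserstein distance) over two LLMs' predictive distributions for a shared label set. This is a minimal illustration of tracking dialogue convergence, not the authors' implementation; the function name `dialogue_metrics` and the `p_high`/`p_low` naming are assumptions.

```python
# Hedged sketch: per-round comparison of two debating LLMs' output
# distributions using the metrics named in the abstract. Not from the paper.
import numpy as np
from scipy.stats import entropy, wasserstein_distance
from scipy.spatial.distance import jensenshannon

def dialogue_metrics(p_high: np.ndarray, p_low: np.ndarray) -> dict:
    """Compare the predictive distributions of a high- and a low-entropy LLM."""
    p, q = p_high / p_high.sum(), p_low / p_low.sum()  # normalize to sum to 1
    eps = 1e-12                                        # guard against log(0)
    return {
        "entropy_high": entropy(p),                    # Shannon entropy (nats)
        "entropy_low": entropy(q),
        "cross_entropy": -np.sum(p * np.log(q + eps)), # H(p, q)
        # scipy's jensenshannon returns the JS *distance*; square for divergence
        "js_divergence": jensenshannon(p, q) ** 2,
        # 1-D Wasserstein distance over the label indices
        "wasserstein": wasserstein_distance(
            np.arange(len(p)), np.arange(len(q)), p, q
        ),
    }

# Example: a high-entropy (exploratory) model paired with a low-entropy
# (confident) model on a 4-way diagnosis; convergence across dialogue
# rounds shows up as a shrinking Jensen-Shannon divergence.
round1 = dialogue_metrics(np.array([0.4, 0.3, 0.2, 0.1]),
                          np.array([0.85, 0.05, 0.05, 0.05]))
round2 = dialogue_metrics(np.array([0.6, 0.2, 0.1, 0.1]),
                          np.array([0.70, 0.10, 0.10, 0.10]))
print(round1["js_divergence"], ">", round2["js_divergence"])
```

Under this reading, falling divergence between rounds would signal that the debate is converging, while the per-model entropies track whether the high/low-entropy pairing the dual entropy theory prescribes is preserved.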
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: Dual Entropy, Mutual Information, Conditional Statistics
Contribution Types: Model analysis & interpretability, Position papers, Theory
Languages Studied: English
Submission Number: 38