Minimax Tree of Thoughts: Playing Two-Player Zero-Sum Sequential Games with Large Language Models

Published: 18 Jun 2024, Last Modified: 26 Jul 2024
Venue: ICML 2024 Workshop on LLMs and Cognition (Poster)
License: CC BY 4.0
Keywords: Machine Learning, Two-Player Zero-Sum Games, Sequential Games, Planning, Large Language Models
Abstract: Large language models are being used to solve an increasing number of tasks, yet existing LLM-based methods still perform poorly at two-player zero-sum sequential games. To address the challenges of playing such games with large language models, we propose Minimax Tree of Thoughts, which combines the idea of Tree of Thoughts with minimax search. Experimental results show that our Minimax Tree of Thoughts method significantly outperforms the original Tree of Thoughts method on two-player zero-sum sequential game tasks such as word chain and the game of Ghost.
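To make the combination described in the abstract concrete, below is a minimal sketch of the general idea of running minimax search over LLM-proposed candidate moves and LLM-scored states. This is not the authors' implementation; the `propose_moves`, `apply_move`, `evaluate_state`, and `is_terminal` callables are hypothetical placeholders that a caller would back with game rules and LLM prompts.

```python
# Illustrative sketch only (assumed interface, not the paper's code):
# minimax search where candidate moves ("thoughts") come from an LLM proposer
# and leaf states are scored by an LLM evaluator for the maximizing player.
from typing import Callable, List, Optional, Tuple

State = str  # e.g. the current word-chain sequence or Ghost prefix
Move = str

def minimax_tot(
    state: State,
    depth: int,
    maximizing: bool,
    propose_moves: Callable[[State], List[Move]],  # LLM proposes candidate moves
    apply_move: Callable[[State, Move], State],    # game rule: resulting state
    evaluate_state: Callable[[State], float],      # LLM scores state for max player
    is_terminal: Callable[[State], bool],
) -> Tuple[float, Optional[Move]]:
    """Return (value, best_move) from the maximizing player's perspective."""
    if depth == 0 or is_terminal(state):
        return evaluate_state(state), None

    best_move: Optional[Move] = None
    best_value = float("-inf") if maximizing else float("inf")
    for move in propose_moves(state):
        child = apply_move(state, move)
        value, _ = minimax_tot(
            child, depth - 1, not maximizing,
            propose_moves, apply_move, evaluate_state, is_terminal,
        )
        # Max player keeps the highest value, min player the lowest.
        if (maximizing and value > best_value) or (not maximizing and value < best_value):
            best_value, best_move = value, move
    return best_value, best_move
```

In this reading, the Tree of Thoughts component supplies the branching (move proposals) and the state evaluations, while minimax propagates those evaluations up the game tree under the zero-sum assumption that the opponent plays to minimize the same score.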
Submission Number: 1