Do Large Language Models have Lateral Thinking in Puzzle-Solving Games?

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Large Language Models, Lateral Thinking, Puzzle-Solving Games
TL;DR: Evaluation and Enhancement of Lateral Thinking in Puzzle-Solving Games of Large Language Models.
Abstract: Large Language Models (LLMs) show exceptional skills across a wide range of tasks, and their capacity for lateral thinking stands out as a particularly intriguing area. Lateral thinking allows LLMs to grasp deeper or implied meanings from context, which is essential for making sense of complex scenarios, especially in puzzle-solving games. To investigate and improve the lateral thinking capabilities of LLMs in puzzle-solving, we introduce the ``Lateral Thinking Puzzles'' and construct the accompanying dataset. Our novel $\mathcal{P}$uzzle$\mathcal{V}$erse framework aims to enhance LLMs' lateral thinking in puzzle-solving games. Complementing this, we propose a creativity metric to ensure comprehensive evaluation. Experiments show that the selected LLMs, after being trained with $\mathcal{P}$uzzle$\mathcal{V}$erse, achieve an average improvement of 101.9\% across all metrics compared to their performance before $\mathcal{P}$uzzle$\mathcal{V}$erse training. We also validate the robustness of $\mathcal{P}$uzzle$\mathcal{V}$erse by showing that trained LLMs perform better on other reasoning tasks.
Primary Area: datasets and benchmarks
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9702