Abstract: Due to the complexity of linear temporal logic (LTL) trace generation (PSPACE-complete), existing neural-network-based approaches fail as formula size increases. Recently, large language models (LLMs) have demonstrated remarkable reasoning capabilities, benefiting from efficient training on hyper-scale data. Inspired by this, we propose an iterative interaction framework that applies LLMs, exemplified by ChatGPT, to generate a trace satisfying a given LTL formula. The key insight is to transfer the powerful reasoning capabilities of LLMs to LTL trace generation via iterative interaction between LLM reasoning and logical reasoning. Preliminary results show that, compared with the state-of-the-art approach, accuracy improves by 9.7%-23.4% in relative terms. Moreover, we show that our framework is able to produce heuristics for new tasks, which provides a reference for other reasoning-heavy tasks requiring heuristics.
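To make the "iterative interaction between LLM reasoning and logical reasoning" concrete, the sketch below shows one plausible shape of such a loop: an LLM proposes a candidate trace, a symbolic checker verifies it against the formula, and a failure message is fed back for the next round. This is an illustrative sketch only, not the authors' implementation; the tuple-based formula encoding, the finite-trace semantics, and the stubbed `ask_llm_for_trace` function are assumptions introduced here.

```python
# Illustrative sketch of an LLM/checker interaction loop for LTL trace
# generation. The formula encoding, the finite-trace semantics, and the
# stubbed LLM call are assumptions, not the paper's actual framework.

from typing import List, Optional, Set, Tuple

Formula = Tuple            # e.g. ("U", ("ap", "a"), ("ap", "b")) means a U b
Trace = List[Set[str]]     # each position: set of atomic propositions that hold


def holds(f: Formula, trace: Trace, i: int = 0) -> bool:
    """Evaluate an LTL formula over a finite trace, starting at position i."""
    op = f[0]
    if op == "ap":
        return i < len(trace) and f[1] in trace[i]
    if op == "not":
        return not holds(f[1], trace, i)
    if op == "and":
        return holds(f[1], trace, i) and holds(f[2], trace, i)
    if op == "or":
        return holds(f[1], trace, i) or holds(f[2], trace, i)
    if op == "X":   # next
        return i + 1 < len(trace) and holds(f[1], trace, i + 1)
    if op == "F":   # eventually
        return any(holds(f[1], trace, j) for j in range(i, len(trace)))
    if op == "G":   # always
        return all(holds(f[1], trace, j) for j in range(i, len(trace)))
    if op == "U":   # until
        return any(
            holds(f[2], trace, j) and all(holds(f[1], trace, k) for k in range(i, j))
            for j in range(i, len(trace))
        )
    raise ValueError(f"unknown operator {op!r}")


def ask_llm_for_trace(formula: Formula, feedback: str) -> Trace:
    """Placeholder for the LLM call (e.g. ChatGPT); stubbed with a fixed
    guess so the sketch runs offline."""
    return [{"a"}, {"a"}, {"b"}]


def generate_trace(formula: Formula, max_rounds: int = 5) -> Optional[Trace]:
    """Iterate: LLM proposes a trace, the checker validates it, feedback loops back."""
    feedback = ""
    for _ in range(max_rounds):
        candidate = ask_llm_for_trace(formula, feedback)
        if holds(formula, candidate):
            return candidate  # accepted by the logical-reasoning side
        feedback = f"trace {candidate} violates the formula; please revise"
    return None


if __name__ == "__main__":
    phi = ("U", ("ap", "a"), ("ap", "b"))  # a U b
    print(generate_trace(phi))             # e.g. [{'a'}, {'a'}, {'b'}]
```

In a real instantiation, `ask_llm_for_trace` would call the LLM with the formula and the accumulated feedback, and the checker side could return a more informative counterexample (e.g. the first violating position) rather than a generic message.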