Graph Recurrent Attention Networks for Solving Satisfiability Problems

20 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Boolean Satisfiability, Graph Neural Networks, Graph Attention, Recurrent Neural Networks, T-conorm
Abstract: In recent years, the use of deep learning to solve Boolean Satisfiability (SAT) problems has attracted significant interest. This paper advances such neural methods by introducing a **G**raph **r**ecurrent **a**ttention **n**etwork for **SAT** (GranSAT). GranSAT employs two novel steps to guide the network's search toward clause satisfaction: (1) evaluating the truth degree of each clause with t-conorm fuzzy-logic operators, and (2) updating assignments with attention mechanisms, in close analogy to distributed local search methods. Logical states are coupled with recurrently updated hidden states from which attention values are computed, allowing the model to refine fuzzy assignments while retaining information from previous updates. Experimental results on crafted and random SAT benchmarks demonstrate that GranSAT outperforms existing neural SAT solvers in both performance and generalization. Furthermore, when combined with local-search post-processors, GranSAT achieves state-of-the-art performance on random instances, showcasing its effectiveness in solving SAT problems.
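The abstract's first step, scoring each clause's truth degree with a t-conorm, can be illustrated with a minimal sketch. The abstract does not state which t-conorm GranSAT uses, so this example assumes the probabilistic sum S(a, b) = a + b − ab over a clause's literal values; the function name, DIMACS-style clause encoding, and soft-assignment representation are illustrative choices, not the paper's implementation.

```python
import numpy as np

def clause_truth_degree(assignment, clause):
    """Fuzzy truth degree of a disjunctive clause under a soft assignment.

    assignment: array of values in [0, 1], one per variable (degree of "True").
    clause: DIMACS-style list of signed integers (positive = variable,
            negative = negated variable; variables are 1-indexed).
    Applies the probabilistic-sum t-conorm across all literals:
        S(x_1, ..., x_k) = 1 - prod_i (1 - x_i),
    which is 1 iff some literal is fully true, and increases monotonically
    as any literal's truth value increases.
    """
    lits = np.array([assignment[abs(l) - 1] if l > 0
                     else 1.0 - assignment[abs(l) - 1]
                     for l in clause])
    return 1.0 - np.prod(1.0 - lits)

# Example: clause (x1 OR NOT x2) under soft assignment x1 = 0.2, x2 = 0.9
print(clause_truth_degree(np.array([0.2, 0.9]), [1, -2]))  # 1 - 0.8*0.9 = 0.28
```

Such per-clause degrees give a differentiable signal of how close each clause is to satisfaction, which a network can then use to decide which assignments to update.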
Supplementary Material: pdf
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Submission Number: 23787