Abstract: State-space graphs and automata are fundamental tools for modeling and analyzing the behavior of computational systems. Recurrent neural networks (RNNs) and language models are deeply intertwined: RNNs provide the foundational architecture that enables language models to process sequential data, capture contextual dependencies, and improve performance on natural language processing tasks. Both RNNs and state-space graphs can be used to analyze discrete-time systems within this formal framework. However, the basic question of equivalence between the models governing sentence structure in natural language and the formal models of automata theory remains open. In this paper, we present ENGRU (Enhanced Gated Recurrent Units), a novel deep learning-based approach to formal verification. ENGRU integrates model-checking techniques, Colored Petri Nets (CPNs), and sequential learning to analyze systems at a high level of abstraction. CPN models undergo state-space enumeration via a model-checking tool, which generates a state-space graph and an automaton from the inherent state-transition patterns. These graphs are then transformed into sequential representations as sub-paths, enabling ENGRU to learn execution paths and predict system behaviors. Leveraging gated recurrent mechanisms, ENGRU effectively predicts goal states within discrete-time models, supporting early bug detection and predictive state-space exploration. Experimental results demonstrate that ENGRU achieves high accuracy and efficiency in goal-state prediction. The source code for ENGRU is available at https://github.com/kaopanboonyuen/ENGRU.
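To make the gated recurrent mechanism the abstract refers to concrete, the following is a minimal sketch of a single-unit GRU cell folded over one sub-path of a state-space graph. All names, weight values, and the per-state feature encoding here are hypothetical illustrations, not the actual ENGRU implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h, x, w):
    """One GRU update for a scalar hidden state (illustrative only)."""
    z = sigmoid(w["wz"] * x + w["uz"] * h)                # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)                # reset gate
    h_cand = math.tanh(w["wh"] * x + w["uh"] * (r * h))   # candidate state
    return (1.0 - z) * h + z * h_cand                     # gated blend

# Hypothetical trained weights and a toy sub-path, where each visited
# state of the state-space graph is encoded as a single scalar feature.
weights = {"wz": 0.5, "uz": 0.4, "wr": 0.3, "ur": 0.2, "wh": 0.8, "uh": 0.6}
sub_path = [0.1, 0.9, 0.4]

h = 0.0  # initial hidden state
for x in sub_path:
    h = gru_step(h, x, weights)

# The final hidden value would feed a classifier head that scores whether
# this execution path leads to a goal state.
print(round(h, 4))
```

In a full model each state would be an embedding vector and the weights would be matrices learned from many sub-paths, but the gating logic, where the update gate decides how much of the old hidden state survives each transition, is exactly what lets the network retain context across long execution paths.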