Mastering Symbolic Operations: Augmenting Language Models with Compiled Neural Networks

03 May 2023 (modified: 31 Jan 2024) · Submitted to NeurIPS 2023
Keywords: Language Models, Compiled Neural Networks, Neural Comprehension, Symbolic Operations, Length Generalization
TL;DR: We enable language models to acquire a more fundamental comprehension of rules, achieving perfect accuracy on symbolic operations without additional tools.
Abstract: Language models' (LMs') proficiency in handling deterministic symbolic reasoning and rule-based tasks remains limited because they rely on implicit learning from textual data. To endow LMs with genuine rule-comprehension ability, we explore how to incorporate compiled neural networks (CoNNs), whose weights are specially designed, into the architecture of LMs to achieve high accuracy and robust performance. CoNNs are transformer-based neural networks that execute rules through artificially generated attention weights. Our method, called "Neural Comprehension", incorporates CoNN modules into the LM so that the framework effectively tackles rule-intensive challenges. Our experiments on symbolic reasoning tasks and real-world arithmetic reasoning tasks demonstrate the superior performance of our method compared to existing techniques. Furthermore, our LM achieves flawless execution on symbolic operation tasks, highlighting the potential of our method to endow LMs with true symbolic comprehension capabilities.
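The abstract describes CoNNs only at a high level: transformer modules whose attention weights are constructed by hand, rather than learned, so that a rule is executed deterministically. As a minimal illustrative sketch (not the paper's implementation; the function name, the sharpness scale, and the "copy the previous token" rule are assumptions chosen for illustration), the following numpy snippet hand-designs a single attention head whose fixed weights implement that rule exactly:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def copy_previous_token(tokens, scale=100.0):
    """Toy CoNN-style head (hypothetical example): attention weights are
    hand-designed, not trained, to compute y[i] = x[(i-1) mod n]."""
    n = len(tokens)
    pos = np.eye(n)                              # one-hot position encodings
    # Hand-crafted query projection: row i is the one-hot vector for i-1,
    # so position i's query matches the key of position i-1.
    W_q = np.roll(np.eye(n), shift=1, axis=0)
    Q = pos @ W_q
    K = pos
    scores = scale * (Q @ K.T)                   # large scale -> near-hard attention
    A = softmax(scores, axis=-1)                 # ~permutation matrix
    V = np.asarray(tokens, dtype=float).reshape(n, 1)
    return (A @ V).round().astype(int).ravel()

print(copy_previous_token([3, 1, 4, 1, 5]))      # -> [5 3 1 4 1] (wraps at position 0)
```

Because the attention pattern is constructed analytically, the head executes its rule exactly for any input it is built for; this deterministic behaviour is the property the abstract claims Neural Comprehension contributes when CoNN modules are plugged into a pretrained LM.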
Supplementary Material: zip
Submission Number: 1514