Counting Like Transformers: Compiling Temporal Counting Logic Into Softmax Transformers

Published: 10 Jul 2024, Last Modified: 26 Aug 2024
Venue: COLM
License: CC BY 4.0
Research Area: Science of LMs
Keywords: Transformer Encoders, Transformer Decoders, Formal Expressivity, Formal Language Theory, RASP, Temporal Logic, Counting Logic
TL;DR: We identify the strongest known logic that the standard softmax transformer can simulate, and explain how to use it to hand-code algorithms into transformers with unbounded input length.
Abstract: Deriving formal bounds on the expressivity of transformers, and studying transformers constructed to implement known algorithms, are both effective methods for better understanding their computational power. Towards both ends, we introduce the temporal counting logic $\textbf{K}_t[\#]$ alongside the RASP variant $\textbf{C-RASP}$. We show that they are equivalent to each other, and that together they yield the best-known lower bound on the formal expressivity of future-masked soft attention transformers with unbounded input size. We prove this by showing that every $\textbf{K}_t[\#]$ formula can be compiled into such a transformer without any additional positional embeddings.
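To make the counting primitive concrete, here is a minimal NumPy sketch (our illustration under simplifying assumptions, not the paper's construction): with all-zero attention scores and a future (causal) mask, softmax assigns weight $1/(i+1)$ to every position in the prefix, so attending to a symbol's indicator vector yields the prefix fraction $\#\sigma/(i+1)$ at each position, with no positional embeddings involved.

```python
import numpy as np

def causal_uniform_attention(values: np.ndarray) -> np.ndarray:
    """Softmax attention with all-zero scores and a future (causal) mask.

    With uniform scores, softmax over the unmasked prefix assigns weight
    1/(i+1) to each position j <= i, so the output at position i is the
    running mean of values[: i + 1].
    """
    n = len(values)
    out = np.empty(n)
    for i in range(n):
        scores = np.zeros(i + 1)                          # uniform scores over the prefix
        weights = np.exp(scores) / np.exp(scores).sum()   # softmax -> 1/(i+1) each
        out[i] = weights @ values[: i + 1]                # running mean of the values
    return out

# Toy input over the alphabet {a, b}: compare prefix counts #a vs. #b.
tokens = list("abaabba")
is_a = np.array([t == "a" for t in tokens], dtype=float)
is_b = np.array([t == "b" for t in tokens], dtype=float)

frac_a = causal_uniform_attention(is_a)  # (#a up to i) / (i+1)
frac_b = causal_uniform_attention(is_b)  # (#b up to i) / (i+1)

# Both fractions share the denominator i+1, so comparing them decides
# the counting condition "#a >= #b" at every prefix position.
print((frac_a >= frac_b).astype(int))
```

Because the two fractions share the denominator $i+1$, comparing them is exactly the integer comparison of prefix counts, which is the kind of counting condition such a logic can express.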
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the COLM Code of Ethics on https://colmweb.org/CoE.html
Author Guide: I certify that this submission complies with the submission instructions as described on https://colmweb.org/AuthorGuide.html
Submission Number: 1268