Thinking Like Transformers

Published: 02 May 2023, Last Modified: 12 Mar 2024
Blogposts @ ICLR 2023
Readers: Everyone
Keywords: Transformers
Abstract: Thinking Like Transformers proposes a computational framework for Transformer-like calculations: it uses discrete, sequence-level operations to model what a Transformer computes. The resulting language, RASP, is a programming language in which every program compiles down to a specific Transformer. In this blog post, we reimplement a variant of RASP in Python (RASPy). The language is roughly compatible with the original version, with some syntactic changes for simplicity. Using this language, we work through a challenging set of puzzles to build intuition for how Transformer computation works (a minimal illustrative sketch of the underlying select-and-aggregate style follows the metadata below).
Blogpost Url: https://iclr-blogposts.github.io/2023/blog/2023/raspy/
ICLR Papers: https://arxiv.org/abs/2106.06981
ID Of The Authors Of The ICLR Paper: ~Gail_Weiss1
Conflict Of Interest: No
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:2106.06981/code)
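The core idea the abstract alludes to, and which the blog post develops in RASPy, is that Transformer attention can be modeled as a discrete select-and-aggregate operation over sequences. The sketch below is a minimal, from-scratch Python illustration of that idea; the helper names `select` and `aggregate` mirror RASP terminology, but these implementations are written here for illustration and are not the RASPy library's actual API.

```python
# A minimal, from-scratch sketch of RASP-style select / aggregate primitives.
# Illustrative only -- not the RASPy library's API.

from typing import Callable, Sequence


def select(keys: Sequence, queries: Sequence,
           predicate: Callable[[object, object], bool]) -> list[list[bool]]:
    """Build an attention-like boolean matrix: entry [q][k] is True when
    the predicate holds between query position q and key position k."""
    return [[predicate(k, q) for k in keys] for q in queries]


def aggregate(selector: list[list[bool]], values: Sequence) -> list:
    """Average the selected values for each query position, mimicking
    uniform attention over the selected keys (numeric values assumed)."""
    out = []
    for row in selector:
        picked = [v for v, keep in zip(values, row) if keep]
        out.append(sum(picked) / len(picked) if picked else 0)
    return out


# Example: at each position, average all earlier-or-equal positions' values,
# i.e. a causal running mean expressed as select + aggregate.
tokens = [3, 1, 4, 1, 5]
indices = list(range(len(tokens)))
causal = select(indices, indices, lambda k, q: k <= q)
print(aggregate(causal, tokens))  # [3.0, 2.0, 2.666..., 2.25, 2.8]
```

In RASP terms, `select` plays the role of an attention pattern and `aggregate` plays the role of the value averaging that attention performs; the blog post builds its puzzle solutions out of richer sequence operations layered on primitives like these.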