The Transformer Cookbook

TMLR Paper6052 Authors

30 Sept 2025 (modified: 21 Nov 2025) · Under review for TMLR · CC BY 4.0
Abstract: We present the transformer cookbook: a collection of techniques for directly encoding algorithms into a transformer's parameters. This work addresses the steep learning curve of such endeavors, a problem exacerbated by a fragmented literature in which key results are scattered across numerous papers. In particular, we synthesize this disparate body of findings into a curated set of recipes that demonstrate how to implement everything from basic arithmetic in feed-forward layers to complex data routing via self-attention. Our mise en place of formulations serves both newcomers seeking an accessible entry point and experts in need of a systematic reference. This unified presentation of transformer constructions provides a foundation for future work, ranging from theoretical research in computational complexity to empirical investigations in architecture design and interpretability.
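To make the abstract's notion of hand-coding arithmetic into feed-forward layers concrete, here is a minimal, hypothetical NumPy sketch (not taken from the paper's recipes): a two-layer ReLU feed-forward block whose weights are set by hand so that it computes max(a, b) via the identity max(a, b) = b + ReLU(a - b). The weight matrices and the `ffn` helper are illustrative assumptions, not the authors' constructions.

```python
# Illustrative sketch only: hand-set weights for a two-layer ReLU
# feed-forward block computing max(a, b) = b + ReLU(a - b).
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# First linear layer: map (a, b) to the pre-activations we need.
W1 = np.array([[ 1.0, -1.0],   # a - b
               [ 0.0,  1.0],   # b
               [ 0.0, -1.0]])  # -b
b1 = np.zeros(3)

# Second linear layer: ReLU(b) - ReLU(-b) = b, so the output is
# b + ReLU(a - b) = max(a, b).
W2 = np.array([[1.0, 1.0, -1.0]])
b2 = np.zeros(1)

def ffn(x):
    return W2 @ relu(W1 @ x + b1) + b2

print(ffn(np.array([2.0, 5.0])))   # [5.]
print(ffn(np.array([3.0, -4.0])))  # [3.]
```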
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission:
# List of Changes
- Expanded the exposition of the transformer architecture in Section 2.2
- Added the ReLU/GELU approximation behavior in Section 4.6
- Fixed typos
- Added a citation to RASP/Tracr in the introduction
- Added a remark about architectural assumptions to the introduction
- Fixed an error in Section 5.7
- Added a comment about temperature scaling to Section 9.1.1
- Updated Table 2 and created Table 3
Assigned Action Editor: ~Tim_Genewein1
Submission Number: 6052