Keywords: Developmental Interpretability, Copy Suppression, Head Specialization
TL;DR: An attention-only transformer trained on list sorting develops head specialization and copy suppression.
Abstract: We present an analysis of the evolution of the QK and OV circuits of an attention-only transformer trained on list sorting. Using various measures, we identify the developmental stages of the training process. In particular, we find two forms of head specialization emerging later in training: vocabulary-splitting and copy-suppression. We study their robustness by varying the training hyperparameters and the model architecture.
Submission Number: 26