Keywords: transformer, compositionality, systematic generalization, algorithmic reasoning, arithmetic
TL;DR: We improve the systematic generalization of Transformers on algorithmic tasks by introducing two architectural modifications: geometric attention and a copy gate.
Abstract: Transformers have had limited success in systematic generalization. The situation is especially frustrating in the case of algorithmic tasks, where they often fail to find intuitive solutions that route relevant information to the right node/operation at the right time in the grid represented by Transformer columns. To facilitate the learning of useful control flow, we propose two modifications to the Transformer architecture: copy gate and geometric attention. Our novel Neural Data Router (NDR) achieves 100% length generalization accuracy on the compositional table lookup task. NDR's attention and gating patterns tend to be interpretable as an intuitive form of neural routing.