Renamer: A Transformer Architecture Invariant to Variable Renaming

NeurIPS 2023 Workshop MLSys Submission 36 Authors

Published: 28 Oct 2023, Last Modified: 12 Dec 2023
MLSys Workshop NeurIPS 2023 Poster
Keywords: Transformers, Variable Renaming, Neural Surrogate, Invariant, Code Models
Abstract: Many modeling tasks involve learning functions which are invariant to certain types of input transformations. We study a specific class of invariance: semantics-preserving variable renaming for models of code. We show that vanilla Transformers trained on renaming-invariant tasks do not exhibit renaming invariance. We propose Renamer, a Transformer architecture which is itself invariant to semantics-preserving variable renaming. On a CPU simulation task, Renamer reduces error by 24.79% to 52.8% compared to a vanilla Transformer.
Submission Number: 36