Are Transformers All That Karel Needs?

Published: 23 Oct 2021, Last Modified: 05 May 2023
Venue: AIPLANS
Keywords: program synthesis, transformer, karel, execution guidance
TL;DR: Transformer-based models achieve a significant improvement over LSTMs in program synthesis on the Karel dataset.
Abstract: Recent works have shown the promise of using neural networks for program synthesis from input-output examples. The Karel dataset has been a benchmark for evaluating program synthesis approaches, and several neural-guided synthesis techniques have been proposed with Karel as a baseline. Most of these techniques use an LSTM-based model for decoding and improve performance through complex algorithmic additions, such as using inferred execution traces, latent execution of partial programs, and debugging of generated programs. We observe that by changing the base architecture to a transformer-based one, specifically GPT-2, we can apply simple execution guidance on top to achieve a generalization accuracy of 89.64%, within 2.36 percentage points of the current state of the art on Karel, which relies on ensembling.
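
The abstract describes applying "simple execution guidance" on top of a GPT-2 decoder. As a rough illustration only, one common form of execution guidance filters decoded candidate programs by running them on the given input-output examples; a minimal sketch is below. The `execute_karel` interpreter, the grid representation, and the assumption that candidates arrive ordered by model log-probability are ours for illustration, not details taken from the paper.

```python
# Hypothetical sketch of execution-guided selection over decoded candidates.
# `execute_karel` stands in for a Karel interpreter (not shown here); the
# paper's actual guidance scheme may differ from this filtering approach.

from typing import Callable, List, Optional, Tuple

Grid = List[List[int]]       # simplified stand-in for a Karel world state
Example = Tuple[Grid, Grid]  # (input grid, expected output grid)


def execution_guided_pick(
    candidates: List[str],          # programs from beam search, best-first
    examples: List[Example],
    execute_karel: Callable[[str, Grid], Optional[Grid]],  # assumed interpreter
) -> Optional[str]:
    """Return the first candidate consistent with every I/O example.

    Candidates are assumed ordered by model log-probability, so the first
    survivor is the most likely program the examples do not refute.
    """
    for program in candidates:
        consistent = True
        for inp, expected in examples:
            out = execute_karel(program, inp)  # None on crash or timeout
            if out != expected:
                consistent = False
                break
        if consistent:
            return program
    return None  # no candidate satisfies all examples
```

Filtering complete candidates this way is only one flavor of execution guidance; finer-grained variants prune partial programs during beam search itself, which is closer in spirit to the execution-trace and latent-execution methods the abstract contrasts against.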