Natural Language to Code Using Transformers

Published: 01 Jan 2022, Last Modified: 15 Feb 2024 · CoRR 2022
Abstract: We tackle the problem of generating code snippets from natural language descriptions using the CoNaLa dataset. We use the self-attention-based transformer architecture and show that it outperforms a recurrent attention-based encoder-decoder model. Furthermore, we develop a modified form of back translation and use cycle-consistent losses to train the model end to end. We achieve a BLEU score of 16.99, beating the previously reported baseline of the CoNaLa challenge.
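The abstract gives no implementation details, but the back-translation / cycle-consistency idea it describes can be illustrated with a minimal PyTorch sketch: an NL-to-code model synthesizes pseudo-code from unpaired natural language, and a code-to-NL model is trained to reconstruct the original description. Everything here is an illustrative assumption, not the authors' implementation: the model sizes, the greedy decoder, and the single-direction cycle loss are all simplifications.

```python
# Minimal sketch of back translation with a cycle-consistency loss,
# assuming two small seq2seq transformers over a shared toy vocabulary.
# All hyperparameters and names are illustrative, not from the paper.
import torch
import torch.nn as nn

class Seq2SeqTransformer(nn.Module):
    """Tiny encoder-decoder transformer."""
    def __init__(self, vocab_size=1000, d_model=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Causal mask so the decoder cannot attend to future target tokens.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        h = self.transformer(self.embed(src_ids), self.embed(tgt_ids),
                             tgt_mask=tgt_mask)
        return self.out(h)  # (batch, tgt_len, vocab) logits

def greedy_decode(model, src, max_len=16, bos_id=1):
    # Autoregressive greedy decoding from a BOS token (stand-in for beam search).
    tgt = torch.full((src.size(0), 1), bos_id, dtype=torch.long)
    for _ in range(max_len - 1):
        next_tok = model(src, tgt)[:, -1].argmax(-1, keepdim=True)
        tgt = torch.cat([tgt, next_tok], dim=1)
    return tgt

nl2code = Seq2SeqTransformer()   # NL  -> code
code2nl = Seq2SeqTransformer()   # code -> NL
ce = nn.CrossEntropyLoss()
opt = torch.optim.Adam(
    list(nl2code.parameters()) + list(code2nl.parameters()), lr=1e-4)

# Toy batch of NL token ids; in practice these come from unpaired NL data.
nl = torch.randint(2, 1000, (8, 16))

# Back translation: synthesize pseudo-code with the current NL->code model.
# As is standard, no gradient flows through the generation step.
with torch.no_grad():
    pseudo_code = greedy_decode(nl2code, nl)

# Cycle consistency: translating the pseudo-code back should recover the NL,
# so code2nl is trained (teacher-forced) against the original NL tokens.
logits = code2nl(pseudo_code, nl[:, :-1])
cycle_loss = ce(logits.reshape(-1, logits.size(-1)), nl[:, 1:].reshape(-1))

opt.zero_grad()
cycle_loss.backward()
opt.step()
print(f"cycle-consistency loss: {cycle_loss.item():.3f}")
```

In a full end-to-end setup the symmetric direction (code → pseudo-NL → code) would be trained the same way, and both cycle losses would be combined with the supervised loss on the paired CoNaLa examples.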