Keywords: compilers, compilation, machine translation, transformer, programming languages, code translation
TL;DR: We train and evaluate a sequence-to-sequence Transformer to output x86 assembly from C code
Abstract: Deep learning has had a significant impact on many fields. Recently, code-to-code neural models have been used in code translation, code refinement and decompilation. However, the question of whether these models can automate compilation has yet to be investigated. In this work, we explore neural compilation, building and evaluating Transformer models that learn to produce x86 assembly from C code.
Although preliminary results are relatively weak, we make our data, models and code publicly available to encourage further research in this area.
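To make the setup concrete, the sketch below shows one way C-to-x86 translation can be framed as sequence-to-sequence learning with a Transformer. This is not the authors' implementation: the vocabulary sizes, model dimensions, and random token IDs are illustrative placeholders, and positional encodings are omitted for brevity.

```python
# Minimal sketch: treat tokenized C source and x86 assembly as token sequences
# and train an encoder-decoder Transformer to map one to the other.
# All sizes below are hypothetical, not the paper's actual configuration.
import torch
import torch.nn as nn

PAD, BOS, EOS = 0, 1, 2
SRC_VOCAB = 1000   # hypothetical size of the C-token vocabulary
TGT_VOCAB = 1000   # hypothetical size of the x86-token vocabulary
D_MODEL = 256

class NeuralCompiler(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, D_MODEL, padding_idx=PAD)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, D_MODEL, padding_idx=PAD)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=8,
            num_encoder_layers=4, num_decoder_layers=4,
            batch_first=True)
        self.out = nn.Linear(D_MODEL, TGT_VOCAB)

    def forward(self, src_ids, tgt_ids):
        # Causal mask so the decoder only attends to earlier assembly tokens.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        h = self.transformer(self.src_emb(src_ids), self.tgt_emb(tgt_ids),
                             tgt_mask=tgt_mask)
        return self.out(h)

# Toy usage: random token IDs stand in for a tokenized C function (source)
# and its x86 assembly (target); a real setup trains on aligned pairs.
model = NeuralCompiler()
src = torch.randint(3, SRC_VOCAB, (2, 32))   # batch of 2 "C" sequences
tgt = torch.randint(3, TGT_VOCAB, (2, 48))   # batch of 2 "assembly" sequences
logits = model(src, tgt[:, :-1])             # teacher forcing: shift target right
loss = nn.functional.cross_entropy(
    logits.reshape(-1, TGT_VOCAB), tgt[:, 1:].reshape(-1), ignore_index=PAD)
loss.backward()
```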