BUG FIX GENERATION USING GRAPHTRANS

Anonymous

04 Mar 2022 (modified: 05 May 2023) · ICLR 2022 Workshop DL4C Blind Submission
Keywords: Program Repair, Graph Neural Networks, Transformer
TL;DR: We introduce FIXUR, based on the GraphTrans architecture, and achieve near state-of-the-art results for code repair.
Abstract: Code repair, the task of learning to detect and fix bugs, is an important application of deep learning to source code. Previous work generates code changes using Transformer models that represent code as sequences. However, code is naturally represented as a graph that encapsulates rich syntactic and semantic dependencies, so generating bug-fixing edits requires both local structural information and global context. Inspired by GraphTrans (Wu et al., 2021), we propose FIXUR, a new architecture for generating bug-fixing edits that complements graph neural networks with a Transformer to encode the code graph. Our experiments show that FIXUR obtains near state-of-the-art results on the code refinement benchmark without relying on any large-scale pre-training. FIXUR achieves 20.50% and 11.01% top-1 accuracy on the small and medium datasets, respectively, compared to 19.06% and 10.92% for CodeT5-small, which has a similar size.
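The abstract describes a GraphTrans-style encoder: a GNN first captures local structure in the code graph, and a Transformer-style self-attention layer then mixes information globally across all nodes. The sketch below illustrates that two-stage idea only conceptually; it is not the authors' FIXUR implementation, and all function names, dimensions, and the single-layer setup are illustrative assumptions.

```python
import numpy as np

def gnn_layer(H, A, W):
    # One message-passing step: mean-aggregate neighbor features, project, squash.
    deg = A.sum(axis=1, keepdims=True) + 1e-9  # avoid division by zero
    return np.tanh(((A @ H) / deg) @ W)

def self_attention(H):
    # Single-head scaled dot-product attention over all nodes (global context).
    d = H.shape[1]
    scores = (H @ H.T) / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ H

def graphtrans_encode(X, A, W):
    # GraphTrans-style encoding: local GNN pass, then Transformer-style
    # attention over the resulting node embeddings.
    H = gnn_layer(X, A, W)
    return self_attention(H)

rng = np.random.default_rng(0)
n, d = 5, 8                                    # e.g. 5 code-graph nodes, 8-dim features
X = rng.standard_normal((n, d))                # toy node features
A = (rng.random((n, n)) < 0.4).astype(float)   # toy adjacency matrix
W = rng.standard_normal((d, d))
Z = graphtrans_encode(X, A, W)
print(Z.shape)  # one embedding per node: (5, 8)
```

In the real architecture, the GNN would run over an actual program graph (e.g. an AST with added semantic edges) and the Transformer would be a full multi-layer, multi-head encoder; this sketch keeps only the local-then-global structure.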