\textsc{ISagwR}: Iterative Self-augmented Generation with Reviewer

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission · Readers: Everyone
Abstract: Code generation plays a vital role in software development and has gained widespread attention. Some researchers employ Retrieval-augmented Generation (\textsc{Rag}) and have achieved impressive results. However, these methods ignore the real-world iterative code-refining process, as they solely reuse externally retrieved code. To tackle this limitation, we propose a self-augmented generation method, \textsc{Sag}, which iteratively constructs augmented datasets from the Generator's own output; the Generator then refines its code with the help of these datasets. Furthermore, inspired by the real-world role of code reviewers, we propose an iterative generator-reviewer architecture, \textsc{ISagwR}, built on the \textsc{Sag} datasets. At its core, a Reviewer module detects and handles errors, and its feedback is then fed back into the Generator to produce better code. We conduct extensive experiments on five benchmarks, and the results show that \textsc{ISagwR} significantly surpasses all baselines. The results also indicate that the \textsc{Sag} datasets and the Reviewer module respectively provide valuable insight into performing automatic data augmentation and integrating self-correction ability into a unified framework.
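The generator-reviewer loop described in the abstract can be sketched as follows. This is a minimal illustration only: the function names (`generate`, `review`, `isagwr`), the string-based stand-ins for model calls, and the stopping criterion are all hypothetical, since the abstract does not specify the actual interfaces of the paper's LLM-based modules.

```python
def generate(spec, feedback=None):
    # Hypothetical Generator: returns candidate code for a spec,
    # optionally conditioned on Reviewer feedback (stand-in for an LLM call).
    if feedback is None:
        return f"def solve(): pass  # draft for: {spec}"
    return f"def solve(): return 0  # revised for: {spec} ({feedback})"

def review(code):
    # Hypothetical Reviewer: detects errors and returns textual feedback,
    # or None when the code is accepted (stand-in for the Reviewer module).
    if "def solve(): pass" in code:
        return "function body is a stub"
    return None

def isagwr(spec, max_rounds=3):
    """Iterate generation and review until the Reviewer accepts.

    Also collects (spec, code, feedback) triples, mimicking how SAG
    builds augmented datasets from the Generator's own outputs.
    """
    augmented = []  # SAG-style augmented examples built from outputs
    code = generate(spec)
    for _ in range(max_rounds):
        feedback = review(code)
        if feedback is None:
            break
        augmented.append((spec, code, feedback))
        code = generate(spec, feedback)
    return code, augmented

final_code, dataset = isagwr("sum of two numbers")
```

In this sketch one review round suffices, so `dataset` contains a single augmented triple; in the paper's setting both modules would be models and the loop would run until the Reviewer finds no further errors or a round budget is exhausted.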
Paper Type: long
Research Area: Machine Learning for NLP
Contribution Types: NLP engineering experiment, Publicly available software and/or pre-trained models, Data resources
Languages Studied: English
