AutoTransform: Automated Code Transformation to Support Modern Code Review Process
Published May 1, 2022 · Patanamon Thongtanunam, Chanathip Pornprasit, C. Tantithamthavorn
2022 IEEE/ACM 44th International Conference on Software Engineering (ICSE)
60 Citations · 1 Influential Citations
Abstract
Code review is effective, but human-intensive (e.g., developers need to manually modify source code until it is approved). Recently, prior work proposed a Neural Machine Translation (NMT) approach to automatically transform source code to the version that is reviewed and approved (i.e., the after version). Yet, its performance is still suboptimal when the after version has new identifiers or literals (e.g., renamed variables) or has many code tokens. To address these limitations, we propose AutoTransform which leverages a Byte-Pair Encoding (BPE) approach to handle new tokens and a Transformer-based NMT architecture to handle long sequences. We evaluate our approach based on 14,750 changed methods with and without new tokens for both small and medium sizes. The results show that when generating one candidate for the after version (i.e., beam width = 1), our AutoTransform can correctly transform 1,413 changed methods, which is 567% higher than the prior work, highlighting the substantial improvement of our approach for code transformation in the context of code review. This work contributes towards automated code transformation for code reviews, which could help developers reduce their effort in modifying source code during the code review process.
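To illustrate the intuition behind the BPE component described in the abstract (this is not the authors' implementation), the minimal Python sketch below learns BPE merges from a toy corpus of identifiers taken from "before" versions of methods, then segments a renamed identifier that never appeared during training. The corpus, the merge count, and the identifier setCountName are illustrative assumptions; the point is that an unseen token can still be represented as a sequence of learned subwords and characters rather than a single out-of-vocabulary token.

```python
from collections import Counter

def get_pairs(symbols):
    # All adjacent symbol pairs in a token's current segmentation.
    return [(symbols[i], symbols[i + 1]) for i in range(len(symbols) - 1)]

def learn_bpe(tokens, num_merges):
    # Start from character-level segmentations with an end-of-word marker.
    vocab = Counter(tuple(t) + ("</w>",) for t in tokens)
    merges = []
    for _ in range(num_merges):
        pair_counts = Counter()
        for symbols, freq in vocab.items():
            for pair in get_pairs(list(symbols)):
                pair_counts[pair] += freq
        if not pair_counts:
            break
        best = pair_counts.most_common(1)[0][0]
        merges.append(best)
        # Re-segment every token with the newly merged pair.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            symbols, i, merged = list(symbols), 0, []
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges

def apply_bpe(token, merges):
    # Simplified application: replay learned merges in order on a new token.
    symbols = list(token) + ["</w>"]
    for a, b in merges:
        i, out = 0, []
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

# Toy training corpus of identifiers from the "before" versions of methods.
corpus = ["getCount", "getName", "setCount", "setName", "count", "name"] * 10
merges = learn_bpe(corpus, num_merges=30)

# "setCountName" never appears in training, yet it is segmented into
# learned subword units (e.g. set / Count / Name</w>) instead of one unknown token.
print(apply_bpe("setCountName", merges))
```

In a sequence-to-sequence setting, such subword sequences are what the encoder and decoder operate on, which is how a BPE vocabulary lets the model emit identifiers or literals it never saw verbatim during training.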
AutoTransform substantially improves automated code transformation for code review, correctly transforming 1,413 changed methods, 567% more than prior work.