
Contribution Details

Type Conference or Workshop Paper
Scope Discipline-based scholarship
Published in Proceedings Yes
Title EvaCRC: Evaluating Code Review Comments
Presentation Type paper
Item Subtype Original Work
Refereed No
Status Published in final form
ISBN 979-8-4007-0327-0
Page Range 275 - 287
Event Title ESEC/FSE '23: 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering
Event Type conference
Event Location San Francisco CA USA
Event Start Date December 3, 2023
Event End Date December 9, 2023
Publisher Association for Computing Machinery
Abstract Text In code reviews, developers examine code changes authored by peers and provide feedback through comments. Despite the importance of these comments, no accepted approach currently exists for assessing their quality. Therefore, this study has two main objectives: (1) to devise a conceptual model for an explainable evaluation of review comment quality, and (2) to develop models for the automated evaluation of comments according to the conceptual model. To do so, we conduct mixed-method studies and propose a new approach: EvaCRC (Evaluating Code Review Comments). To achieve the first goal, we collect and synthesize quality attributes of review comments by triangulating data from both authoritative documentation on code review standards and academic literature. We then validate these attributes using real-world instances. Finally, we establish mappings between quality attributes and grades by consulting domain experts, thus defining our final explainable conceptual model. To achieve the second goal, EvaCRC leverages multi-label learning. To evaluate and refine EvaCRC, we conduct an industrial case study with a global ICT enterprise. The results indicate that EvaCRC can effectively evaluate review comments while offering reasons for the grades.
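The abstract notes that EvaCRC leverages multi-label learning: a single review comment can carry several quality-attribute labels at once, so one independent binary decision is made per label (one-vs-rest). The sketch below illustrates only that general setup, not the authors' actual models; the label names, toy comments, and bag-of-words perceptron are all hypothetical stand-ins.

```python
# Illustrative one-vs-rest multi-label sketch (NOT the EvaCRC models).
# One independent binary perceptron per quality-attribute label,
# over naive bag-of-words features of a toy dataset.
from collections import defaultdict

LABELS = ["clarity", "relevance", "actionability"]  # hypothetical attributes

def features(comment):
    """Bag-of-words feature set for a review comment."""
    return set(comment.lower().split())

class PerLabelPerceptron:
    """Binary perceptron deciding whether one label applies."""
    def __init__(self):
        self.w = defaultdict(float)
        self.bias = 0.0

    def score(self, feats):
        return sum(self.w[f] for f in feats) + self.bias

    def update(self, feats, target):  # target in {0, 1}
        pred = 1 if self.score(feats) > 0 else 0
        if pred != target:
            delta = 1.0 if target == 1 else -1.0
            for f in feats:
                self.w[f] += delta
            self.bias += delta

def train(data, epochs=10):
    """Fit one perceptron per label (one-vs-rest)."""
    models = {lbl: PerLabelPerceptron() for lbl in LABELS}
    for _ in range(epochs):
        for comment, labels in data:
            feats = features(comment)
            for lbl in LABELS:
                models[lbl].update(feats, 1 if lbl in labels else 0)
    return models

def predict(models, comment):
    """Return every label whose classifier fires for this comment."""
    feats = features(comment)
    return sorted(lbl for lbl in LABELS if models[lbl].score(feats) > 0)

# Toy training data: (comment, set of applicable labels) -- hypothetical.
data = [
    ("please rename this variable for readability", {"clarity", "actionability"}),
    ("this change is unrelated to the bug fix", {"relevance"}),
    ("nice work", set()),
]
models = train(data)
print(predict(models, "please rename this variable for readability"))
# prints ['actionability', 'clarity']
```

The key point mirrored from the abstract is that the labels are not mutually exclusive: each classifier votes independently, so a comment can receive zero, one, or several quality-attribute labels.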
Digital Object Identifier 10.1145/3611643.3616245