Title: Clone detection in test code: an empirical evaluation
Author:
Abstract: Duplicated test code (a.k.a. test code clones) has a negative impact on test comprehension and maintenance. Moreover, the typical structure of unit test code induces structural similarity, increasing the amount of duplication. Yet, most research on software clones and clone detection tools is focused on production code, often ignoring test code. In this paper we fill this gap by comparing four different clone detection tools (NiCad, CPD, iClones, TCORE) against the test code of three open-source projects. Our analysis confirms the prevalence of test code clones, as we observed between 23% and 29% test code duplication. We also show that most of the tools suffer from false negatives (NiCad = 83%, CPD = 84%, iClones = 21%, TCORE = 65%), which leaves ample room for improvement. These results indicate that further research on test clone detection is warranted.
Language: English
Source (journal): Proceedings of the 2020 IEEE 27th International Conference on Software Analysis, Evolution, and Reengineering (SANER '20)
Source (book): 27th IEEE International Conference on Software Analysis, Evolution, and Reengineering (SANER), February 18-21, 2020, London, Canada
Publication: New York: IEEE, 2020
ISBN: 978-1-72815-143-4
DOI: 10.1109/SANER48275.2020.9054798
Volume/pages: (2020), p. 492-500
ISI: 000568240800044