TY - GEN
T1 - On the Impact of Tool Evolution and Case Study Size on SBSE Experiments
T2 - 15th International Symposium on Search-Based Software Engineering, SSBSE 2023
AU - Golmohammadi, Amid
AU - Zhang, Man
AU - Arcuri, Andrea
N1 - Publisher Copyright:
© 2024, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2024
Y1 - 2024
N2 - In the dynamic landscape of Search-Based Software Engineering (SBSE), tools and algorithms are continually improved, possibly making past experimental insights outdated. This could happen if a newly designed technique has side effects compared to the techniques and parameter settings studied in previous work. Re-tuning all possible parameters in an SBSE tool for each new scientific study would not be viable, as it would be too expensive and too time-consuming, considering there could be hundreds of them. In this paper, we carried out a series of experiments to study the impact that such re-tuning could have. For this study, we chose the SBSE tool EvoMaster, an open-source tool for automated test generation for REST APIs. It has been actively developed for over six years, since November 2016, making it an appropriate choice for this kind of study. In these experiments, we replicated four previous studies of EvoMaster with 15 REST APIs as case studies, using its latest version. Our findings reveal that updated parameter settings can offer improved performance, underscoring the possible benefits of re-tuning already existing parameters. Additionally, the inclusion of a broader range of case studies provides support for the replicated studies’ outcomes compared to the original studies, enhancing their external validity.
KW - Parameter Tuning
KW - RESTful APIs
KW - Replicating Studies
KW - SBST
KW - White-Box Test Generation
UR - https://www.scopus.com/pages/publications/85180537114
U2 - 10.1007/978-3-031-48796-5_8
DO - 10.1007/978-3-031-48796-5_8
M3 - Conference contribution
AN - SCOPUS:85180537114
SN - 9783031487958
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 108
EP - 122
BT - Search-Based Software Engineering - 15th International Symposium, SSBSE 2023, Proceedings
A2 - Arcaini, Paolo
A2 - Yue, Tao
A2 - Fredericks, Erik M.
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 8 December 2023 through 8 December 2023
ER -