A Failed attempt at creating Guidelines for Visual GUI Testing: An industrial case study
2021 (English). In: Proceedings - 2021 IEEE 14th International Conference on Software Testing, Verification and Validation, ICST 2021, Institute of Electrical and Electronics Engineers Inc., 2021, p. 340-350, article id 9438551. Conference paper, Published paper (Refereed)
Abstract [en]
Software development is governed by guidelines that aim to improve the code's qualities, such as maintainability. However, whilst coding guidelines are commonplace for software, guidelines for testware are much less common. In particular, for GUI-based tests driven with image recognition, also referred to as Visual GUI Testing (VGT), explicit coding guidelines are missing. In this industrial case study, performed at the Swedish defence contractor Saab AB, we propose a set of coding guidelines for VGT and evaluate their impact on test scripts for an industrial, safety-critical system. To study the guidelines' effect on maintenance costs, five representative manual test cases are each translated with and without the proposed guidelines in the two VGT tools SikuliX and EyeAutomate. As such, 20 test scripts were developed, with a combined development cost of more than 100 man-hours. Three of the tests are then maintained by one researcher and two practitioners for another version of the system, and costs are measured to evaluate return on investment. This analysis is complemented with observations and interviews to elicit practitioners' perceptions and experiences with VGT. Results show that scripts developed with the guidelines had higher maintenance costs than scripts developed without guidelines. This is supported by qualitative results indicating that many of the guidelines are considered inappropriate, superfluous or unnecessary due to the inherent properties of the scripts, e.g. their naturally small size, linear flows, natural separation of concerns, and more. We conclude that there are differences between VGT scripts and software that prohibit direct translation of guidelines between the two. As such, we consider our study a failure but argue that several lessons can be drawn from our results to guide future research into guidelines for VGT and GUI-based test automation. © 2021 IEEE.
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2021. p. 340-350, article id 9438551
Keywords [en]
Automated Testing, Case study, Guidelines for testing, Industrial study, Visual GUI testing, Accident prevention, Cost benefit analysis, Costs, Graphical user interfaces, Image coding, Image recognition, Maintenance, Safety testing, Software design, Verification, Defence contractors, Development costs, Industrial case study, Maintenance cost, Safety critical systems, Separation of concerns, Test Automation, Software testing
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-21820
DOI: 10.1109/ICST49551.2021.00046
ISI: 000680831800034
Scopus ID: 2-s2.0-85107963867
ISBN: 9781728168364 (print)
OAI: oai:DiVA.org:bth-21820
DiVA, id: diva2:1573451
Conference
14th IEEE International Conference on Software Testing, Verification and Validation, ICST 2021, 12 April 2021 through 16 April 2021
Part of project
SERT - Software Engineering ReThought, Knowledge Foundation
M.E.T.A. - Modelling Efficient Test Architectures, Knowledge Foundation
Funder
Knowledge Foundation, 20180010, 20180102
Note
open access
Available from: 2021-06-24 Created: 2021-06-24 Last updated: 2021-11-26 Bibliographically approved