Evidence and perceptions on GUI test automation: An explorative Multi-Case study
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
2017 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Context. GUI-based test automation is a costly and tedious activity in practice. Because GUIs are routinely modified and redesigned throughout the development process, the corresponding test scripts become invalid, which hinders automation. Substantial effort is therefore invested in maintaining GUI test scripts, and improper decisions often lead to rework or waste. Practitioners have consequently identified a need for decision support on when GUI test automation should begin, how it can be made easier, and which factors lead to waste in GUI-based test automation. The current literature provides solutions for test automation in general but few answers specific to GUI-based test automation. Such generic answers may not apply to GUI test automation, nor to organizations new to GUI development and testing. It is therefore necessary to validate whether the general solutions apply to GUI test automation and to find additional, previously unidentified answers from practitioners' opinions in an industrial context.

Objectives. Capture relevant information about the current approach to GUI test automation within the subsystems of a case company. Then, identify criteria for when to begin automation, testability requirements, and factors associated with waste, from both literature and practice.

Methods. We conducted a multiple-case study to explore the opinions of practitioners in two subsystems at a Swedish telecommunications company implementing GUI test automation. Prior to the case study, we conducted a literature review to identify answers from the scientific literature. A two-phased interview study was performed with different employees to collect their subjective opinions and to gather their views on the evidence collected from the literature. A Bayesian synthesis method was then used to combine the practitioners' subjective opinions with the research-based evidence to produce context-specific results.
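To make the synthesis step concrete, a Bayesian combination of this kind can be read as an application of Bayes' rule, with research-based evidence acting as the prior and practitioners' opinions as the observed data (this is an illustrative sketch of the general idea, not the exact procedure used in the thesis):

P(answer | practitioner opinions) ∝ P(practitioner opinions | answer) × P(answer)

where the prior P(answer) encodes the literature-based evidence for a candidate answer, the likelihood reflects how strongly the interviewed practitioners support it, and the posterior yields the context-specific result.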

Results. We identified 12 criteria for when to begin automation, 16 testability requirements, and 15 factors associated with waste in GUI test automation. Each of them is classified into one of the following categories: SUT-related, test-process-related, test-tool-related, human and organizational, environment, and cross-cutting. We also found new answers that were not present in the existing literature in this research domain.

Conclusions. When validating the answers found in the literature, we found that the answers applicable to software test automation in general are also valid for GUI test automation. Since we incorporated subjective opinions to produce context-specific results, we learned that every practitioner has their own way of working. This study therefore aids in developing a common understanding to support informed subjective decisions based on evidence.

Place, publisher, year, edition, pages
2017, p. 101.
Keywords [en]
GUI test automation, when to automate, testability requirements, automation waste, maintenance
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-15455
OAI: oai:DiVA.org:bth-15455
DiVA, id: diva2:1155511
Subject / course
PA2534 Master's Thesis (120 credits) in Software Engineering
Educational program
PAAXA Master of Science Programme in Software Engineering
Available from: 2017-11-09. Created: 2017-11-08. Last updated: 2018-01-13. Bibliographically approved.

Open Access in DiVA

fulltext (3762 kB), 1165 downloads
File information
File name: FULLTEXT02.pdf
File size: 3762 kB
Checksum (SHA-512): dc2377104290d2696e01f76334dbcd235aaedafe018af2201aed18324fa8b36a0993745e98836374e5b509ee0b6a61ccd32ce61b88e075350fd453fe6d4488f4
Type: fulltext
Mimetype: application/pdf
