Comprehensibility of system models during test design: A controlled experiment comparing UML activity diagrams and state machines
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0003-3818-4442
Herrmann & Ehrlich, DEU.
2018 (English). In: Software quality journal, ISSN 0963-9314, E-ISSN 1573-1367, p. 1-23. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

UML activity diagrams and state machines are both used for modeling system behavior from the user perspective and are frequently the basis for deriving system test cases. In practice, system test cases are often derived manually from UML activity diagrams or state machines. For this task, the comprehensibility of the respective models is essential, and it is a relevant question for practice in order to support model selection and design, as well as subsequent test derivation. The objective of this paper is therefore to compare the comprehensibility of UML activity diagrams and state machines during manual test case derivation. We investigate the comprehensibility of UML activity diagrams and state machines in a controlled student experiment. Three measures of comprehensibility have been investigated: (1) the self-assessed comprehensibility, (2) the actual comprehensibility measured by the correctness of answers to comprehension questions, and (3) the number of errors made during test case derivation. The experiment was performed and internally replicated with a total of 84 participants divided into three groups at two institutions. Our experiment indicates that activity diagrams are more comprehensible but also more error-prone with regard to manual test case derivation; we discuss how these results can improve system modeling and test case design.

Place, publisher, year, edition, pages
Springer, 2018. p. 1-23
Keywords [en]
UML models, System testing, System models, Test design, Model comprehensibility, Controlled experiment
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-16116
DOI: 10.1007/s11219-018-9407-9
OAI: oai:DiVA.org:bth-16116
DiVA id: diva2:1201151
Note

open access

Available from: 2018-04-24. Created: 2018-04-24. Last updated: 2018-05-04. Bibliographically approved.

Open Access in DiVA

fulltext (766 kB), 18 downloads
File information
File name: FULLTEXT01.pdf
File size: 766 kB
Checksum (SHA-512): 86f5884e9d7ab809cdda6587c01de46601521150b0b5e29be0e1aff3972095349a292cfcfd8dc311945235c952c19b2fdccb947704807a57d5d11dc97de83636
Type: fulltext. Mimetype: application/pdf

Other links

Publisher's full text: Comprehensibility of system models during test design: a controlled experiment comparing UML activity diagrams and state machines

Authority records

Felderer, Michael
