Preliminary Evaluation of a Survey Checklist in the Context of Evidence-based Software Engineering Education
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0002-1532-8223
Simula Metropolitan Center for Digital Engineering, Norway.
2021 (English). In: ENASE: Proceedings of the 16th International Conference on Evaluation of Novel Approaches to Software Engineering / [ed] Ali, R., Kaindl, H., Maciaszek, L., SciTePress, 2021, p. 437-444. Conference paper, published paper (refereed).
Abstract [en]

Background: To judge evidence, it is important to be able to assess study quality, and checklists are a means to make such assessments more objective. In an earlier study we proposed a checklist for surveys, which was evaluated by experts. Objective: (1) To assess whether the checklist enables students with limited research experience to assess the quality of a research paper consistently and accurately. (2) To elicit qualitative feedback that identifies improvements to the checklist. Method: The students reviewed a survey using the checklist in a one-group posttest-only quasi-experiment. In total, 13 students participated in the context of the course Evidence-based Software Engineering, part of the Information Systems study program at Flensburg University of Applied Sciences. Results: Overall, the students achieved 74% agreement with each other. However, the Kappa values mostly indicated a poor level of agreement according to Fleiss' classification. In addition, the students were quite inaccurate in answering the checklist questions, though they performed well on the questions concerning research objectives and the identification of the population. Conclusion: The findings indicate that the students did not assess reliably; however, further investigations are needed to substantiate the findings.
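The contrast the abstract draws between raw percent agreement and Fleiss' kappa (where values below roughly 0.40 are conventionally read as poor agreement in Fleiss' classification) can be illustrated with a short Python sketch. The rating matrix below is invented for illustration only; it is not the study's data.

```python
# Fleiss' kappa for N subjects rated by n raters into k categories.
# M[i][j] = number of raters who assigned subject i to category j.
def fleiss_kappa(M):
    N = len(M)          # number of subjects (e.g. checklist items)
    n = sum(M[0])       # raters per subject (assumed constant)
    k = len(M[0])       # number of categories
    # Mean observed pairwise agreement P_bar across subjects.
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in M
    ) / N
    # Chance agreement P_e from the marginal category proportions.
    p = [sum(row[j] for row in M) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical data: 4 raters answering 3 yes/no checklist questions.
M = [[4, 0], [3, 1], [2, 2]]
raw_agreement = sum(
    sum(c * (c - 1) for c in row) for row in M
) / (len(M) * 4 * 3)    # pairwise percent agreement
kappa = fleiss_kappa(M)
```

In this toy example the raters agree on about 61% of rating pairs, yet kappa is close to zero because one category dominates the marginals, which is exactly how a high raw agreement (such as the 74% above) can coexist with a poor kappa.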

Place, publisher, year, edition, pages
SciTePress, 2021, p. 437-444.
Keywords [en]
Checklist, Survey, One-group Quasi-experiment, Students
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-22896
DOI: 10.5220/0010496204370444
ISI: 000783843700046
ISBN: 978-989-758-508-1
OAI: oai:DiVA.org:bth-22896
DiVA, id: diva2:1656547
Conference
16th International Conference on Evaluation of Novel Approaches to Software Engineering (ENASE), Virtual, Online, APR 26-27, 2021
Note

open access

Available from: 2022-05-06. Created: 2022-05-06. Last updated: 2022-05-06. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Authority records

Petersen, Kai
