An Empirically Evaluated Checklist for Surveys in Software Engineering
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0002-1532-8223
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering. Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
2020 (English). In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, article id 106240. Article in journal (Refereed). Published.
Abstract [en]

Context: Over the past decade, Software Engineering research has seen a steady increase in survey-based studies, and several guidelines provide support for those wishing to carry out surveys. The need to audit survey research has been raised in the literature. Checklists have been used to assess other types of empirical studies, such as experiments and case studies.

Objective: This paper proposes a checklist, grounded in existing guidelines for survey research, to support the design and assessment of survey-based research in software engineering. We further evaluated the checklist in a research practice context.

Method: To construct the checklist, we systematically aggregated knowledge from 12 methodological studies supporting survey-based research in software engineering. We identified the key stages of the survey process and their recommended practices through thematic analysis and vote counting. To improve the initial checklist, we evaluated it using a mixed evaluation approach involving experienced researchers.

Results: The evaluation provided insights into the limitations of the checklist regarding its understandability and objectivity. In particular, 19 of the 38 checklist items were improved based on the feedback received during the evaluation. Finally, we discuss how to use the checklist and its implications for research practice.

Conclusion: The proposed checklist is an instrument suitable for auditing survey reports, as well as a support tool for guiding ongoing research through the survey design process.

Place, publisher, year, edition, pages
Elsevier, 2020. Article id 106240.
Keywords [en]
Checklist, Assessment, Survey, Methodology
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-17645
DOI: 10.1016/j.infsof.2019.106240
OAI: oai:DiVA.org:bth-17645
DiVA, id: diva2:1292230
Available from: 2019-02-27. Created: 2019-02-27. Last updated: 2021-06-11. Bibliographically approved.
In thesis
1. Views of Research Quality in Empirical Software Engineering
2019 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Background. Software Engineering (SE) research, like other applied disciplines, aims to provide trustworthy evidence to practice. Ensuring trustworthy evidence requires a rigorous research process based on sound research methodologies. To be practically relevant, researchers must also identify original research problems that are of interest to industry, and the research must fulfill the various quality standards that form the basis for evaluating empirical research in SE. A dialogue and a shared view of quality standards for research practice have yet to be achieved within the research community.

Objectives. The main objective of this thesis is to foster dialogue and capture the different views of SE researchers at the method level (e.g., by identifying and reasoning about the importance of quality characteristics for experiments, surveys, and case studies) as well as on general quality standards for Empirical Software Engineering (ESE). Given these views of research quality, a second objective is to understand how to operationalize them, i.e., to build and validate instruments for assessing research quality.

Method. The thesis uses a mixed-methods approach of both a qualitative and quantitative nature. The research methods used were case studies, surveys, and focus groups. A range of data collection methods was employed, such as literature reviews, questionnaires, and semi-structured workshops. To analyze the data, we used content and thematic analysis, as well as descriptive and inferential statistics.

Results. We draw two distinct views of research quality. Through a top-down approach, we assessed and evolved a conceptual model of research quality within the ESE research community. Through a bottom-up approach, we built a checklist instrument for assessing survey-based research, grounded in the supporting literature, and evaluated our checklist and others' in research practice and research education contexts.

Conclusion. The quality standards we identified and operationalized support and extend the current understanding of research quality for SE research. This is a preliminary, but vital, step towards a shared understanding and view of research quality for ESE research. Further steps are needed to reach that shared understanding within the community.

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2019
Series
Blekinge Institute of Technology Doctoral Dissertation Series, ISSN 1653-2090 ; 7
Keywords
Research Quality, Quality Standards, Empirical Software Engineering, Research Methodology
National Category
Software Engineering
Identifiers
urn:nbn:se:bth-17648 (URN)
978-91-7295-372-7 (ISBN)
Public defence
2019-06-14, J1650, Campus Gräsvik, Karlskrona, 13:00 (English)
Available from: 2019-03-05. Created: 2019-02-27. Last updated: 2019-05-09. Bibliographically approved.

Open Access in DiVA

fulltext (756 kB), 276 downloads
File information
File name: FULLTEXT01.pdf
File size: 756 kB
Checksum (SHA-512): 0d120f4ca2685cc439b3ad86ada477658425aa376848ed959cfb210456c20ee2b4d57cd029b949376b01aff7ada4d66f8a243d9144d8ccab43ddbb32f8e66e9b
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Authority records

Molléri, Jefferson Seide; Petersen, Kai; Mendes, Emilia

Total: 276 downloads. The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are no longer available.
