Teaching students critical appraisal of scientific literature using checklists
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0001-7266-5632
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0002-1532-8223
2018 (English). In: Proceedings of the 3rd European Conference of Software Engineering Education (ECSEE), Association for Computing Machinery, 2018, p. 8-17. Conference paper, Published paper (Refereed)
Abstract [en]

Background: Teaching students to critically appraise scientific literature is an important goal of a postgraduate research methods course. Objective: To investigate whether checklists for assessing the scientific rigor of empirical studies support students in reviewing case study research and experiments. Methods: We employed an experimental design in which 76 students (in pairs) used two checklists to evaluate two papers each (one reporting a case study, the other an experiment). We compared the students' assessments against ratings from more senior researchers. We also collected data on the students' perception of using the checklists. Results: The consistency of the students' ratings, and their accuracy when compared to the seniors' ratings, varied. One factor seemed to be that the clearer the reporting, the easier it was for students to judge the quality of a study. Students perceived checklist items related to data analysis as difficult to assess. Conclusion: As expected, this study reinforces the need for clear reporting, as it is important that authors write to enable synthesis and quality assessment. With clearer reporting, the novices performed well in assessing the quality of the empirical work, which supports the continued use of checklists in the course as a means of introducing scientific reviews. © 2018 Association for Computing Machinery.
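The study compares student checklist ratings against ratings from senior researchers, but this record does not state which agreement statistic was used. The following Python sketch is purely illustrative (not the authors' method): it computes percentage agreement and Cohen's kappa over per-item ratings, and the "yes"/"partial"/"no" rating scale is an assumption.

```python
# Hypothetical sketch: agreement between student and senior checklist ratings.
# The statistic and rating scale are assumptions, not taken from the paper.
from collections import Counter

def percent_agreement(a: list[str], b: list[str]) -> float:
    """Share of checklist items on which the two raters gave the same rating."""
    assert len(a) == len(b) and a
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a: list[str], b: list[str]) -> float:
    """Chance-corrected agreement between two raters on categorical ratings."""
    n = len(a)
    p_obs = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Expected agreement if both raters assigned categories independently.
    p_exp = sum(ca[c] * cb[c] for c in set(a) | set(b)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Example: per-item ratings for one reviewed paper (invented data).
students = ["yes", "yes", "partial", "no", "yes", "partial"]
seniors  = ["yes", "partial", "partial", "no", "yes", "yes"]
print(f"agreement: {percent_agreement(students, seniors):.2f}")  # 0.67
print(f"kappa:     {cohens_kappa(students, seniors):.2f}")       # 0.45
```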

Place, publisher, year, edition, pages
Association for Computing Machinery, 2018. p. 8-17
Keywords [en]
Case study, Checklist, Critical appraisal, Experiment, Student, Design of experiments, Engineering education, Experiments, Software engineering, Teaching, Case study research, Continued use, Empirical studies, Post-graduate research, Quality assessment, Scientific literature, Students
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-16892
DOI: 10.1145/3209087.3209099
ISI: 000478670000002
Scopus ID: 2-s2.0-85049867400
ISBN: 9781450363839
OAI: oai:DiVA.org:bth-16892
DiVA, id: diva2:1239991
Conference
3rd European Conference of Software Engineering Education, ECSEE, Seeon Monastery, Germany
Available from: 2018-08-20. Created: 2018-08-20. Last updated: 2021-06-11. Bibliographically approved
In thesis
1. Views of Research Quality in Empirical Software Engineering
2019 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Background. Software Engineering (SE) research, like other applied disciplines, aims to provide trustworthy evidence to practice. Ensuring trustworthy evidence requires a rigorous research process based on sound research methodologies. Furthermore, to be practically relevant, researchers must identify original research problems that are of interest to industry, and the research must fulfill the various quality standards that form the basis for evaluating empirical research in SE. A dialogue on, and a shared view of, quality standards for research practice have yet to be achieved within the research community.

Objectives. The main objective of this thesis is to foster dialogue and capture SE researchers' different views at the method level (e.g., by identifying and reasoning about the importance of quality characteristics for experiments, surveys, and case studies) as well as on general quality standards for Empirical Software Engineering (ESE). Given these views of research quality, a second objective is to understand how to operationalize them, i.e., how to build and validate instruments for assessing research quality.

Method. The thesis uses a mixed-methods approach of both a qualitative and a quantitative nature. The research methods used were case studies, surveys, and focus groups. A range of data collection methods was employed, such as literature reviews, questionnaires, and semi-structured workshops. To analyze the data, we used content and thematic analysis as well as descriptive and inferential statistics.

Results. We draw two distinct views of research quality. Through a top-down approach, we assessed and evolved a conceptual model of research quality within the ESE research community. Through a bottom-up approach, we built a checklist instrument for assessing survey-based research, grounded in the supporting literature, and evaluated our own and others' checklists in research-practice and research-education contexts.

Conclusion. The quality standards we identified and operationalized support and extend the current understanding of research quality in SE research. This is a preliminary, but still vital, step towards a shared view of research quality in ESE research; further steps are needed before such a shared understanding is established within the community.

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2019
Series
Blekinge Institute of Technology Doctoral Dissertation Series, ISSN 1653-2090 ; 7
Keywords
Research Quality, Quality Standards, Empirical Software Engineering, Research Methodology
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-17648
ISBN: 978-91-7295-372-7
Public defence
2019-06-14, J1650, Campus Gräsvik, Karlskrona, 13:00 (English)
Available from: 2019-03-05. Created: 2019-02-27. Last updated: 2019-05-09. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Molléri, Jefferson Seide; Ali, Nauman bin; Petersen, Kai; Minhas, Tahir Nawaz; Chatzipetrou, Panagiota
