A Quality Assessment Instrument for Systematic Literature Reviews in Software Engineering
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0002-8132-0107
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0001-7266-5632
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0003-0460-5253
2023 (English). In: e-Informatica Software Engineering Journal, ISSN 1897-7979, E-ISSN 2084-4840, Vol. 17, no 1, article id 230105. Article in journal (Refereed). Published.
Abstract [en]

Background: Systematic literature reviews (SLRs) have become a standard practice in software engineering (SE) research, although their quality varies. To build on these reviews, both for future research and for industry practice, they need to be of high quality.
Aim: To assess the quality of SLRs in SE, we put forward an appraisal instrument for SLRs.
Method: A well-established appraisal instrument from healthcare research was used as a starting point for developing the instrument. It was adapted to SE using guidelines, checklists, and experiences from SE. The first version was reviewed by four external experts on SLRs in SE and updated based on their feedback. To demonstrate its use, the updated version was also applied by the authors to assess a sample of six selected systematic literature studies.
Results: The outcome of the research is an appraisal instrument for the quality assessment of SLRs in SE. The instrument includes 15 items with different response options to capture quality. It also supports consolidating the items into groups, which are then used to assess the overall quality of an SLR.
Conclusion: The presented instrument may provide helpful support for an appraiser assessing the quality of SLRs in SE.
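The consolidation idea described in the abstract (per-item ratings rolled up into group ratings, which then determine an overall verdict) can be sketched as follows. Note that the item names, the rating scale, and the decision rules below are illustrative assumptions for the sketch only; they are not the 15 items or the actual rules of the published instrument.

```python
# Hypothetical sketch of consolidating appraisal-item ratings into group
# ratings and an overall verdict. Names, scale, and thresholds are
# illustrative assumptions, not the published instrument's rules.
from enum import IntEnum

class Rating(IntEnum):
    NO = 0
    PARTIAL = 1
    YES = 2

def consolidate(group_items: dict) -> dict:
    # Illustrative rule: a group is only as strong as its weakest item.
    return {group: min(items) for group, items in group_items.items()}

def overall(group_ratings: dict) -> str:
    # Illustrative rule: all groups YES -> high; any group NO -> low.
    values = list(group_ratings.values())
    if all(r == Rating.YES for r in values):
        return "high"
    if any(r == Rating.NO for r in values):
        return "low"
    return "moderate"

ratings = {
    "search": [Rating.YES, Rating.PARTIAL],   # hypothetical group/items
    "selection": [Rating.YES, Rating.YES],
}
groups = consolidate(ratings)
print(groups["search"].name)  # PARTIAL
print(overall(groups))        # moderate
```

Using `IntEnum` keeps the ratings ordered, so `min()` naturally picks the weakest item in a group without extra comparison logic.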

Place, publisher, year, edition, pages
Wroclaw University of Technology, 2023. Vol. 17, no 1, article id 230105
Keywords [en]
Systematic reviews, quality assessment, critical appraisal, AMSTAR 2, systematic literature review, tertiary study
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-24415
DOI: 10.37190/e-Inf230105
ISI: 000944209900001
Scopus ID: 2-s2.0-85152967598
OAI: oai:DiVA.org:bth-24415
DiVA, id: diva2:1749011
Part of project
VITS - Visualisation of test data for decision support, Knowledge Foundation
OSIR - Open Source Inspired Reuse, Knowledge Foundation
Funder
Knowledge Foundation, 20180127
Knowledge Foundation, 20190081
ELLIIT - The Linköping-Lund Initiative on IT and Mobile Communications
Available from: 2023-04-05. Created: 2023-04-05. Last updated: 2023-04-28. Bibliographically approved.

Open Access in DiVA

fulltext (694 kB), 589 downloads
File information
File name: FULLTEXT01.pdf
File size: 694 kB
Checksum (SHA-512): 87105354c1de72e9798f91e5dd2187ee78aad03f63d8f05fdc05fd0c8e7e26cda428206bef9fcc8b7ee9df8f66acc8fd797b70f3f81912d4b8c0e31b053ae899
Type: fulltext
Mimetype: application/pdf
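The SHA-512 checksum above can be used to verify the integrity of a downloaded copy of the full text. A minimal check in Python (assuming the local file is saved under the listed name FULLTEXT01.pdf):

```python
# Verify a downloaded file against the SHA-512 checksum listed in the record.
import hashlib

def sha512_of(path: str) -> str:
    # Hash the file in chunks so large PDFs do not need to fit in memory.
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = (
    "87105354c1de72e9798f91e5dd2187ee78aad03f63d8f05fdc05fd0c8e7e26cd"
    "a428206bef9fcc8b7ee9df8f66acc8fd797b70f3f81912d4b8c0e31b053ae899"
)

if __name__ == "__main__":
    # Local file name assumed to match the record's listing.
    print(sha512_of("FULLTEXT01.pdf") == EXPECTED)
```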

Other links

Publisher's full text
Scopus

Authority records

Usman, Muhammad; Ali, Nauman bin; Wohlin, Claes

Total: 589 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.