A Comparison of Citation Sources for Reference and Citation-Based Search in Systematic Literature Reviews
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0001-7266-5632
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0003-2300-068x
2022 (English). In: e-Informatica Software Engineering Journal, ISSN 1897-7979, E-ISSN 2084-4840, Vol. 16, no 1, article id 220106. Article, review/survey (Refereed). Published.
Abstract [en]

Context: In software engineering, snowball sampling has been used as a supplementary and primary search strategy. The current guidelines recommend using Google Scholar (GS) for snowball sampling. However, GS presents several challenges when used as a source for citations and references. Objective: To compare the effectiveness and usefulness of two leading citation databases (GS and Scopus) for use in snowball sampling search. Method: We relied on a published study that used snowball sampling as a search strategy and GS as the citation source. We used its primary studies to compute precision and recall for Scopus. Results: In this particular case, Scopus was highly effective with 95% recall and had better precision of 5.1% compared to GS’s 2.8%. Moreover, Scopus found nine additional relevant papers. On average, one would read approximately 15 extra papers in GS compared to Scopus to identify one additional relevant paper. Furthermore, Scopus supports batch downloading of both citations and papers’ references, has better-quality metadata, and provides better source filtering. Conclusion: This study suggests that Scopus seems to be more effective and useful for snowball sampling than GS for systematic secondary studies attempting to identify peer-reviewed literature. © 2022 The Authors.
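The comparison above rests on standard retrieval metrics: precision (relevant retrieved / total retrieved), recall (relevant retrieved / total relevant), and, implicitly, the number of retrieved papers that must be screened per relevant paper found, roughly 1/precision. A minimal sketch of that arithmetic using the percentages reported in the abstract (the function name and rounding are illustrative, not taken from the paper):

    # Sketch of the screening-effort arithmetic behind the abstract's figures.
    # The percentages come from the abstract; everything else is illustrative.

    def papers_screened_per_relevant(precision: float) -> float:
        """Approximate number of retrieved papers screened per relevant paper found."""
        return 1.0 / precision

    precision_scopus = 0.051   # 5.1% precision reported for Scopus
    precision_gs = 0.028       # 2.8% precision reported for Google Scholar

    effort_scopus = papers_screened_per_relevant(precision_scopus)  # ~19.6 papers
    effort_gs = papers_screened_per_relevant(precision_gs)          # ~35.7 papers

    # With these rounded percentages the gap is about 16 papers, in line with the
    # abstract's "approximately 15 extra papers" read in GS per relevant paper.
    print(f"Extra papers screened in GS per relevant paper: {effort_gs - effort_scopus:.1f}")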

Place, publisher, year, edition, pages
Wroclaw University of Technology, 2022. Vol. 16, no 1, article id 220106
Keywords [en]
Software engineering, Citation-based, Google scholar, Primary search, Reference-based, Search strategies, Snowball sampling, Snowballing, Systematic literature review, Systematic mapping, Systematic Review, Paper, Search strategy
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-23036
DOI: 10.37190/e-Inf220106
ISI: 001126480300001
Scopus ID: 2-s2.0-85130247854
OAI: oai:DiVA.org:bth-23036
DiVA, id: diva2:1663616
Part of project
VITS - Visualisation of test data for decision support, Knowledge Foundation
SERT - Software Engineering ReThought, Knowledge Foundation
Funder
Knowledge Foundation, 20180127
ELLIIT - The Linköping‐Lund Initiative on IT and Mobile Communications
Knowledge Foundation, 20180010
Note

open access

Available from: 2022-06-02. Created: 2022-06-02. Last updated: 2024-08-06. Bibliographically approved.

Open Access in DiVA

fulltext (149 kB)
File information
File name: FULLTEXT01.pdf
File size: 149 kB
Checksum (SHA-512): d813d8f3365ffb988cdb263130cfe4ef6e00c0fc696ed24024ac40505f18033c67f741b5519bf9db0e5ce71cb462cb992bcf9bf45e283f60d8f595b543355ea5
Type: fulltext
Mimetype: application/pdf
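The SHA-512 checksum above can be used to verify the integrity of a downloaded copy of the full text. A minimal sketch, assuming the file has been saved locally as FULLTEXT01.pdf (the checksum and filename are taken from the record; the path and script are illustrative):

    import hashlib

    # Expected SHA-512 digest as listed in the DiVA file information above.
    EXPECTED = (
        "d813d8f3365ffb988cdb263130cfe4ef6e00c0fc696ed24024ac40505f18033c"
        "67f741b5519bf9db0e5ce71cb462cb992bcf9bf45e283f60d8f595b543355ea5"
    )

    def sha512_of(path: str) -> str:
        """Compute the SHA-512 hex digest of a file, reading it in chunks."""
        digest = hashlib.sha512()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Assumes the full text was downloaded to the working directory.
    print("checksum OK" if sha512_of("FULLTEXT01.pdf") == EXPECTED else "checksum MISMATCH")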

Other links

Publisher's full text
Scopus

Authority records

Ali, Nauman bin
Tanveer, Binish
