Towards a methodology for participant selection in software engineering experiments: a vision of the future
LUT University, FIN.
Universidad Politecnica de Madrid, ESP.
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. ORCID iD: 0000-0002-0679-4361
Universidad Politecnica de Madrid, ESP.
2021 (English) In: International Symposium on Empirical Software Engineering and Measurement, IEEE Computer Society, 2021, article id 35. Conference paper, Published paper (Refereed)
Abstract [en]

Background. Software Engineering (SE) researchers extensively perform experiments with human subjects. Well-defined samples are required to ensure external validity; however, samples are typically selected purposively or by convenience, limiting the generalizability of results. Objective. We aim to depict the current status of participant selection in empirical SE, identifying the main threats and how they are mitigated, and to draft a robust approach to participant selection. Method. We reviewed existing participant-selection guidelines in SE and performed a preliminary literature review to find out how participant selection is conducted in SE in practice. Results. We outline a new selection methodology by (1) defining the characteristics of the desired population, (2) locating possible sources of sampling available to researchers, and (3) identifying and reducing the "distance" between the selected sample and its corresponding population. Conclusion. We propose a roadmap to develop and empirically validate the selection methodology. © 2021 IEEE Computer Society. All rights reserved.
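To make the "distance" notion in step (3) concrete, the following is a minimal, hypothetical sketch in Python; it is not taken from the paper. It compares the category proportions of a convenience sample against an assumed target-population profile using total variation distance. The attribute names (experience, role), category sets, and population shares are invented for illustration only.

# Hypothetical sketch: quantifying the "distance" between a convenience sample
# and an assumed target developer population over a few illustrative attributes.
# All attribute names, categories, and population shares below are assumptions
# made for this example; they are not defined in the paper.

from collections import Counter

def category_distance(sample_values, population_shares):
    """Total variation distance between the sample's category proportions
    and the assumed population shares (0 = identical, 1 = disjoint)."""
    counts = Counter(sample_values)
    n = len(sample_values)
    categories = set(population_shares) | set(counts)
    return 0.5 * sum(
        abs(counts.get(c, 0) / n - population_shares.get(c, 0.0))
        for c in categories
    )

# Assumed target population profile (share of each category).
population = {
    "experience": {"<2y": 0.25, "2-5y": 0.35, ">5y": 0.40},
    "role":       {"developer": 0.70, "tester": 0.20, "manager": 0.10},
}

# A convenience sample, e.g., participants readily available to the researchers.
sample = [
    {"experience": "<2y",  "role": "developer"},
    {"experience": "<2y",  "role": "developer"},
    {"experience": "2-5y", "role": "tester"},
    {"experience": "<2y",  "role": "developer"},
]

for attribute, shares in population.items():
    d = category_distance([p[attribute] for p in sample], shares)
    print(f"{attribute}: distance from target population = {d:.2f}")

Running the sketch prints one value per attribute between 0 (the sample matches the assumed population profile) and 1 (completely disjoint); larger values flag the characteristics on which the sample deviates most and would either need re-balancing or explicit reporting as a threat to validity.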

Place, publisher, year, edition, pages
IEEE Computer Society, 2021. Article id 35
Series
International Symposium on Empirical Software Engineering and Measurement, ISSN 1949-3770, E-ISSN 1949-3789 ; 15
Keywords [en]
Controlled Experiment, Empirical Software Engineering, Generalizability, Participant Selection, Threats to Validity, Current status, External validities, Human subjects, Robust approaches, Software engineering experiments, Threat to validity, Software engineering
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-22295
DOI: 10.1145/3475716.3484273
Scopus ID: 2-s2.0-85117919005
ISBN: 9781450386654 (print)
OAI: oai:DiVA.org:bth-22295
DiVA, id: diva2:1609378
Conference
15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021), virtual/online, 11-15 October 2021
Note

open access

Available from: 2021-11-08 Created: 2021-11-08 Last updated: 2022-12-02. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus
arXiv.org

Authority records

Fucci, Davide
