The effect of requests for user feedback on Quality of Experience
Fotrousi, Farnaz: Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering, SE-371 79 Karlskrona, Sweden; University of Applied Sciences and Arts Northwestern Switzerland (FHNW), School of Engineering, CH-5210 Windisch, Switzerland.
Fricker, Samuel: Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering, SE-371 79 Karlskrona, Sweden; University of Applied Sciences and Arts Northwestern Switzerland (FHNW), School of Engineering, CH-5210 Windisch, Switzerland. ORCID iD: 0000-0001-7368-4448
Fiedler, Markus: Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics, SE-371 79 Karlskrona, Sweden. ORCID iD: 0000-0001-8929-4911
2018 (English). In: Software Quality Journal, ISSN 0963-9314, E-ISSN 1573-1367, Vol. 26, no. 2, p. 385-415. Article in journal (Refereed). Published.
Abstract [en]

Companies are interested in knowing how users experience and perceive their products. Quality of Experience (QoE) is a measurement used to assess the degree of delight or annoyance in experiencing a software product. To assess QoE, we used a feedback tool integrated into a software product to ask users for their QoE ratings and to obtain information about their rationales for good or bad QoE. It is known that requests for feedback may disturb users; however, little is known about the subjective reasoning behind this disturbance or about whether the disturbance negatively affects the QoE of the software product for which the feedback is sought. In this paper, we present a mixed qualitative-quantitative study with 35 subjects that explores the relationship between feedback requests and QoE. The subjects experienced a requirement-modeling mobile product integrated with a feedback tool. During and at the end of the experience, we collected the users' perceptions of the product and of the feedback requests. Based on the users' rationales for being disturbed by the feedback requests, such as "early feedback," "interruptive requests," "frequent requests," and "apparently inappropriate content," we modeled feedback requests. The model defines a feedback request as a five-tuple of variables: the "task," the "timing" of the feedback request relative to the task, the user's "expertise-phase" with the product, the "frequency" of feedback requests about the task, and the "content" of the feedback request. The configuration of these parameters may drive the participants' perceived disturbance. We also found that the disturbances generated by triggering user feedback requests have a negligible impact on the QoE of software products. These results imply that software product vendors may trust users' feedback even when the feedback requests disturb the users.
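
For illustration only, the five-tuple model described in the abstract can be sketched as a small data structure. The following Python sketch uses class and field names of our own choosing; the enumeration values and the example request are assumptions, not taken from the paper.

from dataclasses import dataclass
from enum import Enum


class ExpertisePhase(Enum):
    """The user's phase of expertise with the product when feedback is requested (illustrative values)."""
    NOVICE = "novice"
    INTERMEDIATE = "intermediate"
    EXPERT = "expert"


@dataclass(frozen=True)
class FeedbackRequest:
    """Five-tuple describing a single in-product feedback request."""
    task: str                         # the task about which feedback is requested
    timing: str                       # when, relative to the task, the request is issued
    expertise_phase: ExpertisePhase   # the user's expertise phase with the product
    frequency: int                    # how often requests about this task are issued
    content: str                      # what the feedback request asks the user


# One possible configuration of the five parameters (hypothetical example)
request = FeedbackRequest(
    task="model a requirement",
    timing="after task completion",
    expertise_phase=ExpertisePhase.NOVICE,
    frequency=1,
    content="Please rate your experience with this task.",
)
print(request)

In this reading, a vendor could vary one parameter at a time, for example the frequency, to study its effect on perceived disturbance.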

Place, publisher, year, edition, pages
Springer, 2018. Vol. 26, no. 2, p. 385-415.
Keywords [en]
Quality of experience, QoE, User feedback, User perception, Human factors
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-16526. DOI: 10.1007/s11219-017-9373-7. ISI: 000433521200007. OAI: oai:DiVA.org:bth-16526. DiVA, id: diva2:1220006
Available from: 2018-06-18. Created: 2018-06-18. Last updated: 2021-05-04. Bibliographically approved.
In thesis
1. Combining User Feedback and Monitoring Data to Support Evidence-based Software Evolution
2020 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Context. Companies continuously explore their software systems to acquire evidence for software evolution, such as bugs in the system and new functional or quality requirements. So far, managers have made decisions about software evolution based on evidence gathered by interpreting user feedback and monitoring data collected separately from software in use. These evidence-collection processes are usually unmethodical, lack a systematic guide, and have practical issues. This lack of a systematic approach leaves unexploited opportunities for detecting evidence for system evolution.

Objective. The main research objective is to improve evidence collection from software in use and to guide software practitioners in decision-making about system evolution. Understanding useful approaches for collecting user feedback and monitoring data, two important sources of evidence, and for combining them is a key objective as well.

Method. We proposed a method for gathering evidence from software in use (GESU) using design-science research. We designed the method over three iterations and validated it in the European case studies FI-Start, Supersede, and Wise-IoT. To acquire knowledge for the design, we conducted further research using surveys and systematic mapping methods.

Results. The results show that GESU is not only successful in industrial environments but also yields new evidence for software evolution by bringing user feedback and monitoring data together. This combination helps software practitioners improve their understanding of end-user needs and system drawbacks, ultimately supporting continuous requirements elicitation and product evolution. GESU suggests monitoring a software system based on its goals to filter relevant data (i.e., goal-driven monitoring) and gathering user feedback when the system requests feedback about the software in use (i.e., system-triggered user feedback). The system identifies interesting situations of system use and issues automated requests for user feedback to interpret the evidence from the users' perspective. We justified the use of goal-driven monitoring and system-triggered user feedback with complementary findings of the thesis, which showed that the goals and characteristics of software systems constrain the monitoring data. We thus narrowed the monitoring and observational focus to data aligned with the goals instead of a massive amount of potentially useless data. Finally, we found that requesting feedback from users with a simple feedback form is a useful approach for motivating users to provide feedback.

Conclusion. Combining user feedback and monitoring data helps acquire insights into the success of a software system and guides decision-making regarding its evolution. This work can be extended in the future by implementing an adaptive system for gathering evidence from combined user feedback and monitoring data.
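
For illustration only, the combination of goal-driven monitoring and system-triggered user feedback summarized above might look roughly as follows. This Python sketch is our own; the class names, the response-time metric, and the trigger rule are assumptions, not taken from the thesis.

from dataclasses import dataclass, field
from typing import List


@dataclass
class MonitoringEvent:
    """One observation collected from the software in use."""
    metric: str
    value: float


@dataclass
class GoalDrivenMonitor:
    """Keeps only the monitoring data that relates to an explicit system goal."""
    goal_metrics: set
    events: List[MonitoringEvent] = field(default_factory=list)

    def record(self, event: MonitoringEvent) -> None:
        # Goal-driven filtering: ignore data not aligned with the stated goals.
        if event.metric in self.goal_metrics:
            self.events.append(event)


def interesting_situation(events: List[MonitoringEvent]) -> bool:
    """Hypothetical trigger rule: three or more responses slower than two seconds."""
    slow = [e for e in events if e.metric == "response_time_ms" and e.value > 2000]
    return len(slow) >= 3


def request_user_feedback(prompt: str) -> None:
    """Stand-in for a simple in-product feedback form."""
    print(f"[feedback request] {prompt}")


# Combine goal-driven monitoring data with a system-triggered feedback request.
monitor = GoalDrivenMonitor(goal_metrics={"response_time_ms"})
for value in (2500.0, 1800.0, 2600.0, 3100.0):
    monitor.record(MonitoringEvent("response_time_ms", value))

if interesting_situation(monitor.events):
    request_user_feedback("We noticed slow responses. How was your experience just now?")

In this reading, the monitor narrows data collection to goal-related metrics, and the feedback request is issued only when the data indicate an interesting situation worth interpreting from the user's perspective.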

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2020
Series
Blekinge Institute of Technology Doctoral Dissertation Series, ISSN 1653-2090; 4
Keywords
User feedback, Monitoring data, Evidence-based software engineering, Software evolution
National Category
Software Engineering
Research subject
Software Engineering
Identifiers
urn:nbn:se:bth-19397 (URN). 978-91-7295-402-1 (ISBN).
Available from: 2020-04-30. Created: 2020-04-29. Last updated: 2020-12-14. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authority records

Fotrousi, Farnaz; Fricker, Samuel; Fiedler, Markus
