Human Errors and Learnability Evaluation of Authentication System
Blekinge Institute of Technology, School of Computing.
Blekinge Institute of Technology, School of Computing.
2011 (English). Independent thesis, Advanced level (degree of Master, Two Years). Student thesis.
Abstract [en]

Usability studies are important in today's context. However, raising the security level of authentication systems tends to reduce their usability, so providing authentication systems that are both secure and usable remains an open challenge for researchers. Learnability and human errors are influential factors in the usability of authentication systems, yet few studies have examined them specifically in this context. The aim of this study is to explore human errors and the learnability of authentication systems, in order to contribute to the development of more usable systems. To achieve this aim, the authors first conducted a systematic literature review to gain knowledge of the state of the art. They then carried out a direct investigation, in the form of observations and interviews, to gather further data; a minimalist portable test lab was developed so that the observations could be conducted in a controlled environment. The collected data was analyzed and interpreted to identify and assess human errors and learnability issues. The study concludes with a list of the identified human errors and learnability issues, together with recommendations that the authors believe will help researchers improve the overall usability of authentication systems.

Extended abstract [en]

This study addressed the usability experiences of users by exploring human errors and the learnability of authentication systems. The authors conducted a case study, gathering data through observations and interviews, and analyzed it using SHERPA (the Systematic Human Error Reduction and Prediction Approach, to evaluate human errors) and the learnability metrics of Grossman et al. (to evaluate learnability). From the raw data, the authors first identified the human errors and learnability issues of the authentication systems from the users' perspective. Further analysis of the summarized data then identified the features of the authentication systems that influence these errors and issues. The authors also compared two categories of authentication systems, 1-factor and multi-factor, on the basis of the gathered information. Finally, they argued for possible updates to SHERPA's human error taxonomy and for additional measurable learnability metrics beyond those of Grossman et al. The studied authentication systems are not free of human errors: the authors identified eight human errors and three system features that influence them. These errors occurred when participants took too long to locate the login menu or button or to select the correct login method, and consequently took too long to log in; errors also occurred when participants failed to operate the code-generating devices, failed to retrieve information from error messages or supporting documents, and in some cases failed to log in at all. Because these human errors are identifiable and predictable through SHERPA, they can also be addressed.
The authors also found that the studied authentication systems have learnability problems, and identified nine learnability issues associated with them. These issues surfaced when very few users could complete the task optimally, or complete it without help from the documentation; further issues emerged when analyzing participants' task completion times after reviewing the documentation, their operation of the code-generating devices, and their average number of errors while performing the task. The issues were identified through the Grossman et al. learnability metrics, and the authors believe that further study of them can improve the learnability of authentication systems. Overall, more research should be conducted on the identified human errors and learnability issues to improve the current situation of the studied systems, and these issues should also be taken into account when developing future authentication systems. The authors believe the outcome of this study will help researchers propose authentication systems that are more usable yet still secure, and they propose several potential research areas that they believe can make an important contribution to current knowledge. Although SHERPA (and its metrics) is arguably one of the best methods for evaluating human errors, the authors see scope for improvement: human perception and knowledge change over time, and SHERPA's human error taxonomy can be updated to meet that challenge. Likewise, improving the current Grossman et al. metrics and adding new ones may reveal further learnability issues.
The evaluation of learnability issues would also be easier if researchers agreed on a single definition of learnability; the authors therefore recommend further work toward a more widely accepted definition for future research. Finally, more studies should address remedial strategies for the identified human errors and improvements for the identified learnability issues, which the authors believe will help researchers propose authentication systems for the future that are both more usable and secure.
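The learnability indicators described above (share of users completing the task optimally, share completing without documentation, completion time, and average errors) can be sketched as simple aggregate statistics over per-participant observations. The following Python sketch is illustrative only: the record fields and sample data are hypothetical, not taken from the thesis.

```python
# Minimal sketch of Grossman-et-al.-style learnability indicators over
# hypothetical usability-session observations. All field names and sample
# values below are invented for illustration.
from dataclasses import dataclass
from statistics import mean


@dataclass
class Observation:
    participant: str
    completed_optimally: bool    # finished via the shortest correct path
    used_documentation: bool     # needed help/supporting documents
    completion_time_s: float     # time to log in, in seconds
    error_count: int             # observed slips/mistakes during the task


def learnability_indicators(obs: list[Observation]) -> dict[str, float]:
    """Aggregate per-participant observations into the four indicators
    mentioned in the abstract."""
    n = len(obs)
    return {
        "optimal_completion_rate": sum(o.completed_optimally for o in obs) / n,
        "doc_free_completion_rate": sum(not o.used_documentation for o in obs) / n,
        "mean_completion_time_s": mean(o.completion_time_s for o in obs),
        "mean_error_count": mean(o.error_count for o in obs),
    }


# Hypothetical sessions for three participants:
sessions = [
    Observation("P1", False, True, 142.0, 3),
    Observation("P2", True, False, 61.5, 0),
    Observation("P3", False, True, 120.0, 2),
]
print(learnability_indicators(sessions))
```

Low optimal-completion and documentation-free rates, long completion times, or high average error counts would each flag a learnability issue of the kind the study reports.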

Place, publisher, year, edition, pages
2011, 57 p.
Keyword [en]
Human error identification, learnability evaluation, e-banking authentication
National Category
Production Engineering, Human Work Science and Ergonomics; Human Computer Interaction; Software Engineering
Identifiers
URN: urn:nbn:se:bth-4054
Local ID: oai:bth.se:arkivex9D41190CDC7908FBC12579270041B18C
OAI: oai:DiVA.org:bth-4054
DiVA: diva2:831374
Uppsok
Technology
Supervisors
Note
30/1, Shideshwari Lane, Shantinagar, Ramna, Dhaka, Bangladesh, Post Code 1217. Contact: +88017130 16973
Available from: 2015-04-22. Created: 2011-10-12. Last updated: 2015-06-30. Bibliographically approved.

Open Access in DiVA

fulltext (1995 kB), 94 downloads
File information
File name: FULLTEXT01.pdf
File size: 1995 kB
Checksum (SHA-512):
62c043c1da69174cd0a77eca9a92df21a2a06ae7d7c1955ff7a9e6e6418b14b030faf13b1a0f6c3f6ace0b0fcb70d99b4cf7e3138814301b6cf9ebc675f02f51
Type: fulltext
Mimetype: application/pdf

By organisation
School of Computing
Production Engineering, Human Work Science and Ergonomics; Human Computer Interaction; Software Engineering
