Exploration on Automated Software Requirement Document Readability Approaches
Chen, Mingda; He, Yao
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
2017 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Context. The requirements analysis phase, at the very beginning of the software development process, has been identified as an important phase of the software development lifecycle. The Software Requirement Specification (SRS) is the output of the requirements analysis phase, and its quality factors play an important role in evaluation work. Readability is an important SRS quality factor, but few automated approaches for measuring it are available, because readability depends heavily on readers' perceptions. Low readability of SRS documents has a serious impact on the whole software development process, so effective automated approaches for measuring SRS document readability are urgently needed. Using traditional readability indexes to analyze the readability of SRS documents automatically is a potentially feasible approach; however, its effectiveness has not been systematically evaluated before.
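The abstract does not state which traditional readability indexes were evaluated, so the sketch below is only an illustration of how such an index can be computed automatically: it implements the classic Flesch Reading Ease formula with a naive vowel-group syllable heuristic (the heuristic, the sample requirement text, and all identifiers are assumptions, not taken from the thesis).

    import re

    def count_syllables(word):
        # Assumption: approximate syllables as groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        # Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
        # Higher scores indicate easier text.
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

    # Hypothetical requirement statement used purely for demonstration.
    srs_text = "The system shall authenticate every user before granting access to stored requirements."
    print(round(flesch_reading_ease(srs_text), 1))

Surface formulas of this kind only count words, sentences, and syllables; they do not capture the semantic factors that the thesis conclusions identify as missing from current automated approaches.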

Objectives. In this study, we first aim to understand the readability of texts and to investigate how text readability is scored manually. We then investigate existing automated readability approaches for texts and the theories behind them. Next, we evaluate how effective these automated readability approaches are at measuring the readability of SRS documents. Finally, we rank the automated approaches by their effectiveness.

Methods. To find out how humans score the readability of texts manually and to investigate existing automated readability approaches for texts, a systematic literature review is chosen as the research methodology. An experiment is chosen to explore the effectiveness of the automated readability approaches.

Results. The systematic literature review yields 67 articles. According to the review, human judgment of readability through reading is the most common way of scoring text readability manually. In addition, we identify four available automated readability assessment tools and seven available automated readability assessment formulas. After executing the experiment, we find that the actual effectiveness of all selected approaches is not high, and that Coh-Metrix shows the highest effectiveness among the selected approaches.

Conclusions. Coh-Metrix is the most effective automated readability approach among those selected, but directly applying Coh-Metrix to SRS document readability assessment cannot be endorsed, since its evaluated effectiveness is not high enough. In addition, all selected approaches are based on surface readability metrics; no semantic factors are blended into the readability assessment. Hence, further quantifying human perception and adding semantic analysis to SRS document readability assessment could be two future research directions.

Place, publisher, year, edition, pages
2017.
Keyword [en]
Readability Measurement, Software Requirement Specification, Automated Approach
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-14816
OAI: oai:DiVA.org:bth-14816
DiVA: diva2:1118454
Subject / course
PA2534 Master's Thesis (120 credits) in Software Engineering
Educational program
PAAXA Master of Science Programme in Software Engineering
Available from: 2017-07-03. Created: 2017-06-30. Last updated: 2017-07-03. Bibliographically approved.

Open Access in DiVA

fulltext (2439 kB)
File name: FULLTEXT02.pdf
File size: 2439 kB
Checksum SHA-512: 95cf626fc75ae8fffe02133c3ba4420cce29c4d8abc2da72d1246760fc31244af6dbc67c6bd9b44fc3f0287831e975721c65ff6d36519f3afa6671a8d9a7c66d
Type: fulltext
Mimetype: application/pdf
