Using BERT to Measure Objective Quality of REST API Specifications: Automated Approach for Quality Measurement
Eriksson, Fritz (Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering)
Åkesson, Max (Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering)
2023 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

Each day, the number of network-based applications grows, and with it the number of RESTful API implementations. Every such API needs documentation of its behavior, its benefits, how it interacts with other APIs, and its expected results. To meet this need, an API specification is constructed: a document that captures the design philosophy of the APIs and can act as a guideline for how they should be built. When designing API specifications, however, it is often difficult to judge what objective quality the document upholds.

To measure the objective quality of an API specification, it must first be defined what good objective quality means in this regard. We used static checks (linter rules) mapped to three quality attributes that reflect the industry's consensus on the most important attributes a good-quality API must satisfy. We then implemented an automatic process that splits API specifications into positive and negative training data based on the results of these linter rules, and used the resulting data to train our BERT model. The model can then assign an objective score to unseen API specifications. Finally, we used a saliency map (textual heatmap) to understand BERT's decisions, which opened the possibility of deriving new linter rules from the results.
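As a rough illustration of the labelling-and-training pipeline the abstract describes (not taken from the thesis itself), the sketch below fine-tunes a BERT classifier on linter-labelled specification fragments with the Hugging Face transformers library; the file name lint_report.json, the fragment splitting, and the scoring helper are assumptions made for the example.

```python
# Illustrative sketch only: label spec fragments with a linter verdict,
# fine-tune BERT as a binary classifier, and score unseen fragments.
import json
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import BertTokenizerFast, BertForSequenceClassification

class SpecDataset(Dataset):
    """Pairs each spec fragment with label 1 (no linter violations) or 0 (violations)."""
    def __init__(self, fragments, labels, tokenizer):
        self.enc = tokenizer(fragments, truncation=True, padding=True, return_tensors="pt")
        self.labels = torch.tensor(labels)
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = self.labels[i]
        return item

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Hypothetical linter output: a list of {"fragment": str, "violations": int}.
with open("lint_report.json") as f:
    report = json.load(f)
fragments = [r["fragment"] for r in report]
labels = [0 if r["violations"] else 1 for r in report]

loader = DataLoader(SpecDataset(fragments, labels, tokenizer), batch_size=8, shuffle=True)
optim = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for batch in loader:  # single epoch, for brevity
    optim.zero_grad()
    out = model(**batch)
    out.loss.backward()
    optim.step()

def score_spec(text: str) -> float:
    """Probability that a fragment is 'good' according to the fine-tuned model."""
    model.eval()
    with torch.no_grad():
        logits = model(**tokenizer(text, truncation=True, return_tensors="pt")).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()
```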

After testing unseen API specifications on our BERT model, we saw that it was able to generate a reasonable quality score. However, when feeding it smaller features to generate a textual heatmap, the model's predictions were not correct, so our implementation did not make it possible to understand BERT's decisions. Consequently, new linter rules could not be derived from reviewing BERT's results.
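A textual heatmap of the kind discussed above could, for instance, be derived from input-embedding gradients. The sketch below is only an illustration of that general technique, not the thesis's implementation; a freshly loaded bert-base-uncased classifier stands in for the fine-tuned model, and the example sentence is invented.

```python
# Illustrative gradient-based token saliency for a BERT classifier.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

def token_saliency(text: str):
    """Return (token, score) pairs; the score is the gradient norm of the
    predicted-class logit with respect to that token's input embedding."""
    enc = tokenizer(text, truncation=True, return_tensors="pt")
    # Embed the ids manually so gradients can be taken w.r.t. the embeddings.
    embeds = model.bert.embeddings.word_embeddings(enc["input_ids"])
    embeds.retain_grad()
    logits = model(inputs_embeds=embeds, attention_mask=enc["attention_mask"]).logits
    pred = logits.argmax(dim=-1).item()
    logits[0, pred].backward()
    scores = embeds.grad.norm(dim=-1).squeeze(0)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    return list(zip(tokens, scores.tolist()))

for tok, score in token_saliency("GET /users returns 200 with a JSON array of users"):
    print(f"{tok:>12s}  {score:.4f}")
```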

Place, publisher, year, edition, pages
2023.
Keywords [en]
API Specification, BERT, Objective quality, Quality attributes, Saliency highlighting
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-24842
OAI: oai:DiVA.org:bth-24842
DiVA, id: diva2:1768777
External cooperation
Lennart Isaksson
Subject / course
PA1445 Kandidatkurs i Programvaruteknik
Educational program
PAGPT Software Engineering
Supervisors
Examiners
Available from: 2023-06-20 Created: 2023-06-15 Last updated: 2023-06-20 Bibliographically approved

Open Access in DiVA

fulltext (FULLTEXT01.pdf, 578 kB, application/pdf)