1 - 50 of 121
  • 1.
    Ahlgren, Filip
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Local And Network Ransomware Detection Comparison, 2019. Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Background. Ransomware is a malicious application encrypting important files on a victim's computer. The ransomware will ask the victim for a ransom to be paid through cryptocurrency. After the system is encrypted there is virtually no way to decrypt the files other than using the encryption key that is bought from the attacker.

    Objectives. In this practical experiment, we will examine how machine learning can be used to detect ransomware on a local and network level. The results will be compared to see which one has a better performance.

    Methods. Data is collected through malware and goodware databases and then analyzed in a virtual environment to extract system information and network logs. Different machine learning classifiers will be built from the extracted features in order to detect the ransomware. The classifiers will go through a performance evaluation and be compared with each other to find which one has the best performance.

    Results. According to the tests, local detection was both more accurate and stable than network detection. The local classifiers had an average accuracy of 96% while the best network classifier had an average accuracy of 89.6%.

    Conclusions. In this case the results show that local detection has better performance than network detection. However, this can be because the network features were not specific enough for a network classifier. The network performance could have been better if the ransomware samples consisted of fewer families so better features could have been selected.

  • 2.
    Ahmadi Mehri, Vida
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Towards Secure Collaborative AI Service Chains, 2019. Licentiate thesis, comprising articles (Other academic).
    Abstract [en]

    At present, Artificial Intelligence (AI) systems have been adopted in many different domains, such as healthcare, robotics, automotive, telecommunication systems, security, and finance, for integrating intelligence in their services and applications. Intelligent personal assistants such as Siri and Alexa are examples of AI systems making an impact on our daily lives. Since many AI systems are data-driven, they require large volumes of data for training and validation, advanced algorithms, computing power and storage in their development process. Collaboration in the AI development process (the AI engineering process) will reduce cost and time to market for AI applications. However, collaboration introduces concerns about privacy and piracy of intellectual property, which can be caused by the actors who collaborate in the engineering process. This work investigates the non-functional requirements, such as privacy and security, for enabling collaboration in AI service chains. It proposes an architectural design approach for collaborative AI engineering and explores the concept of the pipeline (service chain) for chaining AI functions. In order to enable controlled collaboration between AI artefacts in a pipeline, this work makes use of virtualisation technology to define and implement Virtual Premises (VPs), which act as protection wrappers for AI pipelines. A VP is a virtual policy enforcement point for a pipeline and requires access permission and authenticity for each element in a pipeline before the pipeline can be used. Furthermore, the proposed architecture is evaluated in a use-case approach that enables quick detection of design flaws during the initial stage of implementation. To evaluate the security level and compliance with security requirements, threat modeling was used to identify potential threats and vulnerabilities of the system and analyse their possible effects. The output of the threat modeling was used to define countermeasures to threats related to unauthorised access and execution of AI artefacts.

  • 3.
    Andres, Bustamante
    et al.
    Tecnológico de Monterrey, MEX.
    Cheddad, Abbas
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Rodriguez-Garcia, Alejandro
    Tecnológico de Monterrey, MEX.
    Digital Image Processing and Development of Machine Learning Models for the Discrimination of Corneal Pathology: An Experimental Model, 2019. Conference paper (Refereed).
  • 4.
    Angelova, Milena
    et al.
    Technical University of Sofia, BUL.
    Vishnu Manasa, Devagiri
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Boeva, Veselka
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Linde, Peter
    Blekinge Tekniska Högskola, Biblioteket.
    Lavesson, Niklas
    An Expertise Recommender System based on Data from an Institutional Repository (DiVA), 2019. In: Connecting the Knowledge Common from Projects to Sustainable Infrastructure: The 22nd International Conference on Electronic Publishing - Revised Selected Papers / [ed] Leslie Chan, Pierre Mounier, OpenEdition Press, 2019, pp. 135-149. Chapter in book, part of anthology (Refereed).
    Abstract [en]

    Finding experts in academics is an important practical problem, e.g. recruiting reviewers for reviewing conference, journal or project submissions, partner matching for research proposals, finding relevant M.Sc. or Ph.D. supervisors, etc. In this work, we discuss an expertise recommender system that is built on data extracted from the Blekinge Institute of Technology (BTH) instance of the institutional repository system DiVA (Digital Scientific Archive). DiVA is a publication and archiving platform for research publications and student essays used by 46 publicly funded universities and authorities in Sweden and the rest of the Nordic countries (www.diva-portal.org). The DiVA classification system is based on the Swedish Higher Education Authority (UKÄ) and Statistics Sweden's (SCB) three-level classification system. Using the classification terms associated with student M.Sc. and B.Sc. theses published in the DiVA platform, we have developed a prototype system which can be used to identify and recommend subject thesis supervisors in academia.

  • 5.
    Anwar, Mahwish
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Connect2smallports Project: South Baltic Small Ports – Gateway to Integrated and Sustainable European Transport System: Project brief and updates on the project activities: Digital Audit. Blockchain Design Strategy. Call for Collaboration. Reports and scientific publications, 2019. Other (Other (popular science, debate, etc.)).
    Abstract [en]

    Ports play a pivotal role in the global supply chain network. To strengthen the ports' business and to keep up with the overall economic development of the country, port stakeholders have started to invest in digital technologies. Ports, as well as individual municipalities in Europe, are in competition with other ports within the region. One of the competing factors is the port's technological development. Unlike large ports, the small ports within Europe, for example the Port of Karlskrona, Port of Wismar or Port of Klaipeda, which also serve as crucial nodes within the trade flow for Sweden, Germany and Lithuania respectively, lack the knowledge and tools to leverage the potential of digital technologies. The digital disruption at ports is inevitable!

    With that being established, the scope of the project - South Baltic Small Ports as Gateways towards Integrated Sustainable European Transport System and Blue Growth by Smart Connectivity Solutions, or the Connect2SmallPorts project - is to understand how to facilitate small and medium ports of the South Baltic region with the digital technologies Blockchain and Internet of Things. During the project lifetime (2018 to 2021) the main goals are to perform a digital audit of small ports of the South Baltic region; to prepare an implementation strategy for Blockchain and Internet of Things specifically for the small ports in the South Baltic region; and to conduct an evaluation of the proposed strategies. The project correspondingly aims to disseminate the knowledge and experiences through research publications, industrial conferences and international tradeshows.

  • 6.
    Anwar, Mahwish
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Digitalization in Container Terminal Logistics: A Literature Review, 2019. In: 27th Annual Conference of the International Association of Maritime Economists (IAME), 2019, pp. 1-25, article id 141. Conference paper (Refereed).
    Abstract [en]

    Many terminals that are located in large ports, such as the Port of Rotterdam, Port of Singapore, Port of Hamburg, etc., employ various emerging digital technologies to handle containers and information. Some technologies deemed attractive by large ports are: Artificial Intelligence (AI), Cloud Computing, Blockchain and Internet of Things (IoT). The objective of this paper is to review the “state-of-the-art” of scientific literature on digital technologies that facilitate operations management for container terminal logistics. The studies are synthesized in the form of a classification matrix and an analysis is performed. The primary studies consisted of 57 papers, out of the initial pool of over 2100 findings. Over 94% of the publications identified focused on AI, while 29% exploited IoT and Cloud Computing technologies combined. Research on Blockchain within the context of container terminals was nonexistent. The majority of the publications utilized numerical experiments and simulation for validation. A large amount of the scientific literature was dedicated to resource management and scheduling of intra-logistic equipment/vessels, berths or container storage in the yard. Results drawn from the literature survey indicate that various research gaps exist. A discussion and an analysis of the review are presented, which could be of benefit for stakeholders of small-medium sized container terminals.

  • 7.
    Anwar, Mahwish
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Henesey, Lawrence
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Casalicchio, Emiliano
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    The feasibility of Blockchain solutions in the maritime industry, 2019. Conference paper (Other academic).
    Abstract [en]

    Purpose / Value

    The concept of Blockchain technology in supply chain management is well discussed, yet inadequately theorized in terms of its applicability, especially within the maritime industry, which forms a fundamental node of the entire supply chain network. More so, the assumptive grounds associated with the technology have not been openly articulated, leading to unclear ideas about its applicability.

    Design/methodology/approach

    The research design is divided into two stages. This paper (Stage One) uses an enhanced literature review for data collection in order to gauge the properties of the Blockchain technology, and to understand and map those characteristics with the Bill of Lading process within the maritime industry. In Stage Two an online questionnaire is conducted to assess the feasibility of Blockchain technology for different maritime use-cases.

    Findings

    The research that was collected and analysed, partly from deliverables in the Connect2SmallPort Project and partly from other literature, suggests that Blockchain can be an enabler for improving the maritime supply chain. The use-case presented in this paper highlights the practicality of the technology. It was identified that Blockchain possesses characteristics suitable to mitigate the risks and issues pertaining to the paper-based Bill of Lading process.

    Research limitations

    The study would mature further after the execution of Stage Two. By the end of both stages, a framework for Blockchain adoption with a focus on the maritime industry would be proposed.

    Practical implications

    The proposed outcome indicated the practicality of the technology, which could be beneficial for the port stakeholders that wish to use Blockchain in processing Bills of Lading or contracts.

    Social implications

    The study may influence decision makers to consider the benefits of using the Blockchain technology, thereby creating opportunities for the maritime industry to leverage the technology with government support.

  • 8.
    Arredal, Martin
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Eye Tracking’s Impact on Player Performance and Experience in a 2D Space Shooter Video Game, 2018. Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Background. Although a growing market, most of the commercially available games today that feature eye tracking support are rendered in a 3D perspective. Games rendered in 2D have seen little support for eye trackers from developers. By comparing the differences in player performance and experience between an eye tracker and a computer mouse when playing a classic 2D genre, the space shooter, this thesis aims to make an argument for the implementation of eye tracking in 2D video games.

    Objectives. Create a 2D space shooter video game where movement will be handled through a keyboard but the input method for aiming will alternate between a computer mouse and an eye tracker.

    Methods. Using a Tobii EyeX eye tracker, an experiment was conducted with fifteen participants. To measure their performance, three variables were used: accuracy, completion time and collisions. The participants played two modes of a 2D space shooter video game in a controlled environment. Depending on which mode was played, the input method for aiming was either an eye tracker or a computer mouse. The movement was handled using a keyboard for both modes. When the modes had been completed, a questionnaire was presented where the participants would rate their experience playing the game with each input method.

    Results. The computer mouse had a better performance in two out of three performance variables. On average the computer mouse had a better accuracy and completion time but more collisions. However, the data gathered from the questionnaire shows that the participants had on average a better experience when playing with an eye tracker.

    Conclusions. The results from the experiment show a better performance for participants using the computer mouse, but participants felt more immersed with the eye tracker, giving it a better score on all experience categories. With these results, this study hopes to encourage developers to implement eye tracking as an interaction method for 2D video games. However, future work is necessary to determine if the experience and performance increase or decrease as the playtime gets longer.

  • 9.
    Avdic, Adnan
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Ekholm, Albin
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Anomaly Detection in an e-Transaction System using Data Driven Machine Learning Models: An unsupervised learning approach in time-series data, 2019. Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Background: Detecting anomalies in time-series data is a task that can be done with the help of data-driven machine learning models. This thesis will investigate if, and how well, different machine learning models, with an unsupervised approach, can detect anomalies in the e-Transaction system Ericsson Wallet Platform. The anomalies in our domain context are delays in the system.

    Objectives: The objectives of this thesis work are to compare four different machine learning models in order to find the most relevant model. The best performing models are decided by the evaluation metric F1-score. An intersection of the best models is also evaluated in order to decrease the number of false positives and make the model more precise.

    Methods: A relevant time-series data sample with 10-minute interval data points from the Ericsson Wallet Platform was investigated. A number of steps were taken, such as handling data, pre-processing, normalization, training and evaluation. Two relevant features were trained separately as one-dimensional data sets. The two features relevant for finding delays in the system, which were used in this thesis, are Mean wait (ms) and Mean * N, where N is equal to the number of calls to the system. The evaluation metrics that were used are True Positives, True Negatives, False Positives, False Negatives, Accuracy, Precision, Recall, F1-score and the Jaccard index. The Jaccard index is a metric which reveals how similar the detections of the algorithms are. Since the detection is binary, each data point in the time-series data is classified.

    Results: The results reveal the two best performing models with regard to the F1-score. The intersection evaluation reveals if and how well a combination of the two best performing models can reduce the number of false positives.

    Conclusions: The conclusion to this work is that some algorithms perform better than others. It is a proof of concept that such classification algorithms can separate normal from non-normal behavior in the domain of the Ericsson Wallet Platform.
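
    The kind of unsupervised, threshold-based delay detection the abstract describes can be illustrated generically. The sketch below is not one of the thesis's four models; the z-score method, the sample values and the threshold are all illustrative assumptions. It flags points that deviate strongly from the series mean:

```python
from statistics import mean, stdev

def zscore_anomalies(series, threshold=3.0):
    """Flag indices whose values deviate from the mean by more than
    `threshold` standard deviations."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mu) > threshold * sigma]

# Mostly stable "mean wait" values with one large delay spike at index 7.
waits_ms = [10, 11, 9, 10, 12, 10, 11, 300, 10, 9]
print(zscore_anomalies(waits_ms, threshold=2.0))  # [7]
```

    With binary per-point decisions like this, metrics such as precision, recall and F1-score follow directly from counting true and false positives against labelled delays.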

  • 10.
    Bandari Swamy Devender, Vamshi Krishna
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Adike, Sneha
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Design and Performance of an Event Handling and Analysis Platform for vSGSN-MME event using the ELK stack, 2019. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Data logging is the main activity to be considered in maintaining a server or database in working condition without errors or failures. Data collection can be automatic, so no human presence is necessary. Storing log data for many days and visualizing it has become a significant problem in recent years. The SGSN-MME node, the main component of the GPRS network, handles all packet-switched data within the mobile operator's network. A lot of log data is generated and stored in file systems on the redundant File Server Boards in the SGSN-MME node. The evolution of the SGSN-MME is taking it from dedicated, purpose-built hardware into virtual machines in the Cloud, where virtual file server boards fit very badly. The purpose of this thesis is to give a better way to store the log data and add visualization using the ELK stack concept. Fetching useful information from logs is one of the most important parts of this stack and is done in Logstash using its grok filters and a set of input, filter and output plug-ins, which helps to scale this functionality for taking various kinds of inputs (file, TCP, UDP, gemfire, stdin, UNIX, web sockets, and even IRC and Twitter, and many more), filtering them (using groks, grep, date filters, etc.) and finally writing output to ElasticSearch. The research methodology involved in carrying out this thesis work is a qualitative approach. A study is carried out using the ELK concept with respect to the legacy approach at Ericsson, and a suitable approach and the best possible solution is given for storing the log data of the vSGSN-MME node. The performance is also evaluated with multiple input providers, and the resulting graphs are analysed. To perform the tests accurately, readings are taken in defined failure scenarios. From the test cases, a plot of the CPU load in vSGSN-MME is provided, which indicates the most promising approach.
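
    The grok-style field extraction mentioned above boils down to named-pattern matching over log lines. The following sketch is a hypothetical illustration in Python (the log format and field names are invented; real Logstash grok patterns are far more expressive):

```python
import re
from typing import Optional

# Hypothetical log format; a Logstash grok pattern such as
# "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}"
# performs the same kind of named-field extraction.
LOG_PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+"
    r"(?P<level>[A-Z]+)\s+"
    r"(?P<msg>.*)"
)

def parse_log_line(line: str) -> Optional[dict]:
    """Return the structured fields of one log line, or None if it does not match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

event = parse_log_line("2019-05-01T12:00:00 ERROR disk almost full")
# event == {'ts': '2019-05-01T12:00:00', 'level': 'ERROR', 'msg': 'disk almost full'}
```

    The structured dictionaries produced this way are what an ElasticSearch output stage would then index for visualization.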

  • 11.
    Bergenholtz, Erik
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Moss, Andrew
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Ilie, Dragos
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Casalicchio, Emiliano
    Finding a needle in a haystack - A comparative study of IPv6 scanning methods, 2019. In: Proceedings of The 6th International Symposium on Networks, Computers and Communications (ISNCC 2019), IEEE, 2019. Conference paper (Refereed).
    Abstract [en]

    It has previously been assumed that the size of an IPv6 network would make it impossible to scan the network for vulnerable hosts. Recent work has shown this to be false, and several methods for scanning IPv6 networks have been suggested. However, most of these are based on external information like DNS, or pattern inference which requires large amounts of known IP addresses. In this paper, DeHCP, a novel approach based on delimiting IP ranges with closely clustered hosts, is presented and compared to three previously known scanning methods. The method is shown to work in an experimental setting with results comparable to that of the previously suggested methods, and is also shown to have the advantage of not being limited to a specific protocol or probing method. Finally we show that the scan can be executed across multiple VLANs.

  • 12.
    Bergman Martinkauppi, Louise
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    He, Qiuping
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Performance Evaluation and Comparison of Standard Cryptographic Algorithms and Chinese Cryptographic Algorithms, 2019. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Background. China is regulating the import, export, sale, and use of encryption technology in China. If any foreign company wants to develop or release a product in China, they need to report their use of any encryption technology to the Office of State Commercial Cryptography Administration (OSCCA) to gain approval. SM2, SM3, and SM4 are cryptographic standards published by OSCCA and are authorized to be used in China. To comply with Chinese cryptography laws, organizations and companies may have to replace standard cryptographic algorithms in their systems with Chinese cryptographic algorithms, such as SM2, SM3, and SM4. It is important to know beforehand how the replacement of algorithms will impact performance to determine future system costs.

    Objectives. Perform a theoretical study and performance comparison of the standard cryptographic algorithms and Chinese cryptographic algorithms. The standard cryptographic algorithms studied are RSA, ECDSA, SHA-256, and AES-128, and the Chinese cryptographic algorithms studied are SM2, SM3, and SM4.

    Methods. A literature analysis was conducted to gain knowledge and collect information about the selected cryptographic algorithms in order to make a theoretical comparison of the algorithms. An experiment was conducted to get measurements of how the algorithms perform and to be able to rate them.

    Results. The literature analysis provides a comparison that identifies design similarities and differences between the algorithms. The controlled experiment provides measurements of the metrics of the algorithms mentioned in the objectives.

    Conclusions. The digital signature algorithms SM2 and ECDSA have similar designs and also similar performance. SM2 and RSA have fundamentally different designs, and SM2 performs better than RSA when generating keys and signatures. When verifying signatures, RSA shows comparable performance in some cases and worse performance in other cases. The hash algorithms SM3 and SHA-256 have many design similarities, but SHA-256 performs slightly better than SM3. AES-128 and SM4 have many similarities but also a few differences. In the controlled experiment, AES-128 outperforms SM4 with a significant margin.

  • 13.
    Bertoni, Alessandro
    et al.
    Blekinge Tekniska Högskola, Fakulteten för teknikvetenskaper, Institutionen för maskinteknik. Blekinge Institute of Technology.
    Hallstedt, Sophie
    Blekinge Tekniska Högskola, Fakulteten för teknikvetenskaper, Institutionen för strategisk hållbar utveckling. Blekinge Institute of Technology.
    Dasari, Siva Krishna
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. Blekinge Institute of Technology.
    Andersson, Petter
    GKN Aerospace Engine Systems, SWE.
    Integration of Value and Sustainability Assessment in Design Space Exploration by Machine Learning: An Aerospace Application, 2019. In: Design Science. Journal article (Refereed).
  • 14.
    Bisen, Pradeep Siddhartha Singh
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Predicting Operator’s Choice During Airline Disruption Using Machine Learning Methods, 2019. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    This master thesis is a collaboration with Jeppesen, a Boeing company, attempting to apply machine learning techniques to predict: “When does the Operator manually solve the disruption? If he chooses to use the Optimiser, then which option would he choose? And why?”. Through the course of this project, various techniques are employed to study, analyze and understand the historical labeled airline data, consisting of alerts during disruptions, and to classify each data point into one of the categories: manual or optimizer option. This is done using various supervised machine learning classification methods.

  • 15.
    Björkman, Adam
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Kardos, Max
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Threat Analysis of Smart Home Assistants Involving Novel Acoustic Based Attack-Vectors, 2019. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Background. Smart home assistants are becoming more common in our homes. Often taking the form of a speaker, these devices enable communication via voice commands. Through this communication channel, users can for example order a pizza, check the weather, or call a taxi. When a voice command is given to the assistant, the command is sent to cloud services over the Internet, enabling a multitude of functions associated with risks regarding security and privacy. Furthermore, with an always active Internet connection, smart home assistants are part of the Internet of Things, a class of devices that historically has not been secure. Therefore, it is crucial to understand the security situation and the risks that a smart home assistant brings with it.

    Objectives. This thesis aims to investigate and compile threats towards smart home assistants in a home environment. Such a compilation could be used as a foundation during the creation of a formal model for securing smart home assistants and other devices with similar properties.

    Methods. Through literature studies and threat modelling, current vulnerabilities towards smart home assistants and systems with similar properties were found and compiled. A few  vulnerabilities were tested against two smart home assistants through experiments to verify which vulnerabilities are present in a home environment. Finally, methods for the prevention and protection of the vulnerabilities were found and compiled.

    Results. Overall, 27 vulnerabilities towards smart home assistants and 12 towards similar systems were found and identified. The majority of the found vulnerabilities focus on exploiting the voice interface. In total, 27 methods to prevent vulnerabilities in smart home assistants or similar systems were found and compiled. Eleven of the found vulnerabilities did not have any reported protection methods. Finally, we performed one experiment consisting of four attacks against two smart home assistants with mixed results; one attack was not successful, while the others were either completely or partially successful in exploiting the target vulnerabilities.

    Conclusions. We conclude that vulnerabilities exist for smart home assistants and similar systems. The vulnerabilities differ in execution difficulty and impact. However, we consider smart home assistants safe enough to use with the accompanying protection methods activated.

  • 16.
    Boeva, Veselka
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Nordahl, Christian
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Modeling Evolving User Behavior via Sequential Clustering, 2019. Conference paper (Refereed).
    Abstract [en]

    In this paper we address the problem of modeling the evolution of clusters over time by applying sequential clustering. We propose a sequential partitioning algorithm that can be applied for grouping distinct snapshots of streaming data so that a clustering model is built on each data snapshot. The algorithm is initialized by a clustering solution built on available historical data. Then a new clustering solution is generated on each data snapshot by applying a partitioning algorithm seeded with the centroids of the clustering model obtained at the previous time interval. At each step the algorithm also conducts model-adapting operations in order to reflect the evolution in the clustering structure. In that way, it can deal with both the incremental and the dynamic aspects of modeling evolving behavior problems. In addition, the proposed approach is able to trace back evolution through the detection of clusters' transitions, such as splits and merges. We have illustrated and initially evaluated our ideas on household electricity consumption data. The results have shown that the proposed sequential clustering algorithm is robust for modeling evolving behavior, being able to mine changes and update the model, respectively.
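
    The seeding step described in the abstract, where each new snapshot's clustering is initialized with the previous model's centroids, can be sketched as follows. This is a minimal one-dimensional k-means in plain Python; the data, the number of clusters and the starting centroids are illustrative assumptions, not from the paper:

```python
from statistics import mean

def kmeans_1d(points, centroids, iters=20):
    """Lloyd's k-means in one dimension, started from the given centroids."""
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # Recompute centroids; keep the old centroid if a cluster goes empty.
        centroids = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
    return centroids

# A clustering model is first built on historical data ...
snapshot0 = [1.0, 1.2, 0.8, 9.0, 9.5, 10.1]
model = kmeans_1d(snapshot0, centroids=[0.0, 5.0])

# ... and each new snapshot is then clustered seeded with the previous centroids.
snapshot1 = [1.1, 1.3, 9.2, 9.8, 10.0, 1.0]
model = kmeans_1d(snapshot1, centroids=model)
```

    Seeding each snapshot from the previous model is what lets cluster identities persist across time, so that transitions such as splits and merges can be traced.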

  • 17.
    Boinapally, Kashyap
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Security Certificate Renewal Management2019Independent thesis Advanced level (degree of Master (Two Years)), 80 poäng / 120 hpOppgave
    Abstract [en]

    Context. An SSL encrypted client-server communication is necessary to maintain the security and privacy of the communication. For SSL encryption to work, there should be a security certificate, which has a certain expiry period. Periodic renewal of the certificate after its expiry is a waste of time and effort on the part of the company.

    Objectives. In this study, a new system has been developed and implemented, which sends a certificate during prior communication and does not wait for the certificate to expire. The process was automated to a certain extent, so as not to compromise the security of the system while speeding up the process and reducing the downtime.

    Methods. Experiments have been conducted to test the new system and compare it to the old system. The experiments were conducted to analyze the packets and the downtime occurring from certificate renewal.

    Results. The results of the experiments show that there is a significant reduction in downtime. This was achieved due to the implementation of the new system and semi-automation.

    Conclusions. The system has been implemented, and it greatly reduces the downtime occurring due to the expiry of the security certificates. Semi-automation was chosen so as not to hamper the security and to keep the system robust.

  • 18.
    Boldt, Martin
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Boeva, Veselka
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Borg, Anton
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Multi-expert estimations of burglars' risk exposure and level of pre-crime preparation using coded crime scene data: Work in progress2018Inngår i: Proceedings - 2018 European Intelligence and Security Informatics Conference, EISIC 2018 / [ed] Brynielsson, J, Institute of Electrical and Electronics Engineers Inc. , 2018, s. 77-80Konferansepaper (Fagfellevurdert)
    Abstract [en]

    Law enforcement agencies strive to link crimes perpetrated by the same offenders into crime series in order to improve investigation efficiency. Such crime linkage can be done using either physical traces (e.g., DNA or fingerprints) or 'soft evidence' in the form of offenders' modus operandi (MO), i.e. their behaviors during crimes. However, physical traces are only present for a fraction of crimes, unlike behavioral evidence. This work-in-progress paper presents a method for aggregating multiple criminal profilers' ratings of offenders' behavioral characteristics based on feature-rich crime scene descriptions. The method calculates consensus ratings from individual experts' ratings, which then are used as a basis for classification algorithms. The classification algorithms can automatically generalize offenders' behavioral characteristics from cues in the crime scene data. Models trained on the consensus rating are evaluated against models trained on individual profilers' ratings. Thus, it can be determined whether the consensus model shows improved performance over the individual models. © 2018 IEEE.

  • 19.
    Boldt, Martin
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Borg, Anton
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Ickin, Selim
    Ericsson Research, SWE.
    Gustafsson, Jörgen
    Ericsson Research, SWE.
    Anomaly detection of event sequences using multiple temporal resolutions and Markov chains2019Inngår i: Knowledge and Information Systems, ISSN 0219-1377, E-ISSN 0219-3116Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Streaming data services, such as video-on-demand, are getting increasingly more popular, and they are expected to account for more than 80% of all Internet traffic in 2020. In this context, it is important for streaming service providers to detect deviations in service requests due to issues or changing end-user behaviors in order to ensure that end-users experience high quality in the provided service. Therefore, in this study we investigate to what extent sequence-based Markov models can be used for anomaly detection by means of the end-users’ control sequences in the video streams, i.e., event sequences such as play, pause, resume and stop. This anomaly detection approach is further investigated over three different temporal resolutions in the data, more specifically: 1 h, 1 day and 3 days. The proposed anomaly detection approach supports anomaly detection in ongoing streaming sessions as it recalculates the probability for a specific session to be anomalous for each new streaming control event that is received. Two experiments are used for measuring the potential of the approach, which gives promising results in terms of precision, recall, F1-score and Jaccard index when compared to k-means clustering of the sessions. © 2019, The Author(s).
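The session-scoring idea can be sketched as follows. This is a hedged toy illustration, not the paper's code; the event names follow the abstract, everything else is an assumption:

```python
import numpy as np

EVENTS = ["play", "pause", "resume", "stop"]
IDX = {e: i for i, e in enumerate(EVENTS)}

def fit_chain(sessions, alpha=1.0):
    """Estimate first-order transition probabilities with Laplace smoothing."""
    counts = np.full((len(EVENTS), len(EVENTS)), alpha)
    for s in sessions:
        for a, b in zip(s, s[1:]):
            counts[IDX[a], IDX[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def score(session, P):
    """Mean log-likelihood of the observed transitions; low = anomalous."""
    lp = [np.log(P[IDX[a], IDX[b]]) for a, b in zip(session, session[1:])]
    return float(np.mean(lp))

# Train on "normal" control sequences, then score two new sessions.
normal = [["play", "pause", "resume", "stop"]] * 20
P = fit_chain(normal)
ok = score(["play", "pause", "resume", "stop"], P)
odd = score(["stop", "stop", "stop", "play"], P)   # unusual transitions
```

Because the score is a running mean over transitions, it can be updated for each new control event, matching the ongoing-session detection described above.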

  • 20.
    Bond, David
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Nyblom, Madelein
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Evaluation of four different virtual locomotion techniques in an interactive environment2019Independent thesis Basic level (degree of Bachelor), 10 poäng / 15 hpOppgave
    Abstract [en]

    Background: Virtual Reality (VR) devices are becoming more and more common as game systems. Even though modern VR Head Mounted Displays (HMD) allow the user to walk in real life, it still limits the user to the space of the room they are playing in and the player will need virtual locomotion in games where the environment size exceeds that of the real life play area. Evaluations of multiple VR locomotion techniques have already been done, usually evaluating motion sickness or usability. A common theme in many of these is that the task is search based, in an environment with low focus on interaction. Therefore in this thesis, four VR locomotion techniques are evaluated in an environment with focus on interaction, to see if a difference exists and whether one technique is optimal. The VR locomotion techniques are: Arm-Swinging, Point-Tugging, Teleportation, and Trackpad.

    Objectives: A VR environment is created with focus on interaction in this thesis. In this environment the user has to grab and hold onto objects while using a locomotion technique. This study then evaluates which VR locomotion technique is preferred in the environment. This study also evaluates whether there is a difference in preference and motion sickness, in an environment with high focus in interaction compared to one with low focus.

    Methods: A user study was conducted with 15 participants. Every participant performed a task with every VR locomotion technique, which involved interaction. After each technique, the participant answered a simulator sickness questionnaire, and an overall usability questionnaire.

    Results: The results achieved in this thesis indicated that Arm-Swinging was the most enjoyed locomotion technique in the overall usability questionnaire. But it also showed that Teleportation had the best rating in tiredness and overwhelment. Teleportation also did not cause motion sickness, while the rest of the locomotion techniques did.

    Conclusions: As a conclusion, a difference can be seen for VR locomotion techniques between an environment with low focus on interaction, to an environment with high focus. This difference was seen in both the overall usability questionnaire and the motion sickness questionnaire. It was concluded that Arm-Swinging could be the most fitting VR locomotion technique for an interactive environment, however Teleportation could be more optimal for longer sessions.

  • 21.
    Borg, Anton
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Boldt, Martin
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Svensson, Johan
    Telenor Sverige AB, SWE.
    Using conformal prediction for multi-label document classification in e-Mail support systems2019Inngår i: Lect. Notes Comput. Sci., Springer Verlag , 2019, Vol. 11536, s. 308-322Konferansepaper (Fagfellevurdert)
    Abstract [en]

    For any corporation the interaction with its customers is an important business process. This is especially the case for resolving various business-related issues that customers encounter. Classifying the type of such customer service e-mails to provide improved customer service is thus important. The classification of e-mails makes it possible to direct them to the most suitable handler within customer service. We have investigated the following two aspects of customer e-mail classification within a large Swedish corporation. First, whether a multi-label classifier can be introduced that performs similarly to an already existing multi-class classifier. Second, whether conformal prediction can be used to quantify the certainty of the predictions without loss in classification performance. Experiments were used to investigate these aspects using several evaluation metrics. The results show that for most evaluation metrics, there is no significant difference between multi-class and multi-label classifiers, except for Hamming loss where the multi-label approach performed with a lower loss. Further, the use of conformal prediction did not introduce any significant difference in classification performance for either the multi-class or the multi-label approach. As such, the results indicate that conformal prediction is a useful addition that quantifies the certainty of predictions without negative effects on the classification performance, which in turn allows detection of statistically significant predictions. © Springer Nature Switzerland AG 2019.
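The certainty quantification described above can be illustrated with a minimal inductive conformal predictor. This is an assumption-laden sketch on synthetic data, not the paper's multi-label setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A held-out calibration set turns classifier scores into per-label
# p-values, so each prediction carries a certainty estimate.
X, y = make_classification(n_samples=600, n_classes=3, n_informative=6,
                           random_state=0)
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=200,
                                            random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Nonconformity: one minus the probability assigned to the true class.
cal_nc = 1.0 - clf.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]

def p_values(x):
    """Conformal p-value for each candidate label of a single example."""
    probs = clf.predict_proba(x.reshape(1, -1))[0]
    nc = 1.0 - probs                       # candidate nonconformity per label
    return np.array([(np.sum(cal_nc >= n) + 1) / (len(cal_nc) + 1)
                     for n in nc])

pv = p_values(X_cal[0])
pred_set = [c for c, p in enumerate(pv) if p > 0.1]   # 90% confidence set
```

Labels whose p-value falls below the chosen significance level are excluded, which is how statistically significant predictions can be singled out.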

  • 22.
    Brodd, Adam
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Eriksson, Andreas
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    User perception on procedurally generated cities affected with a heightmapped terrain parameter2019Independent thesis Basic level (university diploma), 10 poäng / 15 hpOppgave
    Abstract [en]

    Context: Procedural content generation, shortened PCG, is a way of letting the computer algorithmically generate data with little input from programmers. Procedural content generation is a useful tool for developers to create game worlds, content and much more, which can be tedious and time-consuming to do by hand.

    Objectives: The procedural generation of both a city and a height-mapped terrain parameter using Perlin noise, and the terrain parameter's effect on the city, is explored in this thesis. The objective is to find out if a procedurally generated city with a heightmap parameter using Perlin noise is viable for use in games.

    Methods: An implementation generating both a height-mapped terrain parameter and a city using Perlin noise has been created, along with a user survey to test the generated city and terrain parameter's viability in games.

    Results: This work successfully implemented an application that can generate cities affected by a height-mapped terrain parameter that are viable for use in games.

    Conclusions: This work concludes that it is possible to generate cities affected by a height-mapped terrain parameter by utilizing the noise algorithm Perlin noise. The generated cities and terrains are both viable and believable for use in games.
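The heightmap side of such a pipeline can be illustrated with value noise, a simpler gradient-free relative of the Perlin noise the thesis uses. This is an illustrative sketch; all parameters are invented:

```python
import numpy as np

def smoothstep(t):
    return t * t * (3 - 2 * t)            # Perlin-style fade curve

def value_noise(width, height, cells=8, seed=0):
    """Smooth pseudo-random heightmap in [0, 1] from a coarse lattice."""
    rng = np.random.default_rng(seed)
    lattice = rng.random((cells + 1, cells + 1))   # random lattice values
    ys, xs = np.mgrid[0:height, 0:width]
    fx, fy = xs * cells / width, ys * cells / height
    x0, y0 = fx.astype(int), fy.astype(int)
    tx, ty = smoothstep(fx - x0), smoothstep(fy - y0)
    # Bilinear interpolation of the four surrounding lattice values.
    v00 = lattice[y0, x0]
    v10 = lattice[y0, x0 + 1]
    v01 = lattice[y0 + 1, x0]
    v11 = lattice[y0 + 1, x0 + 1]
    top = v00 * (1 - tx) + v10 * tx
    bot = v01 * (1 - tx) + v11 * tx
    return top * (1 - ty) + bot * ty

heightmap = value_noise(64, 64)
```

A generated city grid could then sample this heightmap per building footprint to offset its elevation, which is the "terrain parameter" role described above.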

  • 23.
    Carlsson, Anders
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Kuzminykh, Ievgeniia
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Gustavsson, Rune
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Virtual Security Labs Supporting Distance Education in ReSeLa Framework2019Inngår i: Advances in Intelligent Systems and Computing / [ed] Auer M.E.,Tsiatsos T., Springer Verlag , 2019, Vol. 917, s. 577-587Konferansepaper (Fagfellevurdert)
    Abstract [en]

    To meet the high demand of educating the next generation of MSc students in Cyber security, we propose a well-composed curriculum and a configurable cloud based learning support environment ReSeLa. The proposed system is a result of the EU TEMPUS project ENGENSEC and has been extensively validated and tested. © 2019, Springer Nature Switzerland AG.

  • 24.
    Cavallin, Fritjof
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Pettersson, Timmie
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Real-time View-dependent Triangulation of Infinite Ray Cast Terrain2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Background. Ray marching is a technique that can be used to render images of infinite terrains defined by a height field by sampling consecutive points along a ray until the terrain surface is intersected. However, this technique can be expensive, and does not generate a mesh representation, which may be useful in certain use cases.
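The sampling loop described above can be sketched as follows. This is a toy illustration with an invented analytic height field, not the thesis implementation:

```python
import numpy as np

def terrain(x, z):
    """Invented analytic height field standing in for the terrain."""
    return 0.3 * np.sin(x) * np.cos(z)

def ray_march(origin, direction, step=0.05, max_dist=50.0):
    """Step along the ray until a sample dips below the height field."""
    direction = np.asarray(direction) / np.linalg.norm(direction)
    t = 0.0
    while t < max_dist:
        p = np.asarray(origin) + t * direction
        if p[1] < terrain(p[0], p[2]):      # ray went under the surface
            return t                        # hit distance (within one step)
        t += step
    return None                             # no intersection found

hit = ray_march(origin=[0.0, 1.0, 0.0], direction=[1.0, -0.5, 0.0])
```

The fixed step size is what makes the technique expensive, and the loop only yields a hit distance, not a mesh, which motivates the triangulation approach the thesis compares against.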

    Objectives. The aim of the thesis is to implement an algorithm for view-dependent triangulation of infinite terrains in real-time without making use of any preprocessed data, and compare the performance and visual quality of the implementation with that of a ray marched solution.

    Methods. Performance metrics for both implementations are gathered and compared. Rendered images from both methods are compared using an image quality assessment algorithm.

    Results. In all tests performed, the proposed method performs better in terms of frame rate than the ray marched version. The visual similarity between the two methods depends highly on the quality setting of the triangulation.

    Conclusions. The proposed method can perform better than a ray marched version, but is more reliant on CPU processing, and can suffer from visual popping artifacts as the terrain is refined.

  • 25.
    Chapala, Usha Kiran
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Peteti, Sridhar
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Continuous Video Quality of Experience Modelling using Machine Learning Model Trees1996Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Adaptive video streaming is perpetually influenced by unpredictable network conditions, which cause playback interruptions like stalling, rebuffering and video bit rate fluctuations. This leads to potential degradation of the end-user Quality of Experience (QoE) and may make users churn from the service. Video QoE modelling that precisely predicts the end users' QoE under these unstable conditions is therefore of immediate interest. Root cause analysis of these degradations is required by the service provider, but such sudden changes in trend are not visible from monitoring the data of the underlying network service. Thus, it is challenging to detect these changes and to model the instantaneous QoE. For this modelling, continuous-time QoE ratings are considered rather than one overall QoE rating per video. To reduce the risk of users churning, network providers should give the best quality to the users.

    In this thesis, we propose QoE modelling to analyze changes in user reactions over time using machine learning models. The machine learning models are used to predict the QoE ratings and the change patterns in ratings. We test the models on a publicly available video quality dataset which contains users' subjective QoE ratings under network distortions. The M5P model tree algorithm is used for the prediction of user ratings over time. The M5P model yields mathematical equations, which lead to further insights. Results show that the model tree is a good approach for predicting continuous QoE and for detecting change points in the ratings, and indicate to which extent these algorithms can be used to estimate changes. The analysis of the model provides valuable insights into the exponential transitions between different levels of predicted ratings. The outcome of the analysis explains the user behavior: when the quality decreases, the user ratings decrease faster over time than they increase when the quality improves. The model tree supports earlier work on exponential transitions of instantaneous QoE over time with respect to user reactions to sudden changes such as video freezes.
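Since M5P itself ships with Weka, a rough Python approximation of a model tree (a shallow regression tree with a linear model per leaf) can illustrate the kind of per-segment equations such a model exposes. The data and all names below are invented:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

# Piecewise-linear synthetic "QoE over time": ratings fall slowly,
# then drop faster after a quality degradation at t = 5.
rng = np.random.default_rng(0)
t = rng.uniform(0, 10, 300).reshape(-1, 1)
qoe = np.where(t[:, 0] < 5, 4.5 - 0.1 * t[:, 0], 3.0 - 0.4 * (t[:, 0] - 5))

# A depth-1 tree routes samples to two leaves; each leaf gets its own
# readable linear equation, mimicking a model tree.
tree = DecisionTreeRegressor(max_depth=1).fit(t, qoe)
leaves = tree.apply(t)
leaf_models = {leaf: LinearRegression().fit(t[leaves == leaf],
                                            qoe[leaves == leaf])
               for leaf in np.unique(leaves)}

def predict(x):
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    leaf_ids = tree.apply(x)
    return np.array([leaf_models[l].predict(x[i:i + 1])[0]
                     for i, l in enumerate(leaf_ids)])
```

Inspecting each leaf's coefficients recovers the two regimes, which is the sort of per-segment insight the abstract attributes to the M5P equations.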

  • 26.
    Chen, Xiaoran
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Image enhancement effect on the performance of convolutional neural networks2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Context. Image enhancement algorithms can be used to enhance the visual effects of images in the field of human vision. So can image enhancement algorithms be used in the field of computer vision? The convolutional neural network, as the most powerful image classifier at present, has excellent performance in the field of image recognition. This paper explores whether image enhancement algorithms can be used to improve the performance of convolutional neural networks.

    Objectives. The purpose of this paper is to explore the effect of image enhancement algorithms on the performance of CNN models in deep learning and transfer learning, respectively. The article selected five different image enhancement algorithms: contrast limited adaptive histogram equalization (CLAHE), the successive mean quantization transform (SMQT), adaptive gamma correction, the wavelet transform, and the Laplace operator.

    Methods. In this paper, experiments are used as research methods. Three groups of experiments are designed; they respectively explore whether the enhancement of grayscale images can improve the performance of CNN in deep learning, whether the enhancement of color images can improve the performance of CNN in deep learning, and whether the enhancement of RGB images can improve the performance of CNN in transfer learning.

    Results. In the experiment, in deep learning, when training a complete CNN model, using the Laplace operator to enhance the gray image can improve the recall rate of CNN. However, the remaining image enhancement algorithms cannot improve the performance of CNN in either grayscale image datasets or color image datasets. In addition, in transfer learning, when fine-tuning the pre-trained CNN model, using contrast limited adaptive histogram equalization (CLAHE), the successive mean quantization transform (SMQT), the wavelet transform, or the Laplace operator will reduce the performance of CNN.

    Conclusions. Experiments show that in deep learning, using image enhancement algorithms may improve CNN performance when training complete CNN models, but not all image enhancement algorithms can improve CNN performance; in transfer learning, when fine-tuning the pre- trained CNN model, image enhancement algorithms may reduce the performance of CNN.
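Of the enhancement families mentioned, plain global histogram equalization (the non-adaptive ancestor of CLAHE) is simple enough to sketch in NumPy. This is an illustrative stand-in, not the thesis code:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[hist > 0][0]                 # first occupied bin
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    return lut.clip(0, 255).astype(np.uint8)[img]   # apply lookup table

# A low-contrast ramp confined to [100, 150] spreads to the full range.
img = np.tile(np.linspace(100, 150, 64, dtype=np.uint8), (64, 1))
out = equalize_histogram(img)
```

CLAHE applies the same mapping per tile with a clip limit on the histogram, which is what makes it adaptive.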

  • 27.
    Dan, Sjödahl
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Cascaded Voxel Cone-Tracing Shadows: A Computational Performance Study2019Independent thesis Basic level (degree of Bachelor), 10 poäng / 15 hpOppgave
    Abstract [en]

    Background. Real-time shadows in 3D applications have for decades been implemented with a solution called Shadow Mapping or some variant of it. This is a solution that is easy to implement and has good computational performance, nevertheless it does suffer from some problems and limitations. But there are newer alternatives and one of them is based on a technique called Voxel Cone-Tracing. This can be combined with a technique called Cascading to create Cascaded Voxel Cone-Tracing Shadows (CVCTS).

    Objectives. To measure the computational performance of CVCTS to get better insight into it and provide data and findings to help developers make an informed decision if this technique is worth exploring. And to identify where the performance problems with the solution lies.

    Methods. A simple implementation of CVCTS was implemented in OpenGL aimed at simulating a solution that could be used for outdoor scenes in 3D applications. It had several different parameters that could be changed. Then computational performance measurements were made with these different parameters set at different settings.

    Results. The data was collected and analyzed before drawing conclusions. The results showed several parts of the implementation that could potentially be very slow and why this was the case.

    Conclusions. The slowest parts of the CVCTS implementation were the voxelization and cone-tracing steps. It might be possible to use the thesis's CVCTS solution in, for example, a game if the settings are kept modest, but that is a stretch. Little time could be spent during the thesis on optimizing the solution, so it is possible that its performance could be increased.

  • 28.
    Dasari, Siva Krishna
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. Blekinge Institute of Technology.
    Tree Models for Design Space Exploration in Aerospace Engineering2019Licentiatavhandling, med artikler (Annet vitenskapelig)
    Abstract [en]

    A crucial issue in the design of aircraft components is the evaluation of a large number of potential design alternatives. This evaluation involves expensive procedures, which consequently slow down the search for optimal design samples. As a result, a scarce or small number of design samples with a high-dimensional parameter space and high non-linearity poses issues for the learning of surrogate models. Furthermore, surrogate models have more issues in handling qualitative (discrete) data than in handling quantitative (continuous) data. These issues bring the need for investigations of methods of surrogate modelling for the most effective use of available data.

     The thesis goal is to support engineers in the early design phase of development of new aircraft engines, specifically, a component of the engine known as Turbine Rear Structure (TRS). For this, tree-based approaches are explored for surrogate modelling for the purpose of exploration of larger search spaces and for speeding up the evaluations of design alternatives. First, we have investigated the performance of tree models on the design concepts of TRS. Second, we have presented an approach to explore design space using tree models, Random Forests. This approach includes hyperparameter tuning, extraction of parameters importance and if-then rules from surrogate models for a better understanding of the design problem. With this presented approach, we have shown that the performance of tree models improved by hyperparameter tuning when using design concepts data of TRS. Third, we performed sensitivity analysis to study the thermal variations on TRS and hence support robust design using tree models. Furthermore, the performance of tree models has been evaluated on mathematical linear and non-linear functions. The results of this study have shown that tree models fit well on non-linear functions. Last, we have shown how tree models support integration of value and sustainability parameters data (quantitative and qualitative data) together with TRS design concepts data in order to assess these parameters impact on the product life cycle in the early design phase.

     

  • 29.
    Dasari, Siva Krishna
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. Blekinge Institute of Technology.
    Cheddad, Abbas
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. Blekinge Institute of Technology.
    Andersson, Petter
    GKN Aerospace Engine Systems, SWE.
    Predictive Modelling to Support Sensitivity Analysis for Robust Design in Aerospace EngineeringInngår i: Artikkel i tidsskrift (Fagfellevurdert)
  • 30.
    Dasari, Siva Krishna
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Cheddad, Abbas
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Andersson, Petter
    GKN Aerospace Engine Systems, SWE.
    Random Forest Surrogate Models to Support Design Space Exploration in Aerospace Use-case2019Inngår i: IFIP Advances in Information and Communication Technology, Springer-Verlag New York, 2019, Vol. 559Konferansepaper (Fagfellevurdert)
    Abstract [en]

    In engineering, design analyses of complex products rely on computer simulated experiments. However, high-fidelity simulations can take significant time to compute. It is impractical to explore design space by only conducting simulations because of time constraints. Hence, surrogate modelling is used to approximate the original simulations. Since simulations are expensive to conduct, generally, the sample size is limited in aerospace engineering applications. This limited sample size, and also non-linearity and high dimensionality of data make it difficult to generate accurate and robust surrogate models. The aim of this paper is to explore the applicability of Random Forests (RF) to construct surrogate models to support design space exploration. RF generates meta-models or ensembles of decision trees, and it is capable of fitting highly non-linear data given quite small samples. To investigate the applicability of RF, this paper presents an approach to construct surrogate models using RF. This approach includes hyperparameter tuning to improve the performance of the RF's model, to extract design parameters' importance and if-then rules from the RF's models for better understanding of design space. To demonstrate the approach using RF, quantitative experiments are conducted with datasets of Turbine Rear Structure use-case from an aerospace industry and results are presented.
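The tune-then-inspect workflow can be sketched with scikit-learn. This is a hedged illustration: the objective function and parameter grid are invented stand-ins for the industrial simulation data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# A small, non-linear "simulation" sample: only the first two of four
# design parameters actually influence the response.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(120, 4))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.01 * rng.normal(size=120)

# Hyperparameter tuning of the random forest surrogate.
grid = {"n_estimators": [50, 200], "max_depth": [None, 5]}
search = GridSearchCV(RandomForestRegressor(random_state=0), grid, cv=3)
search.fit(X, y)
surrogate = search.best_estimator_

# Parameter importances, as read out for design-space understanding.
importances = surrogate.feature_importances_
```

The cheap surrogate can then be queried densely across the design space, and the importances point engineers at the parameters worth varying; rule extraction would walk the individual trees.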

  • 31.
    Fiati-Kumasenu, Albert
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Extracting Customer Sentiments from Email Support Tickets: A case for email support ticket prioritisation2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Background

    Daily, companies generate enormous amounts of customer support tickets which are grouped and placed in specialised queues, based on some characteristics, from where they are resolved by the customer support personnel (CSP) on a first-in-first-out basis. Given that these tickets require different levels of urgency, a logical next step to improving the effectiveness of the CSPs is to prioritise the tickets based on business policies. Among the several heuristics that can be used in prioritising tickets is sentiment polarity.

    Objectives

    This study investigates how machine learning methods and natural language techniques can be leveraged to automatically predict the sentiment polarity of customer support tickets.

    Methods

    Using a formal experiment, the study examines how well Support Vector Machine (SVM), Naive Bayes (NB) and Logistic Regression (LR) based sentiment polarity prediction models, built for product and movie reviews, can be used to make sentiment predictions on email support tickets. Due to the limited size of the annotated email support ticket set, the Valence Aware Dictionary and sEntiment Reasoner (VADER) and a cluster ensemble (using k-means, affinity propagation and spectral clustering) are investigated for making sentiment polarity predictions.

    Results

    Compared to NB and LR, SVM performs better, scoring an average f1-score of .71 whereas NB scores least with a .62 f1-score. SVM, combined with the presence vector, outperformed the frequency and TF-IDF vectors with an f1-score of .73 while NB records an f1-score of .63. Given an average f1-score of .23, the models transferred from the movie and product reviews performed inadequately even when compared with a dummy classifier with an f1-score average of .55. Finally, the cluster ensemble method outperformed VADER with an f1-score of .61 and .53 respectively.

    Conclusions

    Given the results, SVM, combined with a presence vector of bigrams and trigrams is a candidate solution for extracting sentiments from email support tickets. Additionally, transferring sentiment models from the movie and product reviews domain to the email support tickets is not possible. Finally, given that there exists a limited dataset for conducting sentiment analysis studies in the Swedish and the customer support context, a cluster ensemble is recommended as a sample selection method for generating annotated data.
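The recommended configuration, an SVM over a binary "presence" vector of bigrams and trigrams, can be sketched as follows. The toy tickets and labels are invented for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

tickets = [
    "thank you for the quick and helpful reply",
    "great support very happy with the service",
    "this is not working and I am very frustrated",
    "terrible experience nothing was resolved",
]
labels = ["positive", "positive", "negative", "negative"]

model = make_pipeline(
    CountVectorizer(binary=True, ngram_range=(2, 3)),  # presence of 2-3 grams
    LinearSVC(),
)
model.fit(tickets, labels)
pred = model.predict(["very happy with the quick reply"])[0]
```

`binary=True` records only whether an n-gram occurs, which is the presence vector the conclusions favor over frequency and TF-IDF weighting.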

  • 32.
    Fiedler, Markus
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för teknik och estetik.
    Zepernick, Hans-Juergen
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Kelkkanen, Viktor
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för teknik och estetik.
    Network-induced temporal disturbances in virtual reality applications2019Inngår i: 2019 11th International Conference on Quality of Multimedia Experience, QoMEX 2019, Institute of Electrical and Electronics Engineers Inc. , 2019Konferansepaper (Fagfellevurdert)
    Abstract [en]

    Virtual Reality (VR) applications put high demands on software and hardware in order to enable an immersive experience for the user and avoid causing simulator sickness. As soon as networks become part of the Motion-To-Photon (MTP) path between rendering and display, there is a risk for extraordinary delays that may impair Quality of Experience (QoE). This short paper provides an overview of latency measurements and models that are applicable to the MTP path, complemented by demands on user and network levels. It specifically reports on freeze duration measurements using a commercial TPCAST wireless VR solution, and identifies a corresponding stochastic model of the freeze length distribution, which may serve as disturbance model for VR QoE studies. © 2019 IEEE.

  • 33.
    Floderus, Sebastian
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Rosenholm, Linus
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    An educational experiment in discovering spear phishing attacks2019Independent thesis Basic level (degree of Bachelor), 10 poäng / 15 hpOppgave
    Abstract [en]

    Background: Spear phishing attacks use social engineering targeting a specific person to steal credential information or infect the user's computer with malware. They are often carried out through emails and it can be very hard to spot the difference between a legitimate email and a scam email. Cybercrime is a growing problem and there are many ways to inform and educate individuals on the subject.

    Objectives: This study intends to perform an experiment to see if an educational support tool can be used to better identify phishing emails, and furthermore to see if there is a difference in susceptibility between students from different university programs.

    Methods: A qualitative research study was used to get the necessary understanding of how to properly develop a phishing educational tool. A pretest-posttest experiment is done to see if there is an improvement in results between an experimental group that received the education and a control group that did not.

    Results: The results show an overall higher score for the technical program compared to the non-technical one. Comparing the pretest with the posttest shows an increase in score for the non-technical program and a decrease in score for the technical program. Furthermore, 58% of the non-technical students who started the test did not complete it.

    Conclusions: There is a noticeable difference between the programs in students' susceptibility to scam emails. However, further research is needed in order to explore to what extent the education process had an impact.

  • 34.
    Folino, Emil
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Självrättande programmeringstenta2019Rapport (Annet vitenskapelig)
    Abstract [sv]

    How can we best examine fundamental programming skills in an introductory programming course? We created a self-correcting examination format in which students can receive feedback during the exam, and increased student throughput by 20%.
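The report above describes a self-correcting exam in which students get feedback while the exam is running. As an illustration only (the report does not specify its implementation; the function names and the `is_even` task here are hypothetical), such a grader can be sketched as assertion-style test cases run against a submitted function, returning a score and feedback strings:

```python
def grade(student_fn, cases):
    """Run (args, expected) cases against a submitted function.

    Returns a score in [0, 1] and a list of feedback strings that
    could be shown to the student during the exam.
    """
    feedback, passed = [], 0
    for args, expected in cases:
        try:
            got = student_fn(*args)
        except Exception as exc:  # student code may raise
            feedback.append(f"{args}: raised {type(exc).__name__}")
            continue
        if got == expected:
            passed += 1
        else:
            feedback.append(f"{args}: expected {expected!r}, got {got!r}")
    return passed / len(cases), feedback


# Hypothetical exam task: implement is_even.
def student_answer(n):
    return n % 2 == 0

score, notes = grade(student_answer, [((2,), True), ((3,), False), ((0,), True)])
```

A correct submission yields a full score and an empty feedback list; a wrong one gets per-case hints it can act on before resubmitting.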

  • 35.
    Fransson, Jonatan
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Hiiriskoski, Teemu
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Measuring Immersion and Enjoyment in a 2D Top-Down Game by Replacing the Mouse Input with Eye Tracking2019Independent thesis Basic level (degree of Bachelor), 10 poäng / 15 hpOppgave
    Abstract [en]

    Background. Eye tracking has been evaluated and tried in different 2D settings for research purposes. Most commercial games that use eye tracking use it as an assistive extra input method and are focused around a third- or first-person perspective. Few 2D games are developed with eye tracking as an input method. This thesis tests eye tracking as a replacement for the mouse, using a chosen set of mechanics, as the main input method for playing a 2D top-down game.

    Objectives. To test eye tracking in a 2D top-down game and use it as a replacement input method for the mouse in a novel effort to evaluate immersion and enjoyment.

    Method. To conduct this study the Tobii 4C eye tracker is used as the replacement peripheral in a 2D game prototype developed for the study. The game prototype is developed with the Unity game engine which the participants played through twice with a different input mode each time. Once with a keyboard and mouse and a second time with a keyboard and an eye tracker. The participants played different modes in alternating order to not sway the results. For the game prototype three different mechanics were implemented, to aim, search for hidden items and remove shadows. To measure immersion and enjoyment an experiment was carried out in a controlled manner, letting participants play through the game prototype and evaluating their experience. To evaluate the experience the participants answered a questionnaire with 12 questions relating to their perceived immersion and a small interview with 5 questions about their experience and perceived enjoyment. The study had a total of 12 participants.

    Results. The results from the data collected through the experiment indicate that the participants enjoyed the game and felt more involved in it, with 10 out of 12 participants answering that they felt more involved using eye tracking compared to the mouse. Analyzing the interviews, the participants stated that eye tracking made the game more difficult and less natural to control compared to the mouse. There is a potential problem that might sway the results toward eye tracking: most participants stated that eye tracking was a new experience, and none of the participants had used it to play video games before.

    Conclusions. The results from the questionnaire support the hypothesis, with a p-value of 0.02 (< 0.05) for both increased involvement and enjoyment using eye tracking, although the result might be biased by the participants' inexperience with eye tracking in video games. Most of the participants reacted positively towards eye tracking, the most common reason being that it was a new experience for them.
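The abstract does not name the statistical test behind its p-value of 0.02. One reading consistent with the reported numbers (an assumption on my part, not stated in the thesis) is a one-sided sign test: if 10 of 12 participants preferred eye tracking, the probability of at least 10 successes under a fair coin is about 0.019:

```python
from math import comb

def sign_test_one_sided(successes, n):
    """One-sided sign test: P(X >= successes) for X ~ Binomial(n, 0.5)."""
    return sum(comb(n, k) for k in range(successes, n + 1)) / 2 ** n

# 10 of 12 participants in favour -> (C(12,10) + C(12,11) + C(12,12)) / 2^12
p = sign_test_one_sided(10, 12)  # 79/4096, approximately 0.019
```

Rounded to two decimals this gives the 0.02 reported above; other paired tests could of course produce a similar value.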

  • 36.
    García Martín, Eva
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Rodrigues, Crefeda Faviola
    University of Manchester, GBR.
    Riley, Graham
    University of Manchester, GBR.
    Grahn, Håkan
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Estimation of energy consumption in machine learning2019Inngår i: Journal of Parallel and Distributed Computing, ISSN 0743-7315, E-ISSN 1096-0848, Vol. 134, s. 75-88Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Energy consumption has been widely studied in the computer architecture field for decades. While the adoption of energy as a metric in machine learning is emerging, the majority of research is still primarily focused on obtaining high levels of accuracy without any computational constraint. We believe that one of the reasons for this lack of interest is a lack of familiarity with approaches to evaluate energy consumption. To address this challenge, we present a review of the different approaches to estimate energy consumption in general and in machine learning applications in particular. Our goal is to provide useful guidelines to the machine learning community, giving them the fundamental knowledge to use and build specific energy estimation methods for machine learning algorithms. We also present the latest software tools that give energy estimation values, together with two use cases that enhance the study of energy consumption in machine learning.
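One common software-based estimation approach of the kind such reviews cover is reading cumulative hardware energy counters before and after a workload. As a sketch (choosing Intel RAPL as the example, which Linux exposes under /sys/class/powercap; the path varies per machine, and the counter wraps around at max_energy_range_uj):

```python
def energy_delta_uj(before_uj, after_uj, max_range_uj):
    """Energy consumed between two cumulative RAPL readings, in
    microjoules, accounting for at most one counter wraparound."""
    if after_uj >= before_uj:
        return after_uj - before_uj
    return max_range_uj - before_uj + after_uj

# Reading the counter itself (Linux with Intel RAPL; path is an example):
# with open("/sys/class/powercap/intel-rapl:0/energy_uj") as f:
#     reading_uj = int(f.read())
```

Sampling the counter around a training run and dividing the delta by elapsed time gives average power, one of the metrics discussed in the article.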

  • 37.
    García-Martín, Eva
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Energy Efficiency in Machine Learning: Approaches to Sustainable Data Stream Mining2020Doktoravhandling, med artikler (Annet vitenskapelig)
    Abstract [en]

    Energy efficiency in machine learning explores how to build machine learning algorithms and models with low computational and power requirements. Although energy consumption is starting to gain interest in the field of machine learning, the majority of solutions still focus on obtaining the highest predictive accuracy, without a clear focus on sustainability.

    This thesis explores green machine learning, which builds on green computing and computer architecture to design sustainable and energy efficient machine learning algorithms. In particular, we investigate how to design machine learning algorithms that automatically learn from streaming data in an energy efficient manner.

    We first illustrate how energy can be measured in the context of machine learning, in the form of a literature review and a procedure to create theoretical energy models. We use this knowledge to analyze the energy footprint of Hoeffding trees, presenting an energy model that maps the number of computations and memory accesses to the main functionalities of the algorithm. We also analyze the hardware events correlated to the execution of the algorithm, its functions, and its hyperparameters.

    The final contribution of the thesis is showcased by two novel extensions of Hoeffding tree algorithms, the Hoeffding tree with nmin adaptation and the Green Accelerated Hoeffding Tree. These solutions are able to reduce their energy consumption by twenty and thirty percent, respectively, with minimal effect on accuracy. This is achieved by setting an individual splitting criterion for each branch of the decision tree, spending more energy on the fast-growing branches and saving energy on the rest.

    This thesis shows the importance of evaluating energy consumption when designing machine learning algorithms, proving that we can design more energy efficient algorithms and still achieve competitive accuracy results.
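The Hoeffding trees discussed above split a node once the observed merit gap between the two best attributes exceeds the Hoeffding bound; nmin-style adaptations save energy by checking this condition only every nmin examples instead of after each one. The bound itself is the standard formula (a sketch, not the thesis code):

```python
from math import sqrt, log

def hoeffding_bound(value_range, delta, n):
    """epsilon = sqrt(R^2 * ln(1/delta) / (2n)): with probability
    1 - delta, the true mean of n observations of a variable with
    range R lies within epsilon of the sample mean."""
    return sqrt(value_range ** 2 * log(1.0 / delta) / (2.0 * n))

def should_split(g_best, g_second, value_range, delta, n):
    # Split when the merit gap between the two best attributes
    # exceeds the Hoeffding bound for the n examples seen so far.
    return (g_best - g_second) > hoeffding_bound(value_range, delta, n)
```

Because epsilon shrinks as 1/sqrt(n), waiting nmin examples between checks skips evaluations that could not yet trigger a split, which is where the energy savings come from.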

  • 38.
    Ginka, Anusha
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Salapu, Venkata Satya Sameer
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Optimization of Packet Throughput in Docker Containers2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Container technology has gained popularity in recent years, mainly because it enables a fast and easy way to package, distribute and deploy applications and services. Latency and throughput have a high impact on user satisfaction in many real-time, critical and large-scale online services. Although the use of microservices architecture in cloud-native applications has enabled advantages in terms of application resilience, scalability, fast software delivery and the use of minimal resources, the packet processing rates are not correspondingly higher. This is mainly due to the overhead imposed by the design and architecture of the network stack. Packet processing rates can be improved by making changes to the network stack and without necessarily adding more powerful hardware.

    In this research, a study of various high-speed packet processing frameworks is presented, and a software high-speed packet I/O solution, as hardware-agnostic as possible, is identified to improve packet throughput in container technology. The proposed solution is selected based on whether it involves making changes to the underlying hardware or not. The proposed solution is then evaluated in terms of packet throughput for different container networking modes. A comparison of the proposed solution with a simple UDP client-server application is also presented for the different container networking modes. From the results obtained, it is concluded that the packet mmap client-server application has higher performance than the simple UDP client-server application.
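The "simple UDP client-server" baseline mentioned above can be sketched as a loopback micro-benchmark (my own toy stand-in, not the thesis testbed, which measured across container networking modes on separate endpoints): send a burst of datagrams, count how many the receiver drains, and report the offered packet rate.

```python
import socket
import threading
import time

def udp_loopback_benchmark(n_packets=1000, payload=b"x" * 512):
    """Count how many UDP datagrams survive a loopback send burst.
    Real measurements would place client and server in separate
    containers or hosts; this only illustrates the baseline's shape."""
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))          # let the OS pick a free port
    server.settimeout(0.2)                 # stop after an idle period
    addr = server.getsockname()

    received = 0
    def recv_loop():
        nonlocal received
        while True:
            try:
                server.recvfrom(2048)
                received += 1
            except socket.timeout:
                return

    t = threading.Thread(target=recv_loop)
    t.start()
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.perf_counter()
    for _ in range(n_packets):
        client.sendto(payload, addr)
    elapsed = time.perf_counter() - start
    t.join()
    client.close()
    server.close()
    return received, n_packets / elapsed   # delivered count, offered pps

delivered, offered_pps = udp_loopback_benchmark()
```

The gap between packets offered and packets delivered hints at the socket-buffer and per-packet syscall overheads that frameworks like packet mmap (PACKET_MMAP ring buffers) are designed to reduce.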

  • 39.
    Goswami, Prashant
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. BTH.
    Interactive animation of single-layer cumulus clouds using cloudmap2019Inngår i: Eurographics Proceedings STAG: Smart Tools and Applications in Graphics (2019) / [ed] M. Agus, M. Corsini and R. Pintus, Eurographics - European Association for Computer Graphics, 2019Konferansepaper (Fagfellevurdert)
    Abstract [en]

    In this paper, we present a physics-driven procedural method for the interactive animation of realistic, single-layered cumulus clouds at landscape scale. Our method employs coarse units called parcels for the physics simulation and achieves procedural micro-level volumetric amplification based on the macro physics parameters. However, contrary to previous methods, which achieve amplification directly inside the parcels, we make use of a two-dimensional texture called a cloud map to this end. This not only improves the shape and distribution of the cloud cover over the landscape but also boosts the animation efficiency significantly, allowing the overall approach to run at high frame rates, which is verified by the experiments presented in the paper.

  • 40.
    Goswami, Prashant
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. BTH.
    Markowicz, Christian
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Hassan, Ali
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Real-time particle-based snow simulation on the GPU2019Inngår i: Eurographics Symposium on Parallel Graphics and Visualization / [ed] Hank Childs and Stefan Frey, Porto: Eurographics - European Association for Computer Graphics, 2019Konferansepaper (Fagfellevurdert)
    Abstract [en]

    This paper presents a novel real-time particle-based method for simulating snow on the GPU. Our method captures compression and bonding between snow particles, and incorporates thermodynamics to model the realistic behavior of snow. The presented technique is computationally inexpensive, and is capable of supporting rendering in addition to physics simulation at high frame rates. The method is completely parallel and is implemented using CUDA. High efficiency and its simplicity make our method an ideal candidate for integration in existing game SDK frameworks.

  • 41.
    Gummesson, Simon
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Johnson, Mikael
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Parallel Construction of Local Clearance Triangulations2019Independent thesis Advanced level (degree of Master (One Year)), 20 poäng / 30 hpOppgave
    Abstract [en]

    The usage of navigation meshes for path planning in games and other domains is a common approach. One type of navigation mesh that has recently been developed is the Local Clearance Triangulation (LCT). The overall aim of the LCT is to construct a triangulation in such a way that a property called the Local Clearance can be used to calculate a path in a more efficient and cheap way. At the time of writing this thesis there exists only one solution that creates an LCT, and this solution only uses the CPU. Since the process of creating an LCT involves the insertion of many points and edge flips which only affect a local area, it is interesting to investigate the potential performance gain of using the GPU.

    The objective of the thesis is to develop a GPU version based on the current CPU LCT solution and to investigate in which cases the proposed GPU algorithm performs better.

    A GPU version and a CPU version of the proposed algorithm have been developed to measure the performance gain of using the GPU; there are no algorithmic differences between these versions. To measure the performance of the algorithm two tests have been constructed: the first test is called the Object Insertion test and measures the time it takes to build an LCT using generated test maps; the second test is called the Internal test and measures the internal performance of the algorithm. A comparison of the GPU algorithm with an LCT library called Triplanner was also done.

    The proposed algorithm performed better on larger maps when implemented on a GPU compared to a CPU implementation of the algorithm. The GPU performance compared to the Triplanner was faster on some of the larger maps.

    An algorithm that builds an LCT from scratch is presented. The results show that using the proposed algorithm on the GPU substantially increases the performance of the algorithm compared to implementing it on a CPU.

  • 44.
    Guo, Yang
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Heterogeneous Knowledge Sharing in eHealth: Modeling, Validation and Application2019Doktoravhandling, med artikler (Annet vitenskapelig)
    Abstract [en]

    Knowledge sharing has become an important issue in the eHealth field for improving the quality of healthcare services. However, since eHealth is a multidisciplinary and cross-organizational area, knowledge sharing is a serious challenge when it comes to developing eHealth systems. Thus, this thesis studies heterogeneous knowledge sharing in eHealth and proposes a knowledge sharing ontology. The study consists of three main parts: modeling, validation and application.

    In the modeling part, knowledge sharing in eHealth is studied from two main aspects: the first aspect is the heterogeneous knowledge of different healthcare actors, and the second aspect is the interactivities among various healthcare actors. In this part, the contribution is to propose an Activity Theory based Ontology (ATO) model to highlight and represent these two aspects of eHealth knowledge sharing, which is helpful for designing efficient eHealth systems.

    In the validation part, a questionnaire based survey is conducted to practically validate the feasibility of the proposed ATO model. The survey results are analyzed to explore the effectiveness of the proposed model for designing efficient knowledge sharing in eHealth. Further, a web based software prototype is constructed to validate the applicability of the ATO model for practical eHealth systems. In this part, the contribution is to explore and show how the proposed ATO model can be validated.

    In the application part, the importance and usefulness of applying the proposed ATO model to solve two real problems are addressed. These two problems are healthcare decision making and appointment scheduling. There is a similar basic challenge in both these problems: a healthcare provider (e.g., a doctor) needs to provide optimal healthcare service (e.g., suitable medicine or fast treatment) to a healthcare receiver (e.g., a patient). Here, the optimization of the healthcare service needs to be achieved in accordance with eHealth knowledge which is distributed in the system and needs to be shared, such as the doctor’s competence, the patient’s health status, and priority control on patients’ diseases. In this part, the contribution is to propose a smart system called eHealth Appointment Scheduling System (eHASS) based on ATO model.

    This research work has been presented in eight conference and journal papers, which, along with an introductory chapter, are included in this compilation thesis.

  • 45.
    Guo, Yang
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap. Blekinge institute of Technology.
    Yao, Yong
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    On Performance of Prioritized Appointment Scheduling for Healthcare2019Inngår i: Journal of Service Science and Management, ISSN 1940-9893, E-ISSN 1940-9907, Vol. 12, s. 589-604Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Designing appointment scheduling is a challenging task in the development of healthcare systems. An efficient solution approach can provide high-quality healthcare service between care providers (CPs) and care receivers (CRs). In this paper, we consider a healthcare system with heterogeneous CRs, in terms of urgent and routine CRs. Our model assumes that the system gives service priority to the urgent CRs by allowing them to interrupt ongoing routine appointments. An appointment handoff scheme is suggested for the interrupted routine appointments, so that the routine CRs can attempt to re-establish the appointment scheduling with other available CPs. With these considerations, we study the scheduling performance of the system using a Markov chain based modeling approach. A numerical analysis is reported and a simulation experiment is conducted to validate the numerical results.
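The preemption policy described above (urgent CRs interrupt ongoing routine appointments, which then retry via handoff) can be illustrated with a toy slot-based simulation. This is my own sketch of the policy's behaviour, not the paper's Markov chain model: a single CP, fixed service time, and illustrative arrival probabilities.

```python
import random

def simulate(n_slots=10_000, p_urgent=0.1, p_routine=0.3,
             service_slots=3, seed=1):
    """Toy single-CP simulation: an arriving urgent care receiver
    preempts an ongoing routine appointment (whose CR would then
    retry via the handoff scheme)."""
    rng = random.Random(seed)
    kind, remaining = None, 0          # current appointment, slots left
    stats = {"urgent_served": 0, "routine_served": 0,
             "preempted": 0, "blocked": 0}
    for _ in range(n_slots):
        if rng.random() < p_urgent:    # urgent arrival
            if kind == "routine":
                stats["preempted"] += 1            # interrupt routine CR
                kind, remaining = "urgent", service_slots
            elif kind is None:
                kind, remaining = "urgent", service_slots
            else:
                stats["blocked"] += 1              # urgent already in service
        if rng.random() < p_routine:   # routine arrival
            if kind is None:
                kind, remaining = "routine", service_slots
            else:
                stats["blocked"] += 1
        if kind is not None:           # advance the ongoing appointment
            remaining -= 1
            if remaining == 0:
                stats[kind + "_served"] += 1
                kind = None
    return stats

stats = simulate()
```

Varying p_urgent shows the trade-off the paper studies analytically: urgent CRs see low waiting at the cost of interrupted routine appointments.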

  • 46.
    Gurram, Karthik
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Chappidi, Maheshwar Reddy
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    A Search-Based Approach for Robustness Testing of Web Applications2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Context: This thesis deals with robustness testing of web applications on different web browsers, using a Selenium WebDriver to automate the browser. To increase the efficiency of this automated testing, we use a robustness method. Robustness testing is the process of testing the behaviour of a system implementation under exceptional execution conditions, to check whether it still fulfils its robustness requirements. Such robustness tests often apply random algorithms to select the actions to be executed on web applications. A search-based technique was used to automatically generate effective test cases, consisting of initial conditions and fault sequences. The success criterion in most cases is: if the application does not crash or hang, then it is robust.

    Problem: Software testing consumes a lot of time, is labour-intensive in writing test cases, and is expensive in the software development life cycle. There has always been a need to decrease testing time. Manual testing requires a lot of effort and hard work when measured in person-months [1]. To overcome this problem, we use a search-based approach for robustness testing of web applications, which can dramatically reduce the human effort, time and costs related to testing.

    Objective: The purpose of this thesis is to develop an automated approach to robustness testing of web applications, focusing on revealing defects related to sequences of events triggered by a web system. To do so, we employ search-based techniques (e.g., the NSGA-II algorithm [1]). The main focus is on Ericsson Digital BSS systems, with special focus on robustness testing. The main purpose of this master thesis is to investigate how automated robustness testing can be done so that the effort of keeping the tests up to date is minimized when the functionality of the application changes. This kind of test automation depends strongly on the structure of the product being tested. In this thesis, the test object was structured in a way that made the testing method simple for fault revelation and less time-consuming.

    Method: A meta-heuristic search-based genetic algorithm is used to make robustness testing of the web application efficient. To evaluate the effectiveness of the proposed approach, an experimental procedure is adopted and an experimental testbed is set up. The effectiveness of the proposed approach is measured by two objectives: fault revelation and test sequence length. The effectiveness is also measured by evaluating the cost-effectiveness of the generated test cases. Results: The results collected from our approach show that by reducing the test sequence length we can reduce time consumption, and by using the NSGA-II algorithm we found as many faults as possible when testing web applications at Ericsson.

    Conclusion: The attempt at robustness testing of web applications partly succeeded. The robustness testing in our approach depends strongly on the algorithm used. We conclude that by using these two objectives we can reduce both the cost and the time consumption of testing.
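The two objectives above (maximize faults revealed, minimize test sequence length) are exactly what NSGA-II's non-dominated sorting ranks. As a sketch of that core idea only (not NSGA-II itself, and with a hypothetical fault oracle standing in for the web application under test), one can Pareto-filter randomly generated event sequences:

```python
import random

def pareto_front(candidates):
    """Non-dominated set for (maximize faults, minimize length): drop a
    candidate if another finds at least as many faults with a
    shorter-or-equal sequence and is strictly better in one objective."""
    front = []
    for c in candidates:
        dominated = any(
            o["faults"] >= c["faults"] and o["length"] <= c["length"]
            and (o["faults"] > c["faults"] or o["length"] < c["length"])
            for o in candidates
        )
        if not dominated:
            front.append(c)
    return front

def toy_oracle(seq):
    """Hypothetical fault model: a fault is revealed whenever 'submit'
    directly follows 'clear' (a stand-in for the real application)."""
    return sum(1 for a, b in zip(seq, seq[1:]) if (a, b) == ("clear", "submit"))

rng = random.Random(0)
events = ["click", "type", "clear", "submit"]
population = []
for _ in range(50):
    seq = [rng.choice(events) for _ in range(rng.randint(2, 12))]
    population.append({"faults": toy_oracle(seq), "length": len(seq)})

front = pareto_front(population)
```

NSGA-II adds crowding distance, selection, crossover and mutation on top of this ranking; the front here just shows the fault-count versus sequence-length trade-off the thesis optimizes.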

  • 47.
    Gustafsson, Jacob
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Törnkvist, Adam
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Secure handling of encryption keys for small businesses: A comparative study of key management systems2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Background: A recent study shows that key management in the corporate world is very painful due to, among other reasons, a lack of knowledge and resources. Instead, some companies embed the encryption keys and other software secrets directly in the source code of the application that uses them, introducing the risk of exposing the secrets. Today, there are multiple systems for managing keys. However, it can be hard to pick a suitable one.

    Objectives: The objectives of the thesis are to identify available key management systems for securing secrets in software, evaluate their eligibility to be used by small businesses based on various attributes and recommend a best practice to configure the most suited system for managing software secrets.

    Methods: Key management systems are identified through an extensive search, using both scientific and non-scientific search engines. Identified key management systems were compared against a set of requirements created from a small business perspective. The systems that fulfilled the requirements were implemented and comprehensively evaluated through SWOT analyses based on various attributes. Each system was then scored and compared against each other based on these attributes. Lastly, a best practice guide for the most suitable key management system was established.

    Results: During the thesis, a total of 54 key management systems with various features and purposes were identified. Out of these 54 systems, five key management systems were comprehensively compared: Pinterest Knox, HashiCorp Vault, Square Keywhiz, OpenStack Barbican, and CyberArk Conjur. Out of these five, HashiCorp Vault was deemed to be the most suitable system for small businesses.

    Conclusions: There is currently a broad selection of key management systems available. Their quality, price, and intended use vary, which makes it time-consuming to identify the system best suited to one's needs. The thesis concludes that HashiCorp Vault is the most suitable system based on the needs presented. However, the thesis can also be used as a guideline by businesses with other needs, to aid in the problem of choosing a key management system.
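The abstract says each system was scored and compared on a set of attributes derived from SWOT analyses. A weighted-sum scoring of that kind can be sketched as follows; the attribute names, weights, and 1-5 scores below are purely illustrative, not the thesis's actual evaluation data.

```python
def score(system, weights):
    """Weighted sum of a system's attribute scores."""
    return sum(weights[attr] * system[attr] for attr in weights)

# Hypothetical weights summing to 1 and hypothetical 1-5 scores.
weights = {"documentation": 0.3, "ease_of_setup": 0.2,
           "features": 0.3, "community": 0.2}
systems = {
    "Vault":   {"documentation": 5, "ease_of_setup": 3,
                "features": 5, "community": 5},
    "Keywhiz": {"documentation": 3, "ease_of_setup": 2,
                "features": 3, "community": 2},
}
ranked = sorted(systems, key=lambda name: score(systems[name], weights),
                reverse=True)
```

Making the weights explicit is what lets another business with different needs rerun the same comparison, as the conclusion suggests.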

  • 48.
    Heiding, John
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Increasing Phenotype Diversity In Terrain Generation Using Fourier Transform: Implementation of Fourier transform as an intermediate phenotype for genetic algorithms2019Independent thesis Advanced level (degree of Master (Two Years)), 20 poäng / 30 hpOppgave
    Abstract [en]

    Context. Creating resources for games and 3D environments is an effort-consuming process. Some look to procedural algorithms to aid in this endeavour, but the effort to configure the algorithms can be time-consuming in itself. This paper continues from a set of papers by Frade et al., in which the process of configuration is surrendered to the algorithm by using genetic optimization together with a set of fitness functions. This is then tested on procedural generation of height maps. Objectives. The original algorithm utilizes a tree of functions that generates height maps using genetic optimization and a set of fitness functions. The output of the original algorithm is highly dependent on a specific noise function. This paper investigates whether the inverse Fourier transform can be used as an intermediate phenotype in order to decrease the coupling between the set of functions in the algorithm and the types of output. Methods. A reference implementation was first produced and verified. The Fourier transform was then added to the algorithm as an intermediate phenotype, together with improvements on the original algorithm. The new algorithm was then put to the test in five experiments, where the output was compared with the reference implementation using manual review. Results. The implementation of the Fourier transform attempted in this paper exclusively produced noisy output. Conclusions. The modified algorithm did not produce viable output, most likely due to the behaviour of the Fourier transform in itself and in relation to the implementation of the fitness calculation.
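The "intermediate phenotype" idea above means the genotype holds frequency coefficients and the rendered terrain is their inverse Fourier transform. A minimal one-dimensional sketch (my own illustration, not the thesis code, and with an assumed 1/f weighting of random coefficients) looks like this:

```python
import cmath
import random

def inverse_dft(coeffs, n_samples):
    """Inverse discrete Fourier transform: turn complex frequency
    coefficients (the genotype) into real height samples (the phenotype).
    x[t] = (1/N) * Re( sum_k c[k] * exp(2*pi*i*k*t/N) )."""
    N = len(coeffs)
    return [
        sum(c * cmath.exp(2j * cmath.pi * k * t / N)
            for k, c in enumerate(coeffs)).real / N
        for t in range(n_samples)
    ]

# 1/f-weighted random coefficients give a smoother profile; a genetic
# algorithm would evolve these coefficients instead of drawing them.
rng = random.Random(42)
genotype = [complex(rng.uniform(-1, 1), rng.uniform(-1, 1)) / (k + 1)
            for k in range(16)]
heights = inverse_dft(genotype, 16)
```

The appeal is that any coefficient vector yields a valid height profile, so evolution is decoupled from a specific noise function; the thesis's negative result suggests the fitness functions interacted badly with this representation.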

  • 49.
    Henesey, Lawrence
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Lizneva, Yulia
    Student.
    Anwar, Mahwish
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    A multi-agent system with blockchain for container stacking and dispatching.2019Inngår i: 21st International Conference on Harbor, Maritime and Multimodal Logistics Modeling and Simulation, HMS 2019, Dime University of Genoa , 2019, s. 79-87Konferansepaper (Fagfellevurdert)
    Abstract [en]

    Port logistical supply chains play a very important role in society. Their complex and adaptive behaviours motivate the suggested application of combining a multi-agent system with blockchain for solving complex problems. Several technologies have been proven to work in logistics; however, combining converging technologies such as blockchain with deep reinforcement learning in a multi-agent setting is viewed as a novel approach to managing the complexity associated with many facets of logistics. A simulator was developed and tested on the problem of container stacking. The simulation results indicate a more robust approach than currently used tools and methods.

  • 50.
    Isenstierna, Tobias
    et al.
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Popovic, Stefan
    Blekinge Tekniska Högskola, Fakulteten för datavetenskaper, Institutionen för datavetenskap.
    Computer systems in airborne radar: Virtualization and load balancing of nodes. 2019. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Thesis
    Abstract [en]

    Introduction. For hardware used in radar systems of today, technology is evolving at an increasing rate. For existing software in radar systems that relies on specific drivers or hardware, this quickly becomes a problem. When the required hardware is no longer produced or becomes outdated, compatibility problems emerge between the new hardware and the existing software. This research focuses on exploring whether virtualization technology can help solve this problem. Would it be possible to address the compatibility problem with the help of hypervisor solutions, while also maintaining high performance?

    Objectives. The aim of this research is to explore virtualization technology, with focus on hypervisors, to improve the way that hardware and software cooperate within a radar system. The research will investigate whether it is possible to solve compatibility problems between new hardware and already existing software, while also analysing the performance of virtual solutions compared to non-virtualized ones.

    Methods. The proposed method is an experiment where the two hypervisors Xen and KVM will be analysed. The hypervisors will be running on two different systems. A native environment with similarities to a radar system will be built and then compared with the same system with hypervisor solutions applied. Research in the area of virtualization will be conducted with focus on security, hypervisor features and compatibility.

    Results. The results present a proposed virtual environment setup with the hypervisors installed. To address the compatibility issue, an old operating system was used to demonstrate that the implemented virtualization works. Finally, performance results are presented for the native environment compared against a virtual environment.

    Conclusions. From results gathered with benchmarks, we can see that individual performance may vary, which is to be expected when running on different hardware. A virtual setup was built, including the Xen and KVM hypervisors, together with NAS communication. By running an old operating system as a virtual guest, compatibility between software and hardware was demonstrated using KVM as the virtual solution. From the results gathered, KVM seems like a good solution to investigate further.
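    The thesis does not list its exact invocation, but running a legacy guest under KVM is commonly done through QEMU. The following is a configuration-style sketch under assumed defaults; the image name `legacy-os.qcow2` and the memory size are placeholders, not values from the thesis.

    ```shell
    # Hypothetical example: boot a legacy guest image under KVM via QEMU.
    # Requires hardware virtualization support and the qemu-system package.
    qemu-system-x86_64 \
      -enable-kvm \
      -m 1024 \
      -cpu host \
      -drive file=legacy-os.qcow2,format=qcow2 \
      -nographic
    ```

    Running the same guest with and without `-enable-kvm` is one simple way to observe the performance gap between hardware-assisted virtualization and pure emulation.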
