  • 101.
    Badampudi, Deepika
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Reporting Ethics Considerations in Software Engineering Publications. 2017. In: 11TH ACM/IEEE INTERNATIONAL SYMPOSIUM ON EMPIRICAL SOFTWARE ENGINEERING AND MEASUREMENT (ESEM 2017), IEEE, 2017, p. 205-210. Conference paper (Refereed)
    Abstract [en]

    Ethical guidelines of software engineering journals require authors to provide statements related to conflicts of interest and the process of obtaining consent (if human subjects are involved). The objective of this study is to review the reporting of ethical considerations in Empirical Software Engineering - An International Journal. The results indicate that two out of seven studies reported some ethical information, although not explicitly. The ethical discussions focused on anonymity and confidentiality. Ethical aspects such as competence, comprehensibility, and vulnerability of the subjects were not discussed in any of the papers reviewed in this study. It is important not only to state that consent was obtained; the procedure of obtaining consent should also be reported, to improve accountability and trust.

  • 102.
    Badampudi, Deepika
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Towards decision-making to choose among different component origins. 2016. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Context: The amount of software in solutions provided in various domains is continuously growing. These solutions are a mix of hardware and software, often referred to as software-intensive systems. Companies seek to improve the software development process to avoid delays or cost overruns.

    Objective: The overall goal of this thesis is to improve the software development/building process so as to provide timely, high-quality, and cost-efficient solutions. The objective is to select the origin of the components (in-house, outsourcing, components off-the-shelf (COTS), or open source software (OSS)) that facilitates this improvement. A system can be built of components from one origin or from a combination of two or more (or even all) origins. Selecting a proper origin for a component is important to get the most out of the component and to optimize the development.

    Method: Investigating the component origins is necessary in order to make decisions when selecting among them. We conducted a case study to explore the existing challenges in software development. The next step was to identify factors that influence the choice among different component origins, through a systematic literature review using a snowballing (SB) strategy and a database (DB) search. Furthermore, a Bayesian synthesis process is proposed to integrate the evidence from the literature into practice.

    Results: The results of this thesis indicate that contextual factors of software-intensive systems, such as domain regulations, hinder software development improvement. In addition to in-house development, alternative component origins (outsourcing, COTS, and OSS) are being used for software development. Several factors, such as time, cost, and license implications, influence the selection of component origins. Solutions have been proposed to support the decision-making; however, these solutions consider only a subset of the factors identified in the literature.

    Conclusions: Each component origin has some advantages and disadvantages. Depending on the scenario, one component origin is more suitable than the others. It is important to investigate the different scenarios and suitability of the component origins, which is recognized as future work of this thesis. In addition, the future work is aimed at providing models to support the decision-making process.

  • 103.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Petersen, Kai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Software Component Decision-making: In-house, OSS, COTS or Outsourcing: A Systematic Literature Review. 2016. In: Journal of Systems and Software, ISSN 0164-1212, E-ISSN 1873-1228, Vol. 121, p. 105-124. Article in journal (Refereed)
    Abstract [en]

    Component-based software systems require decisions on component origins for acquiring components. A component origin is an alternative source from which a component can be obtained. Objective: To identify factors that could influence the decision to choose among different component origins, and solutions for decision-making (for example, optimization), in the literature. Method: A systematic review of peer-reviewed literature has been conducted. Results: In total, we included 24 primary studies. The component origins compared were mainly in-house vs. COTS and COTS vs. OSS. We identified 11 factors affecting or influencing the decision to select a component origin. When component origins were compared, there was little evidence on the relative (either positive or negative) effect of a component origin on the factors. Most of the solutions were proposed for in-house vs. COTS selection, and time, cost, and reliability were the factors most considered in the solutions. Optimization models were the most commonly proposed technique used in the solutions. Conclusion: The topic of choosing component origins is a green field for research, in great need of empirical comparisons between the component origins, as well as of ways to decide between different combinations of them.

    The full text will be freely available from 2019-11-01 12:16
  • 104.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wnuk, Krzysztof
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Franke, Ulrik
    Swedish Institute of Computer Science, SWE.
    Šmite, Darja
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Cicchetti, Antonio
    Mälardalens högskola, SWE.
    A decision-making process-line for selection of software asset origins and components. 2018. In: Journal of Systems and Software, ISSN 0164-1212, E-ISSN 1873-1228, Vol. 135, p. 88-104. Article in journal (Refereed)
    Abstract [en]

    Selecting sourcing options for software assets and components is an important process that helps companies to gain and keep their competitive advantage. The sourcing options include in-house, COTS, open source, and outsourcing. The objective of this paper is to further refine, extend, and validate a solution presented in our previous work. The refinement includes a set of decision-making activities, described in the form of a process-line that decision-makers can use to build their specific decision-making process. We conducted five case studies in three companies to validate the coverage of the set of decision-making activities. The solution from our previous work was validated in two cases in the first two companies. In this validation, no activity in the proposed set was perceived to be missing, although not all activities were conducted, and the activities that were conducted were not executed in a specific order. Therefore, refining the solution into a process-line approach increases flexibility and better captures the differences in the decision-making processes observed in the case studies. The applicability of the process-line was then validated in three case studies in a third company. © 2017 Elsevier Inc.

    The full text will be freely available from 2020-01-01 12:19
  • 105.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Bayesian Synthesis for Knowledge Translation in Software Engineering: Method and Illustration. 2016. In: 2016 42nd Euromicro Conference on Software Engineering and Advanced Applications (SEAA), IEEE, 2016. Conference paper (Refereed)
    Abstract [en]

    Systematic literature reviews in software engineering are necessary to synthesize evidence from multiple studies to provide knowledge and decision support. However, synthesis methods are underutilized in software engineering research. Moreover, translation of synthesized data (outcomes of a systematic review) into recommendations for practitioners is seldom practiced. The objective of this paper is to introduce the use of Bayesian synthesis in software engineering research, in particular to translate research evidence into practice by providing the possibility to combine contextualized expert opinions with research evidence. We adopted the Bayesian synthesis method from health research and customized it for use in software engineering research. The proposed method is described and illustrated using an example from the literature. Bayesian synthesis provides a systematic approach to incorporating subjective opinions in the synthesis process, thereby making the synthesis results more suitable to the context in which they will be applied and facilitating the interpretation and translation of knowledge into action. None of the synthesis methods used in software engineering allows for the integration of subjective opinions; hence, Bayesian synthesis can add a new dimension to the synthesis process in software engineering research.
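    To make the synthesis step concrete, here is a minimal sketch of one Bayesian update: an expert-opinion prior is combined with evidence from primary studies by precision weighting. The normal-normal conjugate model and all numbers are illustrative assumptions, not the method as published in the paper.

```python
# Minimal sketch of a Bayesian synthesis step: combine an expert-opinion
# prior with evidence from primary studies. Normal-normal conjugate model;
# all numbers are invented for illustration.

def bayesian_update(prior_mean, prior_var, evidence_mean, evidence_var):
    """Posterior of a normal mean with known variances (precision weighting)."""
    prior_prec = 1.0 / prior_var
    evid_prec = 1.0 / evidence_var
    post_var = 1.0 / (prior_prec + evid_prec)
    post_mean = post_var * (prior_prec * prior_mean + evid_prec * evidence_mean)
    return post_mean, post_var

# Expert panel's contextualized opinion: effect around 0.3, fairly uncertain.
mean, var = 0.3, 0.2
# Evidence extracted from three primary studies (mean, variance).
for study_mean, study_var in [(0.55, 0.10), (0.45, 0.15), (0.50, 0.12)]:
    mean, var = bayesian_update(mean, var, study_mean, study_var)
print(f"synthesized estimate: {mean:.3f} (variance {var:.3f})")
```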

  • 106.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Gorschek, Tony
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Guidelines for Knowledge Translation in Software Engineering. In: Article in journal (Refereed)
  • 107.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Petersen, Kai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Experiences from Using Snowballing and Database Searches in Systematic Literature Studies. 2015. Conference paper (Refereed)
    Abstract [en]

    Background: Systematic literature studies are commonly used in software engineering. There are two main ways of conducting the searches for these types of studies: snowballing and database searches. In snowballing, the reference lists (backward snowballing - BSB) and citations (forward snowballing - FSB) of relevant papers are reviewed to identify new papers, whereas in a database search, different databases are searched using predefined search strings to identify new papers. Objective: Snowballing has not been used as extensively as database search. Hence, it is important to evaluate its efficiency and reliability when used as a search strategy in literature studies, and to compare it to database searches. Method: In this paper, we applied snowballing in a literature study and reflected on the outcome. We also compared database search with backward and forward snowballing. Database search and snowballing were conducted independently by different researchers. The searches of our literature study were compared with respect to the efficiency and reliability of the findings. Results: Out of the total number of papers found, snowballing identified 83% of the papers, in comparison to 46% for the database search. Snowballing failed to identify a few relevant papers, which potentially could have been addressed by identifying a more comprehensive start set. Conclusion: The efficiency of snowballing is comparable to database search. It can potentially be more reliable than a database search; however, the reliability is highly dependent on the creation of a suitable start set.
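    As a concrete illustration of the procedure summarized above, here is a minimal sketch of iterated backward/forward snowballing. The fetch_references, fetch_citations, and is_relevant callables are hypothetical placeholders for a bibliographic database API and a human relevance screen.

```python
# Minimal sketch of snowballing: review references (BSB) and citations (FSB)
# of the included papers, add newly found relevant papers, repeat until no
# new papers are found.

def snowball_iteration(included, fetch_references, fetch_citations, is_relevant):
    """Return newly identified relevant papers from one BSB + FSB pass."""
    candidates = set()
    for paper in included:
        candidates.update(fetch_references(paper))  # backward snowballing (BSB)
        candidates.update(fetch_citations(paper))   # forward snowballing (FSB)
    return {p for p in candidates if p not in included and is_relevant(p)}

def snowball(start_set, fetch_references, fetch_citations, is_relevant):
    """Iterate from a start set until the included set is stable."""
    included = set(start_set)
    while True:
        new = snowball_iteration(included, fetch_references,
                                 fetch_citations, is_relevant)
        if not new:
            return included
        included |= new
```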

  • 108.
    Bai, Guohua
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    An Organic View of Prototyping in Information System Development. 2014. In: 2014 IEEE 17th International Conference on Computational Science and Engineering (CSE) / [ed] Liu, X; ElBaz, D; Hsu, CH; Kang, K; Chen, W, ChengDu: IEEE, 2014, Vol. Article number 07023844, p. 1814-1818. Conference paper (Refereed)
    Abstract [en]

    This paper presents an organic view of prototyping for managing the dynamic factors involved in the evolutionary design of information systems (IS). These dynamic factors can be caused by, for example, continuing suggestions from users, changes in the technologies, and stepwise progress related to mutual learning between users and designers. Expanding evolutionary prototyping to 'start small and grow', the organic view of prototyping proposes two prerequisites for doing so, namely 1) a sustainable and adaptive 'embryo' - an organic structure of the future system, and 2) embedded learning and feedback management through which the actors of the system (users, designers, decision makers, administrators) can communicate with each other. An example of eHealth system design demonstrates how the prerequisites can be implemented.

  • 109.
    Bakhtyar, Shoaib
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Designing Electronic Waybill Solutions for Road Freight Transport. 2016. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    In freight transportation, a waybill is an important document that contains essential information about a consignment. The focus of this thesis is on a multi-purpose electronic waybill (e-Waybill) service, which can provide the functions of a paper waybill and is capable of storing, at least, the information present in a paper waybill. In addition, the service can be used to support other existing Intelligent Transportation System (ITS) services by utilizing synergies with them. Additionally, information entities from the e-Waybill service are investigated for the purpose of knowledge-building concerning freight flows.

    A systematic review of the state of the art of the e-Waybill service reveals several limitations, such as a limited focus on supporting ITS services. Five different conceptual e-Waybill solutions (which can be seen as abstract system designs for implementing the e-Waybill service) are proposed. The solutions are investigated for functional and technical (non-functional) requirements, which can potentially impose constraints on a system implementing the e-Waybill service. Further, the service is investigated for information and functional synergies with other ITS services. For the information synergy analysis, the required input information entities for different ITS services are identified; if at least one information entity can be provided by an e-Waybill at the right location, we regard it as a synergy. Additionally, a service design method has been proposed for supporting the process of designing new ITS services, which primarily utilizes functional synergies between the e-Waybill and different existing ITS services. The suggested method is applied for designing a new ITS service, i.e., the Liability Intelligent Transport System (LITS) service. The purpose of the LITS service is to support the process of identifying when and where a consignment has been damaged and who was responsible when the damage occurred. Furthermore, information entities from e-Waybills are utilized for building improved knowledge concerning freight flows. A freight and route estimation method has been proposed for building improved knowledge, e.g., in national road administrations, on the movement of trucks and freight.

    The results from this thesis can be used to support the choice of a practical e-Waybill service implementation with high synergy with ITS services. This may lead to a higher utilization of ITS services and more sustainable transport, e.g., in terms of reduced congestion and emissions. Furthermore, the implemented e-Waybill service can be an enabler for collecting consignment and traffic data and converting the data into useful traffic information. In particular, the service can lead to increasing amounts of digitally stored data about consignments, which can lead to improved knowledge on the movement of freight and trucks. This knowledge may be helpful when making decisions concerning road taxes, fees, and infrastructure investments.

  • 110.
    Bakhtyar, Shoaib
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ghazi, Ahmad Nauman
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    On Improving Research Methodology Course at Blekinge Institute of Technology. 2016. Conference paper (Refereed)
    Abstract [en]

    The Research Methodology in Software Engineering and Computer Science (RM) course is a compulsory course that graduate students at Blekinge Institute of Technology (BTH) must take prior to undertaking their thesis work. The course focuses on teaching research methods and techniques for data collection and analysis in the fields of Computer Science and Software Engineering. It is intended that the course should help students practically apply appropriate research methods in different courses (in addition to the RM course), including their Master's theses. However, it is believed that the course has deficiencies that negatively affect the course implementation (learning and assessment activities) as well as the performance of its different participants (students, teachers, and evaluators). In this article, our aim is to investigate potential deficiencies in the RM course at BTH in order to provide concrete evidence on the deficiencies faced by students, evaluators, and teachers in the course. Additionally, we suggest recommendations for resolving the identified deficiencies. Our findings, gathered through semi-structured interviews with students, teachers, and evaluators in the course, are presented in this article. By identifying a total of twenty-one deficiencies from different perspectives, we found that critical deficiencies exist at different levels within the course. Furthermore, in order to overcome the identified deficiencies, we suggest seven recommendations that may be implemented at different levels within the course and the study program. Our suggested recommendations, if implemented, will help resolve deficiencies in the course, which may lead to improved teaching and learning in the RM course at BTH.

  • 111.
    Bakhtyar, Shoaib
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Henesey, Lawrence
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Electronic Waybill Solutions: A Systematic Review. In: Journal of Special Topics in Information Technology and Management, ISSN 1385-951X, E-ISSN 1573-7667. Article in journal (Other academic)
    Abstract [en]

    A critical component in freight transportation is the waybill, a transport document that contains essential information about a consignment. Actors within the supply chain handle not only the freight but also vast amounts of information, which are often unclear due to various errors. An electronic waybill (e-Waybill) solution replaces the paper waybill and improves on it, e.g., by ensuring error-free storage and flow of information. In this paper, a systematic review using the snowball method is conducted to investigate the state of the art of e-Waybill solutions. After performing three iterations of the snowball process, we identified eleven studies for further evaluation and analysis due to their strong relevance. The studies are mapped in relation to each other, and a classification of the e-Waybill solutions is constructed. Most of the studies identified in our review support the benefits of electronic documents, including e-Waybills. Typically, the research papers reviewed support EDI (Electronic Data Interchange) for implementing e-Waybills. However, limitations exist due to high costs, which make EDI less affordable for small organizations. Recent studies point to alternative technologies, which we list in this paper. Additionally, we find that most studies focus on the administrative benefits, while few investigate the potential of e-Waybill information for enabling services such as estimated time of arrival and real-time tracking and tracing.

  • 112.
    Bakhtyar, Shoaib
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Henesey, Lawrence
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Freight transport prediction using electronic waybills and machine learning. 2014. In: 2014 International Conference on Informative and Cybernetics for Computational Social Systems, IEEE Computer Society, 2014, p. 128-133. Conference paper (Refereed)
    Abstract [en]

    A waybill is a document that accompanies the freight during transportation. The document contains essential information such as the origin and destination of the freight, the involved actors, and the type of freight being transported. We believe that the information from a waybill, when presented in an electronic format, can be utilized for building knowledge about freight movement. The knowledge may be helpful for decision makers, e.g., freight transport companies and public authorities. In this paper, the results from a study of a Swedish transport company are presented, using order data from a customer ordering database, which is, to a large extent, similar to the information present in paper waybills. We have used the order data for predicting the type of freight moving between a particular origin and destination. Additionally, we have evaluated a number of different machine learning algorithms based on their prediction performance. The evaluation was based on their weighted average true-positive and false-positive rates, weighted average area under the curve, and weighted average recall values. We conclude from the results that the data from a waybill, when available in an electronic format, can be used to improve knowledge about freight transport. Additionally, we conclude that among the algorithms IBk, SMO, and LMT, IBk performed best, predicting the highest number of classes with higher weighted average true-positive, false-positive, and recall values.
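    The following is a hedged sketch of the kind of evaluation described above. The study used Weka's IBk, SMO, and LMT; here scikit-learn analogues (k-NN, SVM, logistic regression) stand in as assumptions, and the freight-order data is simulated rather than taken from the ordering database.

```python
# Sketch: compare classifiers on weighted-average recall and AUC, in the
# spirit of the evaluation above. Data and model choices are stand-ins.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))        # stand-in for encoded order features
y = rng.integers(0, 3, size=500)     # stand-in for freight-type classes

models = {
    "kNN (IBk analogue)": KNeighborsClassifier(n_neighbors=5),
    "SVM (SMO analogue)": SVC(probability=True),
    "LogReg (LMT analogue)": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    proba = cross_val_predict(model, X, y, cv=5, method="predict_proba")
    pred = proba.argmax(axis=1)
    recall = recall_score(y, pred, average="weighted")  # weighted avg recall
    auc = roc_auc_score(y, proba, multi_class="ovr", average="weighted")
    print(f"{name}: weighted recall={recall:.2f}, weighted AUC={auc:.2f}")
```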

  • 113.
    Bakhtyar, Shoaib
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Holmgren, Johan
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    A Data Mining Based Method for Route and Freight Estimation. 2015. In: Procedia Computer Science, Elsevier, 2015, Vol. 52, p. 396-403. Conference paper (Refereed)
    Abstract [en]

    We present a method which makes use of historical vehicle data and current vehicle observations in order to estimate 1) the route a vehicle has used and 2) the freight the vehicle carried along the estimated route. The method includes a learning phase and an estimation phase. In the learning phase, historical data about the movement of a vehicle and about the consignments allocated to the vehicle are used to build estimation models: one for route choice and one for freight allocation. In the estimation phase, the generated estimation models are used, together with a sequence of observed positions for the vehicle, to generate route and freight estimates. We have partly evaluated our method in an experimental study involving a medium-size Swedish transport operator. The results of the study indicate that supervised learning, in particular the algorithm Naive Bayes Multinomial Updatable, shows good route estimation performance even when a significant amount of information about where the vehicle has traveled is missing. For the freight estimation, we used a method based on averaging the consignments on the historically known trips for the estimated route. We argue that the proposed method might contribute to building improved knowledge, e.g., in national road administrations, on the movement of trucks and freight.
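    Below is a minimal sketch of the two-phase method summarized above: learn a route classifier from historical trips, then estimate the route of a partially observed trip and average the freight of known trips on that route. Weka's NaiveBayesMultinomialUpdatable is approximated with scikit-learn's MultinomialNB, and the tiny segment/route/freight data is invented.

```python
# Learning phase: fit a multinomial NB model on historical trips encoded as
# "bags of road segments". Estimation phase: classify a partial observation.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Historical trips: observed road segments and the route that was used.
trips = ["s1 s2 s3 s7", "s1 s2 s4 s7", "s5 s6 s7 s8", "s5 s6 s9 s8"]
routes = ["R1", "R1", "R2", "R2"]

vec = CountVectorizer()
model = MultinomialNB().fit(vec.fit_transform(trips), routes)

# Estimation phase: only two position observations are available.
partial = vec.transform(["s1 s4"])
route = model.predict(partial)[0]
print("estimated route:", route)

# Freight estimate: average the consignments of known trips on that route
# (hypothetical weights, in tonnes).
known_freight = {"R1": [12.0, 14.5], "R2": [7.0, 8.5]}
print("estimated freight:", sum(known_freight[route]) / len(known_freight[route]))
```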

  • 114.
    Bakhtyar, Shoaib
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Holmgren, Johan
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Persson, Jan A.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Technical Requirements of the e-Waybill Service. 2016. In: International Journal of Computer and Communication Engineering, ISSN 2010-3743, Vol. 5, no 2, p. 130-140. Article in journal (Refereed)
    Abstract [en]

    An electronic waybill (e-Waybill) is a service whose purpose is to replace the paper waybill, a paper document that traditionally follows a consignment during transport. An important purpose of the e-Waybill is to achieve a paperless flow of information during freight transport. In this paper, we investigate five e-Waybill solutions, that is, system design specifications for the e-Waybill, regarding their non-functional (technical) requirements. In addition, we discuss how well existing technologies are able to fulfil the identified requirements. We have identified that information storage, synchronization and conflict management, access control, and communication are important categories of technical requirements of the e-Waybill service. We argue that the identified technical requirements can be used to support the process of designing and implementing the e-Waybill service.

  • 115.
    Bakhtyar, Shoaib
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Mbiydzenyuy, Gideon
    Netport Science Park, Karlshamn.
    Henesey, Lawrence
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    A Simulation Study of the Electronic Waybill Service. 2015. In: Proceedings - EMS 2015: UKSim-AMSS 9th IEEE European Modelling Symposium on Computer Modelling and Simulation / [ed] David Al-Dabas, Gregorio Romero, Alessandra Orsoni, Athanasios Pantelous, IEEE Computer Society, 2015, p. 307-312. Conference paper (Refereed)
    Abstract [en]

    We present results from a simulation study, which was designed for investigating the potential positive impacts, i.e., the invoicing and processing time, and financial savings, when using an electronic waybill instead of paper waybills for road-based freight transportation. The simulation model is implemented in an experiment for three different scenarios, where the processing time for waybills at the freight loading and unloading locations in a particular scenario differs from other scenarios. The results indicate that a saving of 65%–99% in the invoicing time can be achieved when using an electronic waybill instead of paper waybills. Our study can be helpful to decision makers, e.g., managers and staff dealing with paper waybills, to estimate the potential benefits when making decisions concerning the implementation of an electronic waybill solution for replacing paper waybills.

  • 116.
    Bala, Jaswanth
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Filtering estimated series of residential burglaries using spatio-temporal route calculations. 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context. According to the Swedish National Council for Crime Prevention, there has been an increase of 19% in residential burglary crimes in Sweden over the last decade, and only 5% of the total crimes reported were actually solved by the law enforcement agencies. In order to solve these cases quickly and efficiently, the law enforcement agencies have to look into possible linked serial crimes. Many studies have suggested linking crimes based on Modus Operandi and other characteristics. Sometimes crimes that cannot feasibly be reached by travel within the reported times, but that have a similar Modus Operandi, are also grouped as linked crimes. Investigating such crimes could waste the resources of the law enforcement agencies.

    Objectives. In this study, we investigate the possibility of using travel distance and travel duration between different crime locations when linking residential burglary crimes. A filtering method has been designed and implemented for filtering unlinked crimes out of the estimated linked crimes by utilizing the distance and duration values.

    Methods. The objectives in this study are satisfied by conducting an experiment. The travel distance and travel duration values are obtained from various online direction services. The filtering method was first validated on ground truth represented by known linked crime series and then it was used to filter out crimes from the estimated linked crimes.

    Results. The filtering method removed a total of 4% unlinked crimes from the estimated linked crime series when the travel mode was driving, whereas it removed a total of 23% unlinked crimes when the travel mode was walking. It was also found that a burglar takes, on average, about 900 seconds (15 minutes) to commit a burglary.

    Conclusions. From this study it is evident that the use of spatial and temporal values in linking residential burglaries gives effective crime links in a series. Also, the use of Google Maps for obtaining distance and duration values can increase the overall performance of the filtering method in linking crimes.
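    As an illustration of the filtering idea, here is a minimal sketch of a spatio-temporal feasibility check: a pair of linked crimes is kept only if the offender could have traveled between the scenes within the time between the reports. The travel_time callable (backed by a directions service such as Google Maps) and the 900-second dwell time are assumptions based on the abstract, not the thesis' exact method.

```python
# Keep a crime pair only if travel between the scenes plus the time spent
# committing the first burglary fits inside the gap between report times.
from datetime import timedelta

DWELL = timedelta(seconds=900)  # avg. time spent committing a burglary

def feasible(crime_a, crime_b, travel_time):
    """crime_* = (location, report_datetime); travel_time(a, b) returns a
    timedelta, e.g. from a driving- or walking-directions service."""
    first, second = sorted([crime_a, crime_b], key=lambda c: c[1])
    gap = second[1] - first[1]
    return travel_time(first[0], second[0]) + DWELL <= gap

def filter_series(pairs, travel_time):
    """Remove pairs from an estimated linked series that fail the check."""
    return [p for p in pairs if feasible(p[0], p[1], travel_time)]
```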

  • 117.
    Ballesteros, Luis Guillermo Martinez
    et al.
    KTH Royal Inst Technol, Radio Syst Lab RSLab, Stockholm, Sweden..
    Ickin, Selim
    Ericsson Res, Stockholm, Sweden..
    Fiedler, Markus
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    Markendahl, Jan
    KTH Royal Inst Technol, Radio Syst Lab RSLab, Stockholm, Sweden..
    Tollmar, Konrad
    KTH Royal Inst Technol, Radio Syst Lab RSLab, Stockholm, Sweden..
    Wac, Katarzyna
    Univ Copenhagen, DK-1168 Copenhagen, Denmark..
    Energy Saving Approaches for Video Streaming on Smartphone based on QoE Modeling. 2016. In: 2016 13TH IEEE ANNUAL CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE (CCNC), IEEE Communications Society, 2016. Conference paper (Refereed)
    Abstract [en]

    In this paper, we study the influence of video stalling on QoE. We provide QoE models that are obtained in realistic scenarios on the smartphone, and we propose energy-saving approaches for the smartphone by leveraging the proposed QoE models in relation to energy. Results show that approximately 5 J is saved in a 3-minute video clip, at an acceptable Mean Opinion Score (MOS) level, when video frames are skipped. If the video frames are not skipped, then it is suggested to avoid freezes during a video stream, as freezes greatly increase the energy waste on smartphones.

  • 118.
    Barry, Cecilia
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    ROOTS: What could emerge out of thinking and acting networked roots as design? 2017. Independent thesis Basic level (degree of Bachelor), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    This bachelor's thesis uses ROOTS as a method designed to engage in both thinking and acting inside networks, by creating a hydroponic gardening network. As a designer one engages in many different fields of design. The most complicated design is designing networks with function, interlaced and embedded in everyday life. This is known as accountability: to be accountable for one's decisions and to act on many perspectives when designing. Accountability is designing from somewhere, and being aware of where that somewhere stems from. ROOTS visualizes accountability in a network, as accountability entails thinking and acting inside a network, and by doing so one actively engages in thinking about futures and design as a whole. When asking oneself what could emerge out of thinking and acting networked ROOTS as design, one begins to speculate in matters of vast networked complexity. From observation using methods such as ANT, the technological extension T-ANT, and a study in messiness, information is created; from information, valuing becomes present; from valuing, knowledge grows; from knowledge comes accountability, and the network creates another cycle of ROOTS.

    Keywords: Design, Network, Accountability, Complexity

  • 119.
    Barysau, Mikalai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Developers' performance analysis based on code review data: How to perform comparisons of different groups of developers. 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Nowadays more and more IT companies switch to the distributed development model. This trend has a number of advantages and disadvantages, which researchers study through different aspects of modern code development. One such aspect is code review, which is used by many companies and produces a large amount of data. A number of studies describe different data mining and data analysis approaches based on a link between code review data and performance. According to these studies, analysis of code review data can give good insight into development performance and help software companies detect a number of performance issues and improve the quality of their code.

    The main goal of this thesis was to collect reported knowledge about code review data analysis and to implement a solution that helps perform such analysis in a real industrial setting.

    During the research, the author used multiple research techniques, such as a snowballing literature review, a case study, and semi-structured interviews.

    The results of the research contain a list of code review data metrics extracted from the literature, and a software tool for collecting and visualizing the data.

    The literature review showed that, among the literature sources related to code review, relatively few are related to the topic of this thesis, which exposes a field for future research. Application of the found metrics showed that most of them can be used in the context of the studied environment. Presentation of the results and interviews with company representatives showed that the graphic plots are useful for observing trends and correlations in the development of the company's development sites and help the company improve its performance and decision-making process.

  • 120.
    Baskaravel, Yogaraj
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Implementation and evaluation of global router for Information-Centric Networking. 2014. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Context. A huge majority of current Internet traffic is information dissemination. Information-Centric Networking (ICN) is a future networking paradigm that focuses on global-level information dissemination. In ICN, communication is defined in terms of requesting and providing Named Data Objects (NDOs). NetInf is a future networking architecture based on Information-Centric Networking principles. Objectives. In this thesis, a global routing solution for ICN has been implemented. The authority part of an NDO's name is mapped to a set of routing hints, each with a priority value. Multiple NDOs can share the same authority part, and thus the first level of aggregation is provided. The routing hints are used to forward a request for an NDO towards a suitable copy of the NDO. The second level of aggregation is achieved by aggregating high-priority routing hints on low-priority routing hints. The performance and scalability of the routing implementation are evaluated with respect to global ICN requirements. Furthermore, some of the notable challenges in implementing global ICN routing are identified. Methods. The NetInf global routing solution is implemented by extending NEC's NetInf Router Platform (NNRP). A NetInf testbed is built over the Internet using the extended NNRP implementation. Performance measurements have been taken from the NetInf testbed and are discussed in detail in terms of routing scalability. Results. The performance measurements show that hop-by-hop transport has a significant impact on the overall request forwarding. A notable amount of time is taken for extracting and inserting binary objects such as routing hints at each router. Conclusions. A more suitable hop-by-hop transport mechanism can be evaluated and used with respect to global ICN requirements. The NetInf message structure can be redefined so that binary objects such as routing hints can be transmitted more efficiently. Apart from that, the performance of the global routing implementation appears to be reasonable. As the NetInf global routing solution provides two levels of aggregation, it can be scalable as well.
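    To make the name-to-hint mapping concrete, here is a minimal sketch of priority-based routing-hint lookup with authority-level aggregation. The data structures, names, and priorities are illustrative assumptions, not the NNRP implementation.

```python
# Authority part of an NDO name -> prioritized routing hints. Many NDOs share
# one authority (first-level aggregation); a catch-all low-priority hint
# aggregates many authorities (second-level aggregation).
ROUTING_HINTS = {
    "example.org": [(10, "routerA"), (5, "routerB")],
    "*": [(1, "border-router")],  # hypothetical default aggregate
}

def forward(ndo_name: str) -> str:
    """Pick the next hop for a request like 'ni://example.org/obj123'."""
    authority = ndo_name.removeprefix("ni://").split("/", 1)[0]
    hints = ROUTING_HINTS.get(authority, ROUTING_HINTS["*"])
    return max(hints)[1]  # highest-priority hint wins

print(forward("ni://example.org/obj123"))  # -> routerA
print(forward("ni://unknown.net/objX"))    # -> border-router
```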

  • 121.
    Beer, Armin
    et al.
    BVA and Beer Test Consulting, AUT.
    Felderer, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Measuring and improving testability of system requirements in an industrial context by applying the goal question metric approach. 2018. In: Proceedings - International Conference on Software Engineering, IEEE Computer Society, 2018, p. 25-32. Conference paper (Refereed)
    Abstract [en]

    Testing is subject to two basic constraints, namely cost and quality. The cost depends on the efficiency of the testing activities as well as on their quality and testability. The authors' practical experience in large-scale systems shows that if the requirements are adapted iteratively or the architecture is altered, testability decreases. However, what is often lacking is a root cause analysis of the testability degradations and the introduction of improvement measures during software development. In order to introduce agile practices into the rigid strategy of the V-model, good testability of software artifacts is vital; testability is thus also the bridgehead towards agility. In this paper, we report on a case study in which we measure and improve testability on the basis of the Goal Question Metric approach. © 2018 ACM.

  • 122.
    Begnert, Joel
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tilljander, Rasmus
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Combining Regional Time Stepping With Two-Scale PCISPH Method. 2015. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context. In computer graphics, realistic-looking fluid is often desired. Simulating realistic fluids is a time-consuming and computationally expensive task; therefore, much research has been devoted to reducing the simulation time while maintaining the realism. Two of the more recent optimization algorithms within particle-based simulations are two-scale simulation and regional time stepping (RTS). Both of them are based on the predictive-corrective incompressible smoothed particle hydrodynamics (PCISPH) algorithm.

    Objectives. These algorithms improve on two separate aspects of PCISPH, two-scale simulation reduces the number of particles and RTS focuses computational power on regions of the fluid where it is most needed. In this paper we have developed and investigated the performance of an algorithm combining them, utilizing both optimizations.

    Methods. We implemented both of the base algorithms, as well as PCISPH, before combining them. Therefore we had equal conditions for all algorithms when we performed our experiments, which consisted of measuring the time it took to run each algorithm in three different scene configurations.

    Results. Results showed that our combined algorithm on average was faster than the other three algorithms. However, our implementation of two-scale simulation gave results inconsistent with the original paper, showing a slower time than even PCISPH. This invalidates the results for our combined algorithm since it utilizes the same implementation.

    Conclusions. We see that our combined algorithm has potential to speed up fluid simulations, but since the two-scale implementation was incorrect, our results are inconclusive.

  • 123.
    Bengtsson, Daniel
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Melin, Johan
    Constrained procedural floor plan generation for game environments. 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Background: Procedural content generation (PCG) has become an important subject as the demand for content in modern games has increased. Paradox Arctic is a game development studio that aims to be at the forefront of technological solutions and is therefore interested in furthering its knowledge of PCG. To this end, Paradox Arctic has expressed interest in a collaborative effort to further explore the subject of procedural floor plan generation.

    Objective: The main goal of this work is to test whether a solution based on growth, subdivision, or a combination thereof can be used to procedurally generate believable and varied floor plans for game environments, while also conforming to predefined constraints.

    Method: A solution capable of generating floor plans with the use of growth, subdivision and a combination of both has been implemented and a survey testing the believability and variation of the generated layouts has been conducted.

    Results & Conclusions: While the results of the subdivision and combined solutions show that more work is necessary before the generated content can be considered believable, the growth-based solution presents promising results in terms of believability when generating small to medium-sized layouts. This believability does, however, come at the cost of variation.
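    As a toy illustration of the subdivision family of techniques discussed above, the sketch below recursively splits a rectangle into rooms with binary space partitioning. It is a bare-bones stand-in under stated assumptions; the thesis' actual constrained generation (growth, constraint handling, door placement) is not reproduced.

```python
# Recursive binary space partition of a w x h rectangle into room rectangles.
import random

def subdivide(x, y, w, h, min_size=3, rooms=None):
    """Split until both dimensions are too small to split; collect leaves."""
    if rooms is None:
        rooms = []
    if w <= 2 * min_size and h <= 2 * min_size:
        rooms.append((x, y, w, h))                 # leaf: a finished room
        return rooms
    if w >= h:                                     # split along the wider axis
        cut = random.randint(min_size, w - min_size)
        subdivide(x, y, cut, h, min_size, rooms)
        subdivide(x + cut, y, w - cut, h, min_size, rooms)
    else:
        cut = random.randint(min_size, h - min_size)
        subdivide(x, y, w, cut, min_size, rooms)
        subdivide(x, y + cut, w, h - cut, min_size, rooms)
    return rooms

print(subdivide(0, 0, 20, 12))  # list of (x, y, w, h) rooms
```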

  • 124.
    Bengtsson, Filip
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Undin, Philip
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Moderna progression system, en kooperativ regress. 2015. Independent thesis Basic level (degree of Bachelor), 20 credits / 30 HE credits. Student thesis
    Abstract [sv]

    Digital games are one of today's largest digital media, with new games released daily. One of the largest game genres is the multiplayer First-Person Shooter (FPS). Within multiplayer FPS games, where cooperative teamwork plays a large role, a trend has emerged in recent years whereby most of these games contain progression systems that can be directly harmful to the game's cooperative experience. By examining the question "How can we, with the help of progression systems, make modern FPS games more cooperative?" we have tried to establish how game developers can instead increase the cooperative experience in their games. By examining and discussing areas such as systems, human motivation, and cooperative game theory, we have identified how rewards influence a player's behavior. Building on this prior research, we thoroughly analyzed and broke down two of the market's largest multiplayer FPS titles, and we concluded that a well-considered progression system can contribute increased focus on the cooperative aspect of these games.

  • 125.
    Berg, Wilhelm
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Terränggenerering och dess påverkan på spelupplevelse. 2015. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [sv]

    Context. In game design, terrain is often an important aspect, especially where players actively interact with it. Its shape and design can affect, both positively and negatively, how the player perceives the game.

    Objectives. This thesis describes work on terrain generation and investigates whether terrain can influence the player in an interactive medium, in order to gain a better understanding of the subject. Does terrain affect how players perceive in-game situations and how they play? Can it determine whether they perceive the experience as negative or positive? What affects a player the most, and how?

    Methods. The conclusions and working methods are described together with data collected from a practical test. The design of the game used for testing is also described. In the test, participants play a game that uses an algorithm to generate terrain. After the test, players answer questions about it.

    Results. The testing yields answers that are used to reach certain conclusions.

    Conclusions. From the test results we conclude that terrain can indeed affect the player experience, and that its effect is greatest when it actively influences how the player interacts with the game.

  • 126.
    Bergman, Martin
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Jönsson, André
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Physically based rendering: Ur en 3D-grafikers perspektiv. 2014. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    This bachelor thesis studies how physically based rendering can affect a 3D artist's work. The aim is to create an understanding of physically based rendering and how this technique might affect a 3D artist's work. To examine the problem area, we created a virtual environment in 3D using physically based rendering. The new workflow was then compared with the former workflow. The study describes the former workflow and how the work has changed with physically based rendering. The thesis also covers the pros, cons, and implications of working with physically based rendering.

  • 127.
    Bergsten, John
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Öhman, Konrad
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Player Analysis in Computer Games Using Artificial Neural Networks. 2017. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Star Vault AB is a video game development company that has developed the video game Mortal Online. The company believes that players new to the game repeatedly find themselves lost in it. The objective of this study is to evaluate whether an Artificial Neural Network can be used to detect when a player is lost in Mortal Online. This is done using the free open-source Fast Artificial Neural Network Library. People were invited to a data collection event where they played a tweaked version of the game to facilitate data collection. Players specified whether they were lost or not, and the collected data was flagged accordingly. The collected data was then prepared with different parameters to be used when training multiple Artificial Neural Networks. When creating an Artificial Neural Network, several parameters have an impact on its performance, where performance is defined as the balance of high prediction accuracy against a low false-positive rate. These parameters vary depending on the purpose of the Artificial Neural Network. A quantitative approach is followed in which the parameters are varied to investigate which values result in the Artificial Neural Network that best identifies when a player is lost. The parameters are grouped into stages, where all combinations of parameter values within each stage are evaluated to reduce the number of Artificial Neural Networks that have to be trained, with the best-performing parameters of each stage being used in subsequent stages. The result is a set of parameter values considered as ideal as possible. These parameter values are then altered one at a time to verify that they are ideal. The results show that a set of parameters exists that can optimize the Artificial Neural Network model to identify when a player is lost, however not with the high performance that was hoped for. It is theorized that the ambiguity of the word "lost" and the complexity of the game are critical to the low performance.
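    The staged parameter search described above can be sketched as follows: evaluate all combinations within a stage, fix the winners, and move on. The thesis used the FANN library; this sketch substitutes scikit-learn's MLPClassifier, and the particular stages, parameter values, and simulated data are assumptions for illustration.

```python
# Staged hyperparameter search: winners of each stage are carried forward,
# avoiding a full grid over all parameters at once.
from itertools import product
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))       # stand-in for gameplay features
y = rng.integers(0, 2, size=300)     # stand-in for lost / not-lost labels

stages = [
    {"hidden_layer_sizes": [(8,), (16,), (32,)]},   # stage 1: topology
    {"learning_rate_init": [0.001, 0.01, 0.1]},     # stage 2: learning rate
    {"activation": ["relu", "tanh", "logistic"]},   # stage 3: activation
]
best = {}
for stage in stages:
    keys, values = zip(*stage.items())
    scored = []
    for combo in product(*values):
        params = {**best, **dict(zip(keys, combo))}
        clf = MLPClassifier(max_iter=500, random_state=0, **params)
        scored.append((cross_val_score(clf, X, y, cv=3).mean(), params))
    best = max(scored, key=lambda t: t[0])[1]  # carry winners forward
print("selected parameters:", best)
```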

  • 128.
    Berntsson, Fredrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Schengensamarbetet – Europas dröm. 2014. Student thesis
    Abstract [sv]

    This essay clarifies what the Schengen cooperation is, why it exists, and how it works. It goes through all parts of the cooperation, which for the most part consists of abolishing checks on persons at the borders between the member states.

  • 129.
    Berntsson Svensson, Richard
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Regnell, Björn
    Lunds universitet, SWE.
    Is role playing in Requirements Engineering Education increasing learning outcome? 2017. In: Requirements Engineering, ISSN 0947-3602, E-ISSN 1432-010X, Vol. 22, no 4, p. 475-489. Article in journal (Refereed)
    Abstract [en]

    Requirements Engineering has attracted a great deal of attention from researchers and practitioners in recent years. This increasing interest requires academia to provide students with a solid foundation in the subject matter. In Requirements Engineering Education (REE), it is important to cover three fundamental topics: traditional analysis and modeling skills, interviewing skills for requirements elicitation, and writing skills for specifying requirements. REE papers report on using role playing as a pedagogical tool; however, there is a surprising lack of empirical evidence on its utility. In this paper we investigate whether a higher grade in a role-playing project has an effect on students' scores in an individual written exam in a Requirements Engineering course. Data were collected from 412 students between the years 2007 and 2014 at Lund University and Chalmers | University of Gothenburg. The results show that students who received a higher grade in the role-playing project scored statistically significantly higher in the written exam than the students with a lower role-playing project grade. © 2016 Springer-Verlag London

  • 130.
    Berntsson Svensson, Richard
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Taghavianfar, Maryam
    Selecting creativity techniques for creative requirements: An evaluation of four techniques using creativity workshops. 2015. In: 2015 IEEE 23RD INTERNATIONAL REQUIREMENTS ENGINEERING CONFERENCE (RE), IEEE, 2015, p. 66-75. Conference paper (Refereed)
    Abstract [en]

    Requirements engineering is recognized as a creative process in which stakeholders jointly discover new creative ideas for innovative and novel products that are eventually expressed as requirements. This paper evaluates four different creativity techniques, namely Hall of Fame, Constraint Removal, Brainstorming, and Idea Box, using creativity workshops with students and industry practitioners. In total, 34 creativity workshops were conducted with 90 students from two universities and 86 industrial practitioners from six companies. The results from this study indicate that Brainstorming generates by far the most ideas, while Hall of Fame generates the most creative ideas. Idea Box generates the fewest ideas and the fewest creative ideas. Finally, Hall of Fame was the technique that led to the largest number of requirements that were included in future releases of the products.

  • 131.
    Bertoni, Alessandro
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Dasari, Siva Krishna
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Hallstedt, Sophie
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Petter, Andersson
    GKN Aerospace Systems , SWE.
    Model-based decision support for value and sustainability assessment: Applying machine learning in aerospace product development. 2018. In: DS92: Proceedings of the DESIGN 2018 15th International Design Conference / [ed] Marjanović D., Štorga M., Škec S., Bojčetić N., Pavković N., The Design Society, 2018, Vol. 6, p. 2585-2596. Conference paper (Refereed)
    Abstract [en]

    This paper presents a prescriptive approach toward the integration of value and sustainability models in an automated decision support environment enabled by machine learning (ML). The approach allows the concurrent multidimensional analysis of design cases, complementing mechanical simulation results with value and sustainability assessment. ML makes it possible to deal with both qualitative and quantitative data and to create surrogate models for quicker design space exploration. The approach has been developed and preliminarily implemented in collaboration with a major aerospace sub-system manufacturer.
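    To illustrate the surrogate-model idea mentioned above: fit a fast regressor on a limited set of expensive simulation results, then use it to screen a large design space cheaply. The stand-in "simulation", design variables, and model choice below are assumptions for illustration, not the models used in the paper.

```python
# Surrogate modeling for design space exploration: train on a few expensive
# evaluations, then predict thousands of candidates instantly.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expensive_simulation(x):
    """Stand-in for a mechanical simulation of a design (slow in reality)."""
    return x[0] ** 2 + np.sin(3 * x[1]) + 0.5 * x[2]

rng = np.random.default_rng(2)
X_train = rng.uniform(0, 1, size=(40, 3))      # 40 simulated design cases
y_train = np.apply_along_axis(expensive_simulation, 1, X_train)

surrogate = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Quick exploration: score 10,000 candidate designs with the surrogate and
# pick the best (here: minimizing the cost-like output).
candidates = rng.uniform(0, 1, size=(10_000, 3))
scores = surrogate.predict(candidates)
print("best candidate:", candidates[scores.argmin()])
```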

  • 132.
    Betz, Stefanie
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Andreas, Oberweis
    Rolf, Stephan
    Knowledge transfer in offshore outsourcing software development projects: an analysis of the challenges and solutions from German clients2014In: Expert systems (Print), ISSN 0266-4720, E-ISSN 1468-0394, Vol. 31, no 3Article in journal (Refereed)
    Abstract [en]

    Knowledge transfer is a critical factor in ensuring the success of offshore outsourcing software development projects and is, in many cases, neglected. Compared to in-house or co-located projects, however, such globally distributed projects feature far greater complexity. In addition to language barriers, factors such as cultural differences, time zone variance, distinct methods and practices, as well as unique equipment and infrastructure can all lead to problems that negatively impact knowledge transfer, and as a result, a project's overall success. In order to help minimise such risks to knowledge transfer, we conducted a research study based on expert interviews in six projects. Our study used German clients and focused on offshore outsourcing software development projects. We first identified known problems in knowledge transfer that can occur with offshore outsourcing projects. Then we collected best-practice solutions proven to overcome the types of problems described. Afterward, we conducted a follow-up study to evaluate our findings. In this subsequent stage, we presented our findings to a different group of experts in five projects and asked them to evaluate these solutions and recommendations in terms of our original goal, namely to find ways to minimise knowledge-transfer problems in offshore outsourcing software development projects. Thus, the result of our study is a catalog of evaluated solutions and associated recommendations mapped to the identified problem areas.

  • 133. Beyene, Ayne A.
    et al.
    Welemariam, Tewelle
    Persson, Marie
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Lavesson, Niklas
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Improved concept drift handling in surgery prediction and other applications2015In: Knowledge and Information Systems, ISSN 0219-1377, Vol. 44, no 1, p. 177-196Article in journal (Refereed)
    Abstract [en]

    The article presents a new algorithm for handling concept drift: the Trigger-based Ensemble (TBE) is designed to handle concept drift in surgery prediction, but it is shown to perform well for other classification problems as well. In primary care, queries about the need for surgical treatment are referred to a surgeon specialist. In secondary care, referrals are reviewed by a team of specialists. The possible outcomes of this review are that the referral: (i) is canceled, (ii) needs to be complemented, or (iii) is predicted to lead to surgery. In the third case, the referred patient is scheduled for an appointment with a surgeon specialist. This article focuses on the binary prediction of case three (surgery prediction). The guidelines for the referral and the review of the referral change due to, e.g., scientific developments and clinical practices. Existing decision support is based on the expert systems approach, which usually requires manual updates when changes in clinical practice occur. In order to automatically revise decision rules, the occurrence of concept drift (CD) must be detected and handled. Existing CD handling techniques are often specialized; it is challenging to develop a more generic technique that performs well regardless of CD type. Experiments are conducted to measure the impact of CD on prediction performance and to reduce CD impact. The experiments evaluate and compare TBE to three existing CD handling methods (AWE, Active Classifier, and Learn++) on one real-world dataset and one artificial dataset. TBE significantly outperforms the other algorithms on both datasets but is less accurate on noisy synthetic variations of the real-world dataset.
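
    The abstract does not specify TBE's internals; below is a hedged sketch of a *generic* trigger-based ensemble, not the paper's algorithm: a drift monitor watches the recent error rate and, when it passes a threshold, triggers training of a new ensemble member on a recent window (thresholds, window sizes and data are arbitrary illustrations).

        # Generic drift-triggered ensemble sketch: monitor recent errors, add a
        # new member trained on a recent window when the error rate spikes.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)

        def stream(concept, n=500):
            # Two artificial concepts: the decisive feature changes at the drift.
            X = rng.uniform(-1, 1, size=(n, 2))
            y = (X[:, concept] > 0).astype(int)
            return X, y

        ensemble, window_X, window_y, errors = [], [], [], []

        for concept in (0, 1):                      # abrupt drift between concepts
            X, y = stream(concept)
            for xi, yi in zip(X, y):
                if ensemble:                        # majority vote of all members
                    votes = [int(m.predict(xi.reshape(1, -1))[0]) for m in ensemble]
                    errors.append(int(round(np.mean(votes)) != yi))
                window_X.append(xi)
                window_y.append(yi)
                recent = errors[-50:]
                drift = len(recent) == 50 and np.mean(recent) > 0.3   # the trigger
                if len(window_X) >= 100 and (not ensemble or drift):
                    model = DecisionTreeClassifier(max_depth=3)
                    model.fit(np.array(window_X[-100:]), window_y[-100:])
                    ensemble.append(model)
                    errors.clear()                  # reset the drift monitor

        print("ensemble members trained:", len(ensemble))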

  • 134.
    Bihl, Erik
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    The captivating use of silence in film: How silence affects the emotional aspect of cinema2017Independent thesis Basic level (degree of Bachelor), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    In this thesis I use both Dense Clarity - Clear Density and qualitative interviewing as methods to guide me through this examination of sound design. Through studying other works and executing personal tests I try to find out if there is a need to use sound and silence in a creative way to evoke emotion. I examine films as well as literature from the 1960s all the way to the 2000s to see how the use of silence has unfolded over the years. I also create a visual production that strengthens my theory that silence affects narrative more than it’s credited for. But the essay isn’t just about silence; it also revolves around sound, expanding into how sound correlates with emotion and how one can apply it to a production.

  • 135.
    Bilski, Mateusz
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Migration from blocking to non-blocking web frameworks2014Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The problem of performance and scalability of web applications challenges most software companies. It is difficult to maintain the performance of a web application while the number of users is continuously increasing. The common solution for this problem is scalability. A web application can handle incoming and outgoing requests using blocking or non-blocking Input/Output operations. The way a single server handles requests affects its ability to scale and depends on the web framework that was used to build the web application. This is especially important for Resource Oriented Architecture (ROA) based applications, which consist of distributed Representational State Transfer (REST) web services. This research was inspired by a real problem stated by a software company that was considering the migration to a non-blocking web framework but did not know the possible profits. The objective of the research was to evaluate the influence of a web framework's type on the performance of ROA based applications and to provide guidelines for assessing the profits of migration from blocking to non-blocking JVM web frameworks. First, an internet ranking was used to obtain the list of the most popular web frameworks. Then, the web frameworks were used to conduct two experiments that investigated the influence of a web framework's type on the performance of ROA based applications. Next, consultations with software architects were arranged in order to find a method for approximating the performance of the overall application. Finally, the guidelines were prepared based on the consultations and the results of the experiments. Three blocking and non-blocking highly ranked and JVM based web frameworks were selected. The first experiment showed that the non-blocking web frameworks can provide performance up to 2.5 times higher than blocking web frameworks in ROA based applications. The experiment performed on an existing application showed an average performance improvement of 27% after the migration. The elaborated guidelines successfully convinced the company that provided the application for testing to conduct the migration on the production environment. The experiment results proved that the migration from blocking to non-blocking web frameworks increases the performance of a web application. The prepared guidelines can help software architects decide whether migration is worthwhile. However, the guidelines are context dependent and further investigation is needed to make them more general.
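
    To illustrate the blocking versus non-blocking distinction the thesis builds on, here is a small self-contained Python sketch (not from the thesis, which targets JVM frameworks): ten simulated requests each wait 0.1 s on a downstream call, handled first sequentially with blocking waits and then concurrently with non-blocking ones.

        # Ten simulated requests, each waiting 0.1 s on a downstream service.
        import asyncio
        import time

        def blocking_handler():
            time.sleep(0.1)             # the thread is stuck during the wait

        async def non_blocking_handler():
            await asyncio.sleep(0.1)    # the wait yields control to other requests

        t0 = time.perf_counter()
        for _ in range(10):             # one thread, sequential: ~1.0 s total
            blocking_handler()
        print(f"blocking:     {time.perf_counter() - t0:.2f} s")

        async def main():               # waits overlap: ~0.1 s total
            await asyncio.gather(*(non_blocking_handler() for _ in range(10)))

        t0 = time.perf_counter()
        asyncio.run(main())
        print(f"non-blocking: {time.perf_counter() - t0:.2f} s")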

  • 136.
    bin Ali, Nauman
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Operationalization of lean thinking through value stream mapping with simulation and FLOW2015Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background: The continued success of Lean thinking beyond manufacturing has led to an increasing interest to utilize it in software engineering (SE). Value Stream Mapping (VSM) had a pivotal role in the operationalization of Lean thinking. However, this has not been recognized in SE adaptations of Lean. Furthermore, there are two main shortcomings in existing adaptations of VSM for an SE context. First, the assessments of the potential of the proposed improvements are based on idealistic assertions. Second, the current VSM notation and methodology are unable to capture the myriad of significant information flows, which in software development go beyond just the schedule information about the flow of a software artifact through a process. Objective: This thesis seeks to assess Software Process Simulation Modeling (SPSM) as a solution to the first shortcoming of VSM. In this regard, guidelines to perform simulation-based studies in industry are consolidated, and the usefulness of VSM supported with SPSM is evaluated. To overcome the second shortcoming of VSM, a suitable approach for capturing rich information flows in software development is identified and its usefulness to support VSM is evaluated. Overall, an attempt is made to supplement existing guidelines for conducting VSM to overcome its known shortcomings and support adoption of Lean thinking in SE. The usefulness and scalability of these proposals are evaluated in an industrial setting. Method: Three literature reviews, one systematic literature review, four industrial case studies, and a case study in an academic context were conducted as part of this research. Results: Little evidence to substantiate the claims of the usefulness of SPSM was found. Hence, prior to combining it with VSM, we consolidated the guidelines to conduct an SPSM based study and evaluated the use of SPSM in academic and industrial contexts. In education, it was found to be a useful complement to other teaching methods, and in industry, it triggered useful discussions and was used to challenge practitioners’ perceptions about the impact of existing challenges and proposed improvements. The combination of VSM with FLOW (a method and notation to capture information flows, since existing VSM adaptations for SE are insufficient for this purpose) was successful in identifying challenges and improvements related to information needs in the process. Both proposals to support VSM with simulation and FLOW led to the identification of waste and improvements (which would not have been possible with conventional VSM), generated more insightful discussions and resulted in more realistic improvements. Conclusion: This thesis characterizes the context and shows how SPSM was beneficial in both the industrial and academic context. FLOW was found to be a scalable, lightweight supplement to strengthen the information flow analysis in VSM. Through successful industrial application and uptake, this thesis provides evidence of the usefulness of the proposed improvements to the VSM activities.

  • 137.
    Bin Ali, Nauman
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. Blekinge Inst Technol, Karlskrona, Sweden..
    Petersen, Kai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering. Blekinge Inst Technol, Karlskrona, Sweden..
    Nicolau de Franca, Breno Bernard
    Univ Fed Rio de Janeiro, ESE Grp, PESC COPPE, BR-68511 Rio De Janeiro, Brazil..
    Evaluation of simulation-assisted value stream mapping for software product development: Two industrial cases2015In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, Vol. 68, p. 45-61Article in journal (Refereed)
    Abstract [en]

    Context: Value stream mapping (VSM) as a tool for lean development has led to significant improvements in different industries. In a few studies, it has been successfully applied in a software engineering context. However, some shortcomings have been observed, in particular a failure to capture the dynamic nature of the software process when evaluating improvements, i.e., such improvements and target values are based on idealistic situations. Objective: To overcome the shortcomings of VSM by combining it with software process simulation modeling, and to provide reflections on the process of conducting VSM with simulation. Method: Using case study research, VSM was used for two products at Ericsson AB, Sweden. Ten workshops were conducted in this regard. Simulation in this study was used as a tool to support discussions instead of as a prediction tool. The results have been evaluated from the perspective of the participating practitioners, an external observer, and reflections of the researchers conducting the simulation, elicited by the external observer. Results: Significant constraints hindering the product development from reaching the stated improvement goals for shorter lead time were identified. The use of simulation was particularly helpful in having more insightful discussions and in challenging assumptions about the likely impact of improvements. However, simulation results alone were found insufficient to emphasize the importance of reducing waiting times and variations in the process. Conclusion: The framework to assist VSM with simulation presented in this study was successfully applied in two cases. The involvement of various stakeholders, consensus building steps, emphasis on flow (through waiting time and variance analysis) and the use of simulation proposed in the framework led to realistic improvements with a high likelihood of implementation. (C) 2015 Elsevier B.V. All rights reserved.
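
    The paper's simulation models are not reproduced in the abstract; as a toy stand-in, the following Python sketch shows the kind of flow insight it attributes to simulation: with the same average service effort, higher variability alone inflates queueing and thus lead time (all numbers are illustrative, not Ericsson data).

        # Toy flow model: steady arrivals, a single "development" stage, and
        # normally distributed effort; only the variance differs between runs.
        import numpy as np

        rng = np.random.default_rng(2)

        def mean_lead_time(effort_sd, n=20_000):
            arrivals = np.arange(1, n + 1, dtype=float)       # one item per time unit
            effort = np.clip(rng.normal(0.9, effort_sd, n), 0.05, None)
            lead = np.empty(n)
            prev_done = 0.0
            for i in range(n):
                start = max(arrivals[i], prev_done)           # queue behind earlier work
                prev_done = start + effort[i]
                lead[i] = prev_done - arrivals[i]             # waiting + service time
            return lead.mean()

        for sd in (0.1, 0.5, 0.8):
            print(f"effort sd {sd}: mean lead time {mean_lead_time(sd):.2f}")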

  • 138.
    bin Ali, Nauman
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Petersen, Kai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    A systematic literature review on the industrial use of software process simulation2014In: Journal of Systems and Software, ISSN 0164-1212, Vol. 97Article in journal (Refereed)
    Abstract [en]

    Context: Software process simulation modelling (SPSM) captures the dynamic behaviour and uncertainty in the software process. Existing literature has conflicting claims about its practical usefulness: SPSM is useful and has an industrial impact; SPSM is useful and has no industrial impact yet; SPSM is not useful and has little potential for industry. Objective: To assess the conflicting standpoints on the usefulness of SPSM. Method: A systematic literature review was performed to identify, assess and aggregate empirical evidence on the usefulness of SPSM. Results: In the primary studies, to date, the persistent trend is that of proof-of-concept applications of software process simulation for various purposes (e.g. estimation, training, process improvement, etc.). They score poorly on the stated quality criteria. Also, only a few studies report some initial evaluation of the simulation models for the intended purposes. Conclusion: There is a lack of conclusive evidence to substantiate the claimed usefulness of SPSM for any of the intended purposes. A few studies that report the cost of applying simulation do not support the claim that it is an inexpensive method. Furthermore, there is a paramount need for improvement in conducting and reporting simulation studies, with an emphasis on evaluation against the intended purpose.

  • 139.
    bin Ali, Nauman
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Unterkalmsteiner, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Use and evaluation of simulation for software process education: a case study2014Conference paper (Refereed)
    Abstract [en]

    Software Engineering is an applied discipline and concepts are difficult to grasp at a theoretical level alone. In the context of a project management course, we introduced and evaluated the use of software process simulation (SPS) based games for improving students’ understanding of software development processes. The effects of the intervention were measured by evaluating the students’ arguments for choosing a particular development process. The arguments were assessed with the Evidence-Based Reasoning framework, which was extended to assess the strength of an argument. The results indicate that students generally have difficulty providing strong arguments for their choice of process models. Nevertheless, the assessment indicates that the intervention of the SPS game had a positive impact on the students’ arguments. Even though the illustrated argument assessment approach can be used to provide formative feedback to students, its use is rather costly and cannot be considered a replacement for traditional assessments.

  • 140.
    Birgersson, Frida
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Onomatopoesi i skönlitteratur: - gemensamma tolkningar av ljudhärmande ord2016Independent thesis Basic level (university diploma), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Previous and recent studies show that onomatopoeia is present in fiction stories, but the studies lack valid information on the subject. More detailed research can be found within language studies. The purpose of this study is to shed light on onomatopoeia, how it is presented in fiction productions and whether people share a mutual vision of sound-imitating words that might be used in these productions. The study was conducted with a qualitative method and a phenomenological perspective. 36 individuals participated in a questionnaire. A number of sounds were presented to the participants, who then had to type each sound in text. BONK and SPLOOSH showed a mutual vision, and the results were presented as tables. A product was created to unite the previous and recent studies with the results from the questionnaire. This resulted in an interactive book that, with the help of digital tools, creates an original product.

  • 141.
    Bjarnason, Elizabeth
    et al.
    Lund Univ, SWE.
    Morandini, Mirko
    Fdn Bruno Kessler, ITA.
    Borg, Markus
    Lund Univ, SWE.
    Unterkalmsteiner, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Felderer, Michael
    Univ Innsbruck, AUT.
    Staats, Matthew
    Google Inc, CHE.
    2nd International Workshop on Requirements Engineering and Testing (RET 2015)2015In: 2015 IEEE/ACM 37TH IEEE INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, VOL 2, IEEE , 2015, p. 997-998Conference paper (Refereed)
    Abstract [en]

    The RET (Requirements Engineering and Testing) workshop provides a meeting point for researchers and practitioners from the two separate fields of Requirements Engineering (RE) and Testing. The goal is to improve the connection and alignment of these two areas through an exchange of ideas, challenges, practices, experiences and results. The long-term aim is to build a community and a body of knowledge within the intersection of RE and Testing. One of the main outputs of the 1st workshop was a collaboratively constructed map of the area of RET, showing its relevant topics. The 2nd workshop will continue in the same interactive vein and include a keynote, paper presentations with ample time for discussions, and a group exercise. For true impact and relevance, this cross-cutting area requires contributions from both RE and Testing, and from both researchers and practitioners. For that reason we welcome a range of paper contributions, from short experience papers to full research papers, that clearly cover connections between the two fields.

  • 142. Bjarnason, Elizabeth
    et al.
    Unterkalmsteiner, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Borg, Markus
    Engström, Emelie
    A Multi-Case Study of Agile Requirements Engineering and the Use of Test Cases as Requirements2016In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, Vol. 77, p. 61-79Article in journal (Refereed)
    Abstract [en]

    [Context] It is an enigma that agile projects can succeed ‘without requirements’ when weak requirements engineering is a known cause for project failures. While agile development projects often manage well without extensive requirements, test cases are commonly viewed as requirements and detailed requirements are documented as test cases. [Objective] We have investigated this agile practice of using test cases as requirements to understand how test cases can support the main requirements activities, and how this practice varies. [Method] We performed an iterative case study at three companies and collected data through 14 interviews and 2 focus groups. [Results] The use of test cases as requirements poses both benefits and challenges when eliciting, validating, verifying, and managing requirements, and when used as a documented agreement. We have identified five variants of the test-cases-as-requirements practice, namely de facto, behaviour-driven, story-test driven, stand-alone strict and stand-alone manual, for which the application of the practice varies concerning the time frame of requirements documentation, the requirements format, the extent to which the test cases are a machine-executable specification, and the use of tools which provide specific support for the practice of using test cases as requirements. [Conclusions] The findings provide empirical insight into how agile development projects manage and communicate requirements. The identified variants of the practice of using test cases as requirements can be used to perform in-depth investigations into agile requirements engineering. Practitioners can use the provided recommendations as a guide in designing and improving their agile requirements practices based on project characteristics such as number of stakeholders and rate of change.
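
    As a concrete (and purely illustrative) flavour of the practice, a requirement can be documented directly as an executable test, in the spirit of the behaviour-driven variant the paper identifies; the withdraw() function and the REQ-17 identifier below are invented for the example, not taken from the studied companies.

        # A requirement captured as an executable test (invented example).
        import pytest

        def withdraw(balance, amount):
            # Hypothetical system under test.
            if amount > balance:
                raise ValueError("insufficient funds")
            return balance - amount

        def test_req_17_withdrawal_must_not_overdraw():
            """REQ-17: A withdrawal exceeding the balance shall be rejected."""
            with pytest.raises(ValueError):
                withdraw(balance=100, amount=150)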

  • 143.
    Bjarnason, Elizabeth
    et al.
    Lund University.
    Unterkalmsteiner, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Engström, Emelie
    Lund University.
    Borg, Markus
    Lund University.
    An Industrial Case Study on the Use of Test Cases as Requirements2015In: Lecture Notes in Business Information, Springer, 2015, p. 27-39Conference paper (Refereed)
    Abstract [en]

    It is a conundrum that agile projects can succeed ‘without requirements’ when weak requirements engineering is a known cause for project failures. While agile development projects often manage well without extensive requirements documentation, test cases are commonly used as requirements. We have investigated this agile practice at three companies in order to understand how test cases can fill the role of requirements. We performed a case study based on twelve interviews performed in a previous study. The findings include a range of benefits and challenges in using test cases for eliciting, validating, verifying, tracing and managing requirements. In addition, we identified three scenarios for applying the practice, namely as a mature practice, as a de facto practice and as part of an agile transition. The findings provide insights into how the role of requirements may be met in agile development, including challenges to consider.

  • 144.
    Bjäreholt, Johan
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    RISC-V Compiler Performance: A Comparison between GCC and LLVM/clang2017Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    RISC-V is a new open-source instruction set architecture (ISA) that in December 2016 had its first mass-produced processors manufactured. It focuses on both efficiency and performance and differs from other open-source architectures by not having a copyleft license, permitting vendors to freely design, manufacture and sell RISC-V chips without any fees and without having to share their modifications of the reference implementations of the architecture. The goal of this thesis is to evaluate the performance of the GCC and LLVM/clang compilers’ support for the RISC-V target and their ability to optimize for the architecture. The performance is evaluated by executing the CoreMark and Dhrystone benchmarks, both popular industry-standard programs for evaluating performance on embedded processors. They are run with both the GCC and LLVM/clang compilers at different optimization levels and compared in performance per clock to the ARM architecture, which is mature yet rather similar to RISC-V. The compiler support for the RISC-V target is still in development, and the focus of this thesis is the current performance differences between the GCC and LLVM compilers on this architecture. The platforms the benchmarks are executed on are the Freedom E310 processor on the SiFive HiFive1 board for RISC-V and an ARM Cortex-M4 processor by Freescale on the Teensy 3.6 board. The Freedom E310 is almost identical to the reference Berkeley Rocket RISC-V design, and the ARM Cortex-M4 processor has a similar clock speed and is aimed at a similar target audience. The results show that the -O2 and -O3 optimization levels on GCC for RISC-V performed very well in comparison to our ARM reference. On the lower optimization levels (-O1; -O0, which applies no optimizations; and -Os, which optimizes for a smaller executable code size) GCC performs much worse than ARM: 46% of the ARM performance at -O1, 8.2% at -Os and 9.3% at -O0 on the CoreMark benchmark, with similar results in Dhrystone except at -O1, where it performed as well as ARM. With optimizations turned off (-O0), GCC for RISC-V reached 9.2% of the ARM performance in CoreMark and 11% in Dhrystone, which was unexpected and needs further investigation. LLVM/clang, on the other hand, crashed when trying to compile our CoreMark benchmark, and on Dhrystone the optimization options made a very minor impact on performance, yielding 6.0% of the performance of GCC at -O3 and 5.6% of the performance of ARM at -O3; even with optimizations it was thus slower than GCC without optimizations. In conclusion, the performance of RISC-V with the GCC compiler at the higher optimization levels is very good considering how young the RISC-V architecture is. There seems to be room for improvement at the lower optimization levels, however, which in turn could possibly also increase the performance of the higher optimization levels. With the LLVM/clang compiler, on the other hand, a lot of work remains to make it competitive with the GCC compiler and other architectures in both performance and stability. Why -O0 is so considerably slower on RISC-V than on ARM was also very unexpected and needs further investigation.
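
    A minimal driver for this kind of comparison might look as follows. It is not from the thesis: it assumes a local dhrystone.c source and a native gcc on the PATH (a faithful replication would use a RISC-V cross toolchain and run on the boards), and it simply times one run per optimization level.

        # Hypothetical measurement helper: compile and time one benchmark at
        # each optimization level; SOURCE and BINARY are assumed local files.
        import subprocess
        import time

        SOURCE, BINARY = "dhrystone.c", "./bench"

        for level in ("-O0", "-O1", "-O2", "-O3", "-Os"):
            subprocess.run(["gcc", level, SOURCE, "-o", BINARY], check=True)
            t0 = time.perf_counter()
            subprocess.run([BINARY], check=True)
            print(f"{level}: {time.perf_counter() - t0:.3f} s")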

  • 145.
    Björklund, Johanna
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Thorburn, Kyle
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Miljö som berättar: En karaktärs berättelse genom Environmental Storytelling2015Independent thesis Basic level (degree of Bachelor), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    This study aims to explore and gain new insight into how one could tell a game-character's story through Environmental Storytelling. By doing this we hope to make game characters feel more real and alive, while at the same time furthering the game’s narrative. Before delving into the creation and examination of our game, we establish that it’s really important to let the players come to their own conclusions regarding the perceived story, even if they are inaccurate. After having created a game in an attempt to answer our question at issue, we let other people play the game. We believe this is important as we, the creators of the game, already know everything there is to know about the character in question. The research uses Actor Network Theory as its methodology, as we believe it can help us understand relationships and why something works or doesn’t work. This research concludes that it’s all the small details that add up, enriching a character's story.

  • 146.
    Björneskog, Amanda
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Goniband Shoshtari, Nima
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Comparison of Security and Risk awareness between different age groups2017Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    The Internet has become a 'necessity' in the everyday life of just below 50% of the world population. While the growth of the Internet has created a great platform that helps people and makes life easier, it has also brought a lot of malicious situations. Nowadays some people hack or use social engineering on others for a living; scamming and fraud are part of their daily life. Therefore security awareness is truly important and sometimes vital. We wanted to look at the difference in security awareness depending on which year you were born, in relation to the IT-boom and growth of the Internet. Does it matter if you lived through the earlier stages of the Internet or not? We found that security awareness did increase with age, but whether it was due to the candidates growing up before or after the IT-boom, or due to the fact that younger people tend to be more inattentive, is hard to tell. Our result is that the age group 16-19 was more prone to security risks, due to an indifferent mindset regarding their data and information.

  • 147.
    Bjöörn, Christopher
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Johnsson, Jacob
    Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
    Universe-defining rules2014Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    In this work we investigate how the concept of play can be applied to digital games and how to introduce a fictional universe and the rules that define that universe. The purpose of this work is to increase the quality of digital games by increasing our understanding of how such rules may be introduced. The question to be answered is “how may realistic, semi-realistic and fictional rules be introduced in a digital game?”. This work is based partly on analyses of why some introductions of rules are often accepted and some often not, partly on the evaluation of a product created by us, and partly on earlier research. This work is split into two parts: one research part and one production part. To answer the question, research into what is previously known has been conducted and a digital game has been produced where the main rule that separates the fictional universe from ours is paranormal activity, or ghosts. Keywords: rules, magic circle, immersion and game production.

  • 148.
    Blal, Redouane
    et al.
    Universite du Quebec a Montreal, CAN.
    Leshob, Abderrahmane
    Universite du Quebec a Montreal, CAN.
    Gonzalez-Huerta, Javier
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Mili, Hafedh
    Universite du Quebec a Montreal, CAN.
    Boubaker, Anis
    Universite du Quebec a Montreal, CAN.
    From inter-organizational business process models to service-oriented architecture models2018In: Service Oriented Computing and Applications, ISSN 1863-2386, E-ISSN 1863-2394, Vol. 12, no 3-4, p. 227-245Article in journal (Refereed)
    Abstract [en]

    Today’s business processes are becoming increasingly complex and often cross the boundaries of organizations. On the one hand, to support their business processes, modern organizations use enterprise information systems that need to be aware of the organizations’ processes and contexts. Such systems are called Process-Aware Information Systems (PAIS). On the other hand, service-oriented architecture (SOA) is a fast-emerging architectural style that has been widely adopted by modern organizations to design and implement the PAIS that support their business processes. This paper aims to bridge the gap between inter-organizational business processes and the SOA-based PAIS that support them. It proposes a novel model-driven design method that generates SOA models expressed in SoaML, taking the specification of collaborative business processes expressed in BPMN as input. We present the principles underlying the approach, the state of an ongoing implementation, and the results of two studies conducted to empirically validate the method in the context of ERP key processes. © 2018, Springer-Verlag London Ltd., part of Springer Nature.
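
    To give a flavour of such a mapping (purely illustrative; the paper's actual BPMN-to-SoaML transformation rules are far richer), the Python sketch below derives service-operation candidates from the cross-pool message flows of a toy collaborative process.

        # Toy collaborative process: two pools exchanging messages.
        bpmn_model = {
            "pools": ["Retailer", "Supplier"],
            "message_flows": [
                ("Retailer", "Supplier", "PurchaseOrder"),
                ("Supplier", "Retailer", "Invoice"),
            ],
        }

        # Each cross-pool message flow suggests a service operation on the receiver.
        services = {}
        for source, target, message in bpmn_model["message_flows"]:
            services.setdefault(f"{target}Service", []).append(f"receive{message}")

        print(services)   # {'SupplierService': ['receivePurchaseOrder'],
                          #  'RetailerService': ['receiveInvoice']}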

  • 149.
    Blidkvist, Jesper
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Westgren, Joakim
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Re-texturing and compositing new material on pre-rendered media: Using DirectX and UV sampling2016Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Context: This thesis investigates a new method for re-texturing and compositing new or additional material on specific pre-rendered images using various blend equations. This is done by sampling a number of render passes created alongside the original source material, most notably a UV pass for accurate texture positioning and different lighting passes to enhance the control over the final result. This will allow comparatively simple and cheap compositing without the overhead that other commercially available tools might add.

    Objectives: Render the necessary UV coordinates and lighting calculations from a 3D application to two separate textures. Sample said textures in DirectX and use the information to accurately light and position the additional dynamic material for blending with the pre-rendered media.

    Method: The thesis uses an implementation method in which quantitative data is gathered by comparing the resulting composited images against a Gold Standard render, using two common image comparison methods, the Structural Similarity Index (SSIM) and Peak Signal to Noise Ratio (PSNR).

    Results: The results of this implementation indicate that both the perceived and measured similarity are close enough to prove the validity of this method.

    Conclusions: This thesis shows the possibility and practical use of DirectX as a tool capable of the most fundamental compositing operations. In its current state, the implementation is limited in terms of flexibility and functionality when compared to other proprietary compositing software packages, and some visual artefacts and quality issues are present. There are, however, no indications that these issues could not be solved with additional work.
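
    The DirectX/HLSL implementation is not reproduced here; the following NumPy sketch (illustrative only, with random arrays standing in for the pre-rendered passes) shows the core of the technique: a pre-rendered UV pass drives a per-pixel lookup into the new texture, and a lighting pass modulates the result before compositing.

        # Pre-rendered passes stand in as random arrays here; in the thesis they
        # come from the 3D application and are sampled in a DirectX shader.
        import numpy as np

        rng = np.random.default_rng(3)
        H, W = 4, 4

        uv_pass = rng.uniform(0, 1, size=(H, W, 2))        # R,G channels hold U,V
        light_pass = rng.uniform(0.5, 1.0, size=(H, W, 1)) # pre-rendered lighting
        texture = rng.uniform(0, 1, size=(256, 256, 3))    # the new material

        # Nearest-neighbour lookup through the UV pass (a shader would filter).
        u = (uv_pass[..., 0] * (texture.shape[1] - 1)).astype(int)
        v = (uv_pass[..., 1] * (texture.shape[0] - 1)).astype(int)
        retextured = texture[v, u]                         # shape (H, W, 3)

        composite = retextured * light_pass                # relight the new texels
        print(composite.shape)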

  • 150.
    Bloom, Filip
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Competitive Coevolution for micromanagement in StarCraft: Brood War2017Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Context. Interest in and research on neural networks and their capacity for finding solutions to nonlinear problems have increased greatly in recent years.

    Objectives. This thesis attempts to compare competitive coevolution to traditional neuroevolution in the game StarCraft: Brood War.

    Methods. Implementing and evolving AI-controlled players for the game StarCraft and evaluating their performance.

    Results. Fitness values and win rates against the default StarCraft AI and between the networks were gathered.

    Conclusions. The neural networks failed to improve under the given circumstances. The best networks performed on par with the default StarCraft AI.
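
    The networks' encoding and the StarCraft integration are beyond this abstract; the toy Python sketch below illustrates only the competitive-coevolution scheme itself, in which each population's fitness is earned in matches against the rival population (the one-number "genome" and the payoff rule are stand-ins, not the thesis's setup).

        # Two populations evolve against each other; fitness is match wins.
        import random

        random.seed(4)

        def play(a, b):
            # Stand-in for a StarCraft micromanagement match between two agents.
            return 1 if a > b else 0

        def evolve(pop, fitness):
            # Keep the better half, refill with mutated copies of the survivors.
            ranked = [g for _, g in sorted(zip(fitness, pop), reverse=True)]
            survivors = ranked[: len(pop) // 2]
            return survivors + [g + random.gauss(0, 0.05) for g in survivors]

        pop_a = [random.random() for _ in range(10)]
        pop_b = [random.random() for _ in range(10)]

        for generation in range(20):
            fit_a = [sum(play(a, b) for b in pop_b) for a in pop_a]
            fit_b = [sum(play(b, a) for a in pop_a) for b in pop_b]
            pop_a, pop_b = evolve(pop_a, fit_a), evolve(pop_b, fit_b)

        print(f"best A genome {max(pop_a):.2f}, best B genome {max(pop_b):.2f}")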
