  • 1.
    Abghari, Shahrooz
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
Data Modeling for Outlier Detection (2018). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

This thesis explores data modeling for outlier detection in three different application domains: maritime surveillance, district heating, and online media and sequence datasets. The proposed models are evaluated and validated under different experimental scenarios, taking into account the specific characteristics and setups of the different domains.

Outlier detection has been studied and applied in many domains. Outliers arise for different reasons, such as fraudulent activities, structural defects, health problems, and mechanical issues. Detecting outliers is a challenging task, but doing so can reveal system faults, uncover fraud, and save lives. Outlier detection techniques are often domain-specific. The main challenge in outlier detection is modeling normal behavior in order to identify abnormalities. The choice of model is important: an incorrect choice of data model can lead to poor results. This requires a good understanding and interpretation of the data, the constraints, and the requirements of the problem domain. Outlier detection is largely an unsupervised problem because labeled data is often unavailable or expensive to obtain.

We have studied and applied a combination of machine learning and data mining techniques to build data-driven and domain-oriented outlier detection models. We have shown the importance of data preprocessing and feature selection in building suitable methods for data modeling. We have taken advantage of both supervised and unsupervised techniques to create hybrid methods. For example, we have proposed a rule-based outlier detection system based on open data for the maritime surveillance domain. Furthermore, we have combined cluster analysis and regression to identify manual changes in heating systems at the building level. We have also exploited sequential pattern mining to identify contextual and collective outliers in online media data. In addition, we have proposed a minimum spanning tree clustering technique for detecting groups of outliers in online media and sequence data. The proposed models have been shown to be capable of explaining the underlying properties of the detected outliers, which can help domain experts narrow the scope of analysis and understand the reasons for such anomalous behavior. We have also investigated the reproducibility of the proposed models in similar application domains.
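The minimum spanning tree clustering mentioned in the abstract can be sketched in miniature. This is an illustrative sketch of the general technique, not the thesis's exact algorithm; the cut length and minimum cluster size are invented parameters:

```python
# Cluster points by building a minimum spanning tree (Prim's algorithm),
# cutting edges longer than a threshold, and reporting small components
# as groups of outliers. Parameters here are toy values for illustration.
def mst_outlier_groups(points, cut_length, min_cluster_size=3):
    n = len(points)
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Prim's algorithm: grow the tree one vertex at a time.
    in_tree = {0}
    edges = []  # (length, u, v) edges chosen for the MST
    while len(in_tree) < n:
        best = min((dist(points[u], points[v]), u, v)
                   for u in in_tree for v in range(n) if v not in in_tree)
        edges.append(best)
        in_tree.add(best[2])
    # Keep only short edges; connected components become clusters.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for length, u, v in edges:
        if length <= cut_length:
            parent[find(u)] = find(v)
    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    # Components too small to be clusters are reported as outlier groups.
    return [c for c in clusters.values() if len(c) < min_cluster_size]

# Two dense clusters plus an isolated pair far away from both:
pts = [(0, 0), (0, 1), (1, 0), (1, 1),
       (10, 10), (10, 11), (11, 10), (50, 50), (51, 50)]
print(mst_outlier_groups(pts, cut_length=5.0))  # → [[7, 8]]
```

Cutting the longest MST edges separates dense regions; the isolated pair forms its own tiny component and is flagged as an outlier group.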

  • 2. Acevedo Peña, Carlos Gonzalo
Developing Inclusive Innovation Processes and Co-Evolutionary Approaches in Bolivia (2015). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The concept of National Innovation Systems (NIS) has been widely adopted in developing countries, particularly in Latin America, over the last two decades. The concept is used mainly as an ex-ante framework to organize and increase the dynamics of the institutions linked to science, technology and innovation in catch-up processes of development. In the particular case of Bolivia, after several decades of social and economic crisis, the promise of a national innovation system offers a framework for collaboration between the university, the government and the socio-productive sectors. The dynamics of collaboration generated within an NIS can be a useful tool in the pursuit of inclusive development ambitions.

    This thesis focuses on inclusive innovation processes and the generation of co-evolutionary processes between university, government and socio-productive sectors. It is the result of 8 years of participatory action research influenced by Mode 2 knowledge production and technoscientific approaches.

    The study explores the policy paths the Bolivian government has followed over the last three decades to organize science, technology and innovation. It reveals that Bolivia has an emerging national innovation system, whose demand-pulled innovation model presents an inclusive approach. Innovation policy efforts in Bolivia are led by the Vice-Ministry of Science and Technology (VCyT). Moreover, an NIS involves relational and collaborative approaches between institutions, which imply structural and organizational challenges, particularly for public universities, as they concentrate most of the research capabilities in the country. These universities are challenged to participate in the NIS in contexts where the demand side is weak.

    This research focuses on the early empirical approaches and transformations at Universidad Mayor de San Simón (UMSS) in Cochabamba. The Technology Transfer Unit (UTT) at UMSS leads the effort to strengthen the university's internal innovation capabilities and to enhance the relevance of its research activities to society by supporting socio-economic development within the framework of innovation systems. UTT has become a recognized innovation-facilitating unit, inside and outside the university, by proposing pro-active initiatives to support emerging innovation systems. Because of its complexity, the study focuses particularly on the cluster development promoted by UTT. Open clusters are based on linking mechanisms between university research capabilities, socio-productive actors and government. Cluster development has proven to be a practical mechanism for the university to meet the demand side (government and socio-productive actors) and to develop trust-based inclusive innovation processes. The experiences from cluster activities have inspired the development of new research policies at UMSS, with a strong orientation towards fostering research activities with an increased focus on socio-economic development. The experiences gained at UMSS are discussed and presented as a "developmental university" approach.

    Inclusive innovation processes with co-evolutionary approaches seem to constitute an alternative path supporting the achievement of inclusive development ambitions in Bolivia.

  • 3. Afzal, Wasif
Search-based approaches to software fault prediction and software testing (2009). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Software verification and validation activities are essential for software quality but also constitute a large part of software development costs. Therefore, efficient and cost-effective verification and validation activities are both a priority and a necessity, considering the pressure to decrease time-to-market and the intense competition faced by many, if not all, companies today. It is then perhaps not unexpected that decisions related to software quality, when to stop testing, test scheduling, and test resource allocation need to be as accurate as possible. This thesis investigates the application of search-based techniques within two activities of software verification and validation: software fault prediction and software testing for non-functional system properties. Software fault prediction modeling can provide support for making important decisions as outlined above. In this thesis we empirically evaluate symbolic regression using genetic programming (a search-based technique) as a potential method for software fault prediction. Using data sets from both industrial and open-source software, the strengths and weaknesses of applying symbolic regression in genetic programming are evaluated against competing techniques. In addition to software fault prediction, this thesis also consolidates available research into predictive modeling of other attributes using symbolic regression in genetic programming, thus presenting a broader perspective. As an extension of the application of search-based techniques within software verification and validation, this thesis further investigates the extent to which search-based techniques are applied for testing non-functional system properties. Based on the research findings in this thesis, it can be concluded that applying symbolic regression in genetic programming may be a viable technique for software fault prediction. We additionally survey the literature for evidence of other search-based techniques being applied to testing non-functional system properties, thereby contributing to the growing application of search-based techniques in diverse activities within software verification and validation.
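To give a flavour of symbolic regression via genetic programming, here is a minimal sketch. It is not the thesis's experimental setup: the operator set, tree depth, truncation selection, mutation scheme and sample data are all invented for illustration:

```python
import random

# Evolve small expression trees over {x, constants, +, *} so that the best
# tree approximates a hidden target function on sample data. A real fault
# prediction setup would use project metrics as x and fault counts as y.
OPS = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}

def rand_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.uniform(-2, 2)
    return (random.choice(list(OPS)), rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mse(tree, data):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data) / len(data)

def mutate(tree, depth=2):
    # Replace a randomly chosen subtree with a fresh random tree.
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return rand_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

def symbolic_regression(data, pop_size=60, generations=40):
    pop = [rand_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: mse(t, data))
        survivors = pop[:pop_size // 2]        # truncation selection
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return min(pop, key=lambda t: mse(t, data))

random.seed(1)
data = [(x, x * x + x) for x in range(-5, 6)]  # hidden target: y = x^2 + x
best = symbolic_regression(data)
print(round(mse(best, data), 3))               # error of the best evolved tree
```

The fitness function (mean squared error on the data) is the only problem-specific part; swapping in a different data set changes what the search evolves towards.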

  • 4.
    Ahmadi Mehri, Vida
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
Towards Secure Collaborative AI Service Chains (2019). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

At present, Artificial Intelligence (AI) systems have been adopted in many different domains, such as healthcare, robotics, automotive, telecommunication systems, security, and finance, to integrate intelligence into services and applications. Intelligent personal assistants such as Siri and Alexa are examples of AI systems making an impact on our daily lives. Since many AI systems are data-driven, they require large volumes of data for training and validation, advanced algorithms, and computing power and storage in their development process. Collaboration in the AI development process (the AI engineering process) can reduce the cost and time-to-market of AI applications. However, collaboration introduces concerns about privacy and the piracy of intellectual property by the actors who collaborate in the engineering process. This work investigates the non-functional requirements, such as privacy and security, for enabling collaboration in AI service chains. It proposes an architectural design approach for collaborative AI engineering and explores the concept of the pipeline (service chain) for chaining AI functions. In order to enable controlled collaboration between AI artefacts in a pipeline, this work makes use of virtualisation technology to define and implement Virtual Premises (VPs), which act as protection wrappers for AI pipelines. A VP is a virtual policy enforcement point for a pipeline and requires access permission and authenticity for each element in a pipeline before the pipeline can be used. Furthermore, the proposed architecture is evaluated using a use-case approach that enables quick detection of design flaws during the initial stages of implementation. To evaluate the security level and compliance with security requirements, threat modeling was used to identify potential threats and vulnerabilities of the system and to analyse their possible effects. The output of the threat modeling was used to define countermeasures against threats related to unauthorised access and execution of AI artefacts.
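A Virtual Premise, as described above, is a policy enforcement point that checks access permission and artefact authenticity before any part of a pipeline may run. A hypothetical miniature of that idea follows; the class, method names and the use of a bytecode hash as an authenticity fingerprint are all invented for illustration, not the thesis's implementation:

```python
import hashlib

# A toy "Virtual Premise": it admits only permitted users and refuses to run
# a pipeline unless every stage matches a previously registered fingerprint.
class VirtualPremise:
    def __init__(self, permitted_users):
        self.permitted = set(permitted_users)
        self.fingerprints = {}   # artefact name -> expected fingerprint

    def register(self, name, func):
        # Fingerprint the artefact (here: a hash of its compiled bytecode).
        self.fingerprints[name] = hashlib.sha256(func.__code__.co_code).hexdigest()

    def run(self, user, pipeline, value):
        if user not in self.permitted:
            raise PermissionError(f"{user} may not enter this premise")
        # Authenticate every stage before anything executes.
        for name, func in pipeline:
            fp = hashlib.sha256(func.__code__.co_code).hexdigest()
            if self.fingerprints.get(name) != fp:
                raise ValueError(f"artefact {name} failed the authenticity check")
        # Only now may the chained AI functions run.
        for _, func in pipeline:
            value = func(value)
        return value

vp = VirtualPremise(permitted_users={"alice"})
normalise = lambda x: x / 10
score = lambda x: x * 3
vp.register("normalise", normalise)
vp.register("score", score)
print(vp.run("alice", [("normalise", normalise), ("score", score)], 50))  # → 15.0
```

The key design point the sketch mirrors is that both checks (permission and authenticity of every element) happen before execution, so a tampered or unregistered artefact never runs.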

  • 5.
    Andreasson, Eskil
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
Realistic Package Opening Simulations: An Experimental Mechanics and Physics Based Approach (2015). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The final goal of this work is a finite element modeling strategy targeting package opening simulations. The developed simulation model will be used to proactively predict opening compatibility early in the development process of a new opening device and/or a new packaging material. To create such a model, the focus is on developing a combined and integrated physical/virtual test procedure for the mechanical characterization and calibration of thin packaging materials. Furthermore, the governing mechanical properties of the materials involved in the opening performance need to be identified and quantified with experiments. Different experimental techniques, complemented with video recording equipment, were refined and utilized during the course of the work. An automatic or semi-automatic material model parameter identification process, involving video capture of the deformation process and inverse modeling, is proposed for the different packaging material layers. Both an accurate continuum model and a damage material model, used in the simulation model, were translated and extracted from the experimental test results. The results presented show that it is possible to select constitutive material models, in conjunction with continuum damage models, that adequately predict the mechanical behavior of intended failure in thin laminated packaging materials. A thorough material mechanics understanding of how the microstructure of individual material layers evolves, and of the micro-mechanisms involved in the deformation process, is essential for an appropriate selection of numerical material models. Finally, with a slight modification of techniques and functionalities already available in the commercial finite element software Abaqus™, it was possible to build a suitable simulation model. To build a realistic simulation model, an accurate description of the geometrical features is important. Therefore, advancements within experimental visualization techniques, utilizing a combination of video recording, photoelasticity and Scanning Electron Microscopy (SEM) of the microstructure, have enabled the extraction of geometries and additional information from ordinary standard experimental tests. A comparison of the experimental opening and the virtual opening showed a good correlation with the developed finite element modeling technique. The advantage of the developed modeling approach is that it is possible to modify the material composition of the laminate: individual material layers can be altered, and their mechanical properties, thickness or geometrical shape can be changed. Furthermore, the model is flexible, and a new opening device, i.e., a new geometry and load case, can easily be adopted in the simulation model. This type of simulation model is therefore a useful tool for decision support early in the concept selection of development projects.
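The inverse-modeling step for parameter identification can be illustrated with a toy example. Here a single linear-elastic stiffness parameter is fitted to hypothetical measurements by grid search; a real calibration would run finite element simulations in place of the one-line stand-in, and the data below are invented:

```python
# Inverse modeling in miniature: pick the stiffness parameter E whose
# simulated stress response best matches the measured data (least squares).
def simulate(E, strains):
    # Stand-in for an FE simulation: linear-elastic stress = E * strain.
    return [E * s for s in strains]

def calibrate(strains, measured, candidates):
    def error(E):
        return sum((sim - obs) ** 2
                   for sim, obs in zip(simulate(E, strains), measured))
    return min(candidates, key=error)

strains = [0.001, 0.002, 0.003, 0.004]
measured = [2.1, 3.9, 6.2, 7.8]        # hypothetical test data (MPa)
E_best = calibrate(strains, measured, candidates=range(1500, 2500, 10))
print(E_best)                          # → 1990
```

The same loop structure generalises to several parameters and to image-derived observations (as in the video-based identification described above), with the grid search replaced by a proper optimiser.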

  • 6. Ankre, Rosemarie
Understanding the visitor – a prerequisite for coastal zone planning (2007). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Planning for tourism and outdoor recreation in Swedish coastal areas could be improved with knowledge of visitors’ attitudes, experiences, activities and geographical dispersion. The purpose of this thesis is to examine the knowledge of visitors in planning for tourism and outdoor recreation. The Luleå archipelago in Northern Sweden is used as a case study.

    Supervisors: Professor Lars Emmelin, Blekinge Institute of Technology/ETOUR, and Dr Peter Fredman, ETOUR. The Department of Spatial Planning, BTH, conducts research on planning processes, environmental impact assessment, social issues, gender issues and applied information technology in spatial planning. The European Tourism Research Institute (ETOUR), Mid Sweden University, develops knowledge and expertise within issues related to travel and tourism. It has three main objectives: to conduct research on tourism-related issues, to analyse statistics on tourism, and to make the research results accessible to the tourism industry. The research aims to develop the tourism industry, and the institute is a resource for businesses, organisations and authorities. This project has been financed by the AGORA Interreg III project Network Sustainable Tourism Development in the Baltic Sea Region, the Blekinge County Administration Board, the Mid Sweden University in Östersund, the European Tourism Research Institute (ETOUR), and the Swedish Tourist Authority.

  • 7. Aziz, Hussein Muzahim
Enhancing the Smoothness of Streaming Video for Mobile Users over Unreliable Networks (2010). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Real-time video streaming over wireless networks is an increasingly important and attractive service for mobile users. Video streaming involves a large amount of data to be transmitted in real time, while wireless channel conditions may vary from time to time. It is hard to guarantee a reliable transmission over a wireless network, where the parameters specifying the transmission are bandwidth, packet loss, packet delay, and outage time. The quality of the video is affected negatively when network packets are lost, and mobile users may notice sudden stops during playback: the picture is momentarily frozen, followed by a jump from one scene to a totally different one. The main objective of this thesis is to provide smooth video playback on the mobile device over unreliable networks with a satisfactory video quality. Three different techniques are proposed to achieve this goal. The first technique streams duplicate grayscale frames over multiple channels; if frames are lost on one channel, they can be recovered from another channel. In the second technique, each video frame is split into sub-frames, which are streamed over multiple channels. If a sub-frame goes missing during transmission, a reconstruction mechanism in the mobile device recreates the missing sub-frames. In the third technique, we propose a time interleaving robust streaming (TIRS) technique that streams the video frames in a different order, the benefit being that losses of sequences of neighbouring frames are avoided. A missing frame in the streamed video is reconstructed from the surrounding frames. The mean opinion score (MOS) metric is used to evaluate the video quality. The experienced quality of a video is subjective, and the goal is to satisfy the average human watching the contents of the video.
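The interleaving idea behind TIRS can be sketched as follows. The fixed stride and simple neighbour averaging are assumed details for illustration; the thesis's actual reordering and reconstruction are more elaborate:

```python
def interleave(frames, stride=3):
    # Transmit frames in strided order (0, 3, 6, ..., 1, 4, 7, ...) so that
    # a burst loss on the channel hits frames far apart in playback order.
    return [frames[i] for offset in range(stride)
            for i in range(offset, len(frames), stride)]

def reconstruct(playback, lost_index):
    # Approximate a lost frame from its surviving playback neighbours.
    return (playback[lost_index - 1] + playback[lost_index + 1]) / 2

frames = list(range(9))          # frame "contents" stand in for pixel data
sent = interleave(frames)
print(sent)                      # → [0, 3, 6, 1, 4, 7, 2, 5, 8]

# A burst dropping two consecutive transmitted frames (here 3 and 6) loses
# frames that are non-adjacent in playback order, so each can be estimated
# from its intact neighbours:
print(reconstruct(frames, 3), reconstruct(frames, 6))   # → 3.0 6.0
```

Without interleaving, the same burst would remove two adjacent playback frames, leaving no intact neighbours on one side to interpolate from; that is precisely the failure mode the reordering avoids.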

  • 8. Baca, Dejan
Automated static code analysis: A tool for early vulnerability detection (2009). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Software vulnerabilities are introduced into programs during their development. Architectural flaws are introduced during planning and design, while implementation faults are created during coding. Penetration testing is often used to detect these vulnerabilities. This approach is expensive because it is performed late in development, and any correction would increase lead time. An alternative is to detect and correct vulnerabilities in the phase of development where they are the least expensive to detect and correct. Source code audits have often been suggested and used to detect implementation vulnerabilities. However, manual audits are time consuming and require extensive expertise to be efficient. A static code analysis tool can achieve the same results as a manual audit but in a fraction of the time. Through a set of case studies and experiments at Ericsson AB, this thesis investigates the technical capabilities and limitations of using a static analysis tool as an early vulnerability detector. The investigation is extended to the human factor by examining how developers interact with and use the static analysis tool. The contributions of this thesis include the identification of the tool's capabilities, so that further security improvements can focus on other types of vulnerabilities. Possible cost-saving measures from using static analysis early in development are identified. Additionally, the thesis presents the limitations of static code analysis, the most important being the incorrect warnings (false positives) reported by static analysis tools. In addition, a development process overhead was deemed necessary to successfully use static analysis in an industry setting.
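A static analysis tool of the kind studied can be caricatured in a few lines: pattern-match source code for vulnerability-prone calls and report them with line numbers. The rule set below is invented for illustration and is vastly simpler than a real analyzer; notably, even this tiny checker exhibits the false-positive problem the abstract highlights, since it would also flag matches inside comments or string literals:

```python
import re

# Flag calls to C functions with known vulnerability-prone signatures.
UNSAFE = {"strcpy": "unbounded copy, consider strncpy",
          "gets": "no length check, consider fgets",
          "sprintf": "unbounded format, consider snprintf"}

def scan(source):
    warnings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for func, advice in UNSAFE.items():
            if re.search(rf"\b{func}\s*\(", line):
                warnings.append((lineno, func, advice))
    return warnings

code = """
char buf[8];
strcpy(buf, argv[1]);
printf("%s", buf);
gets(buf);
"""
for lineno, func, advice in scan(code):
    print(f"line {lineno}: {func}: {advice}")
```

Real tools work on a parsed and data-flow-analysed representation rather than raw text, which is what keeps their false-positive rate manageable; the trade-off between coverage and incorrect warnings is exactly what the thesis investigates empirically.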

  • 9.
    Badampudi, Deepika
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
Towards decision-making to choose among different component origins (2016). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Context: The amount of software in solutions provided in various domains is continuously growing. These solutions are a mix of hardware and software, often referred to as software-intensive systems. Companies seek to improve the software development process to avoid delays or cost overruns.

    Objective: The overall goal of this thesis is to improve the software development/building process to provide timely, high quality and cost efficient solutions. The objective is to select the origin of the components (in-house, outsource, components off-the-shelf (COTS) or open source software (OSS)) that facilitates the improvement. The system can be built of components from one origin or a combination of two or more (or even all) origins. Selecting a proper origin for a component is important to get the most out of a component and to optimize the development. 

Method: Making an informed decision requires investigating the different component origins. We conducted a case study to explore the existing challenges in software development. The next step was to identify factors that influence the choice among component origins, through a systematic literature review using a snowballing (SB) strategy and a database (DB) search. Furthermore, a Bayesian synthesis process is proposed to integrate the evidence from the literature into practice.

Results: The results of this thesis indicate that contextual factors of software-intensive systems, such as domain regulations, hinder software development improvement. In addition to in-house development, alternative component origins (outsourcing, COTS, and OSS) are being used for software development. Several factors, such as time, cost and license implications, influence the selection of component origins. Solutions have been proposed to support the decision-making; however, these solutions consider only a subset of the factors identified in the literature.

    Conclusions: Each component origin has some advantages and disadvantages. Depending on the scenario, one component origin is more suitable than the others. It is important to investigate the different scenarios and suitability of the component origins, which is recognized as future work of this thesis. In addition, the future work is aimed at providing models to support the decision-making process.
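One simple form such decision support could take is weighted scoring over the influencing factors. The factors, weights and scores below are invented for illustration and are not from the thesis, which is precisely about identifying which factors matter and notes that real solutions must cover more of them:

```python
# Weighted-scoring sketch for choosing a component origin. Weights sum to 1;
# scores are 1-5 per origin for each factor (higher is better). All numbers
# are hypothetical.
factors = {"time_to_market": 0.4, "cost": 0.3, "license_risk": 0.3}

origins = {
    "in-house":  {"time_to_market": 2, "cost": 2, "license_risk": 5},
    "outsource": {"time_to_market": 3, "cost": 3, "license_risk": 4},
    "COTS":      {"time_to_market": 5, "cost": 3, "license_risk": 3},
    "OSS":       {"time_to_market": 4, "cost": 5, "license_risk": 2},
}

def weighted_score(scores):
    return sum(factors[f] * scores[f] for f in factors)

best = max(origins, key=lambda o: weighted_score(origins[o]))
print(best, round(weighted_score(origins[best]), 2))  # → COTS 3.8
```

Changing the weights models a different scenario, which matches the conclusion above: depending on the scenario, a different origin comes out on top.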

  • 10.
    Bakhtyar, Shoaib
    Blekinge Institute of Technology, School of Computing.
On the Synergies Between an Electronic Waybill and Intelligent Transport Systems Services (2013). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The main purpose of this thesis is to investigate potential synergies between an electronic waybill (e-Waybill) and other Intelligent Transport System (ITS) services. An e-Waybill service, as presented in this thesis, should be able to perform the functions of a paper waybill (an important transport document containing essential information about a consignment) and should contain at least the information specified in a paper waybill. To investigate synergies between e-Waybill solutions and ITS services, we present five conceptual e-Waybill solutions that differ in where the e-Waybill information is stored, read, and written. These solutions are further investigated for functional and technical (non-functional) requirements, which can potentially impose constraints on a system implementing the e-Waybill service. A set of 20 ITS services is considered for synergy analysis in this thesis. These services are mainly for road transport; however, most of them are also relevant to other modes of transport. For the information synergy analysis, the e-Waybill solutions are assessed based on their synergies with ITS services. For each ITS service, the required input information entities are identified, and if at least one information entity can be provided by an e-Waybill at the right location, we regard it as a synergy. The results of our synergy analysis may support the choice of practical e-Waybill systems that can provide high synergy with ITS services. This may lead to higher utilization of ITS services and more sustainable transport, e.g., in terms of reduced congestion and emissions. Additionally, a service design method has been proposed for supporting the process of designing new ITS services, which primarily utilizes functional synergies with already existing ITS services. To illustrate the usage of the suggested method, we have applied it to the design of a new ITS service, the Liability Intelligent Transport System (LITS) service. The purpose of the LITS service is to support the process of identifying when, where, and by whom a consignment was damaged, and who was responsible at the time.
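The information synergy check described above amounts to a set intersection between the entities an e-Waybill can provide and the entities an ITS service needs as input. A sketch with invented entity and service names (the thesis analyses 20 real services; these three are illustrative):

```python
# An e-Waybill/ITS-service pair has an information synergy if the e-Waybill
# can supply at least one information entity the service needs.
e_waybill_entities = {"consignor", "consignee", "goods_description",
                      "weight", "delivery_address"}

its_services = {
    "route_guidance":    {"delivery_address", "vehicle_position"},
    "weight_compliance": {"weight", "axle_load"},
    "driver_alertness":  {"eye_movement", "steering_pattern"},
}

synergies = {service: sorted(needed & e_waybill_entities)
             for service, needed in its_services.items()
             if needed & e_waybill_entities}
print(synergies)
# → {'route_guidance': ['delivery_address'], 'weight_compliance': ['weight']}
```

Services with no overlap (here, driver alertness monitoring) simply gain nothing from the e-Waybill, which is what the analysis in the thesis is designed to expose per solution variant.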

  • 11. Barney, Sebastian
Perspectives on Software and their Priorities: Balancing Conflicting Stakeholder Views (2009). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The sustainable development of a software product depends on a number of groups working together to achieve a common goal. However, each of the groups interacts with the product in different ways and can have conflicting aims and objectives. For example, developers trying to correct issues in the software architecture, which will impact future releases of the product, can be stopped by a project manager who is charged with delivering a release on time and within budget. While the functional requirements of a software product are usually documented, there are a number of other investments in software development that are not always as explicitly agreed upon but are still essential to a product's long-term success. The major investment types include software product quality, a main control variable in software development, and intellectual capital (IC), the key input and tool used in software development. As management requires measurement, it is necessary to understand the priorities placed on investment options by the various groups involved in the development of a software product. The objective of this thesis is to develop a method capable of both determining the priorities of different groups and measuring the level of alignment between these groups in terms of their priorities. Evolving the method from a study into the values used to select requirements for a software release, Ericsson supported the development of a methodology to determine and compare the priorities of different groups for software product quality and IC. The method elicited the required information from a series of case studies to build up a picture of the priorities placed on the major investment options and constraints: features, quality, IC, time and cost. The results highlight strengths and areas for improvement through the identification of differing priorities and ambiguities in the management of the different aspects studied. While conducting this research, systematic biases appeared to be occurring in the selection of requirements, which added a further objective: to understand how bias impacts decision making in a requirements engineering context. This thesis provides a method that determines the priorities placed on the level of investment in different options in the development of software products. It is concluded that people involved in the development of software need to be aligned on issues of software product quality, as these priorities set expectations. The same was not found to be true for issues of IC, where groups can complete tasks without negatively impacting others as long as the organisation works effectively as a single entity. On the issue of biases in the prioritisation of these aspects, prospect theory was found to apply to requirements selection in an academic experiment, suggesting that people will prefer functionality over software product quality, and will prefer meeting the known requirements of customers over predicting general market requirements.

  • 12. Berander, Patrik
Prioritization of Stakeholder Needs in Software Engineering: Understanding and Evaluation (2004). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

In everyday life, humans confront situations where different decisions have to be made. Such decisions can be non-trivial even though they are often relatively simple, such as which bus to take or which flavor of soft drink to buy. Decisions of a more complex nature, where more is at stake, tend to be much harder. It is often possible to deal with such decisions by prioritizing the different alternatives to find the most suitable one. In software engineering, decision-makers are often confronted with situations where complex decisions have to be made and where the concept of prioritization can be utilized. Traditionally in software engineering, discussions about prioritization have focused on the software product. However, when defining or improving software processes, complex decisions also have to be made. In fact, software products and software processes have many characteristics in common, which invites the thought of using prioritization when developing and evolving software processes as well. The results presented in this thesis indicate that it is possible to share results and knowledge regarding prioritization between the two areas. In this thesis, the prioritization of software products is investigated in detail, and a number of studies where prioritizations are performed in both process and product settings are presented. It is shown that prioritization techniques commonly used in product development can also be used when prioritizing improvement issues in a software company. It is also shown that priorities between stakeholders of a software process sometimes differ, just as they do when developing software products. The thesis also presents an experiment in which different prioritization techniques are evaluated with regard to ease of use, time consumption, and accuracy. Finally, an investigation of the suitability of students as subjects when evaluating prioritization techniques is presented.
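A common prioritization technique of the kind evaluated in such experiments is cumulative voting, often called the $100 test: each stakeholder distributes 100 points across the candidate items, and the totals yield a ranking. The items, stakeholders and votes below are invented for the example:

```python
# The $100 test (cumulative voting): each stakeholder spreads 100 points
# over the items; summed points give a combined priority ranking.
items = ["usability", "performance", "security", "new reports"]

votes = {
    "developer": [10, 40, 40, 10],
    "manager":   [30, 10, 20, 40],
    "support":   [50, 20, 20, 10],
}

totals = [sum(v[i] for v in votes.values()) for i in range(len(items))]
ranking = sorted(zip(items, totals), key=lambda pair: -pair[1])
print(ranking)
# → [('usability', 90), ('security', 80), ('performance', 70), ('new reports', 60)]
```

Comparing each stakeholder's own vector against the combined ranking is one simple way to expose the differing priorities between groups that the abstract describes.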

  • 13.
    bin Ali, Nauman
    Blekinge Institute of Technology, School of Computing.
Towards Guidelines for Conducting Software Process Simulation in Industry (2013). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background: Since the 1950s explicit software process models have been used for planning, executing and controlling software development activities. To overcome the limitation of static models at capturing the inherent dynamism in software development, Software Process Simulation Modelling (SPSM) was introduced in the late 1970s. SPSM has been used to address various challenges, e.g. estimation, planning and process assessment. The simulation models developed over the years have varied in their scope, purpose, approach and the application domain. However, there is a need to aggregate the evidence regarding the usefulness of SPSM for achieving its intended purposes. Objective: This thesis aims to facilitate adoption of SPSM in industrial practice by exploring two directions. Firstly it aims to establish the usefulness of SPSM for its intended purposes, e.g. for planning, training and as an alternative to study the real world software (industrial and open source) development. Secondly to define and evaluate a process for conducting SPSM studies in industry. Method: Two systematic literature reviews (SLR), a literature review, a case study and an action research study were conducted. A literature review of existing SLRs was done to identify the strategies for selecting studies. The resulting process for study selection was utilized in an SLR to capture and aggregate evidence regarding the usefulness of SPSM. Another SLR was used to identify existing process descriptions of how to conduct an SPSM study. The consolidated process and associated guidelines identified in this review were used in an action research study to develop a simulation model of the testing process in a large telecommunication vendor. The action research was preceded by a case study to understand the testing process at the company. Results: A study selection process based on the strategies identified from literature was proposed. 
It was found to systematize selection and to support inclusiveness with reasonable additional effort in an SLR of the SPSM literature. The SPSM studies identified in the literature scored poorly on rigor and relevance criteria and lacked evaluation of SPSM for its intended purposes. Lastly, based on the literature, a six-step process for conducting an SPSM study was used to develop a System Dynamics model of the testing process for training purposes at the company. Conclusion: The findings identify two potential directions for facilitating SPSM adoption. First, learning from other disciplines that have practised simulation for longer; it was evident how similar the consolidated process for conducting an SPSM study was to the process used in simulation in general. Second, the existing work on SPSM can at best be classified as a strong "proof of concept" that SPSM can be useful in real-world software development. Thus, there is a need to evaluate and report the usefulness of SPSM for its intended purposes with scientific rigor.

  • 14. Bjarnadóttír, Hólmfríður
    SEA in the Context of Land-Use Planning: The application of the EU directive 2001/42/EC to Sweden, Iceland and England2008Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The thesis addresses the introduction of a supra-national instrument, a European directive on Strategic Environmental Assessment (SEA), into the national land-use planning contexts of three countries: Sweden, Iceland and England. The directive "On the assessment of the effects of certain plans and programmes on the environment" was agreed upon by the European Commission on 21 June 2001 and was to be transposed into national legislation by 21 June 2004. The introduction of these requirements meant that the countries needed to make legal adjustments and implement the directive at the different levels of planning. Many EU member countries, including those studied in the thesis, had some experience of environmental assessment of plans and programmes prior to the introduction of the SEA directive. SEA has featured as a concept and a tool in planning in the national and international debate on Environmental Assessment and planning for the last two decades. Hence, the SEA directive was introduced into an existing context of environmental assessment in planning, and the preparation of the directive drew on substantial conceptual development and practical experience of strategic environmental assessment in various forms. The aim of this research is to shed light on the transposition of the SEA directive into national legal frameworks and on how the introduction relates to the countries' planning contexts and previous application of SEA-like instruments. The thesis gives an overview of the way the directive has been transposed into the national legal systems of the three countries and describes the existing planning frameworks. The results from the national reviews are analysed in relation to the contents of the directive and the international and Nordic academic debate regarding the purpose and role of SEA, related to the characteristics of the planning systems. 
The research shows differences in the legal and planning contexts into which the SEA requirements have been introduced in the three countries. Despite these differences, the legal requirements follow the contents of the directive closely. However, the expectations towards the directive expressed by national officials and politicians, the recommendations on how the legal SEA requirements shall be implemented, and the references to other processes (land-use planning and the practices of Environmental Impact Assessment and Sustainability Appraisal) differ between the countries. The thesis is the result of a project within the interdisciplinary research programme MiSt, "Tools for environmental assessment in strategic decision making", at BTH, funded by the Swedish Environmental Protection Agency. The project has been carried out at Nordregio, the Nordic Centre for Spatial Development, Stockholm.

  • 15. Boldt, Martin
    Privacy-Invasive Software: Exploring Effects and Countermeasures2007Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    As computers become increasingly integrated into our daily lives, we need aiding mechanisms for separating legitimate software from its unwanted counterparts. We use the term Privacy-Invasive Software (PIS) to refer to such illegitimate software, sometimes loosely labelled as spyware. In this thesis, we include an introduction to PIS and how it differs from both legitimate and traditionally malicious software. We also present empirical measurements indicating the effects that PIS has on infected computers and networks. An important contribution of this work is a classification of PIS that targets both the level of user consent and the degree of user consequences associated with PIS. These consequences, affecting both users and their computers, form a global problem that deteriorates a vast number of users' computer experiences today. As a way to hinder, or at least mitigate, this development, we argue for more user-oriented countermeasures that focus on informing users about the behaviour and consequences associated with using a particular piece of software. In addition to current reactive countermeasures, we also need preventive tools that deal with the threat of PIS before it enters users' computers. Collaborative reputation systems present an interesting way forward towards such preventive and user-oriented countermeasures against PIS. Moving software reputations from old channels (such as computer magazines or friends' recommendations) into an instantly accessible reputation system would help users distinguish unwanted software from legitimate software. It is important that such a reputation system is designed to address antagonistic intentions from both individual users and groups thereof, so that users can depend on the reputations. This would allow users to reach more informed decisions by taking the reported consequences into account when deciding whether they want a specific piece of software to enter their computer or not.

  • 16.
    Borg, Anton
    Blekinge Institute of Technology, School of Computing.
    Decision Support for Estimation of the Utility of Software and E-mail2012Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background: Computer users often need to distinguish between good and bad instances of software and e-mail messages without the aid of experts. This decision process is further complicated as the perception of spam and spyware varies between individuals. As a consequence, users can benefit from a decision support system when making informed decisions about whether an instance is good or bad. Objective: This thesis investigates approaches for estimating the utility of e-mail and software. These approaches can be used in a personalized decision support system. The research investigates the performance and accuracy of the approaches. Method: The scope of the research is limited to the legal grey zone of software and e-mail messages. Experimental data have been collected from academia and industry. The research methods used in this thesis are simulation and experimentation. The processing of user input, including malicious user input, in a reputation system for software was investigated using simulations. The preprocessing optimization of end user license agreement classification was investigated using experimentation, as was the impact of social interaction data on personalized e-mail classification. Results: Three approaches were investigated that could be adapted for a decision support system. The results for the investigated reputation system suggest that the system is capable, on average, of producing a rating within ±1 of an object's correct rating. The results of the preprocessing optimization of end user license agreement classification suggested negligible impact. The results of using social interaction information in e-mail classification suggested that accurate spam detectors can be generated from the low-dimensional social data model alone; however, spam detectors generated from combinations of the traditional and social models were more accurate. 
Conclusions: The results of the presented approaches suggest that it is possible to provide decision support for detecting software that might be of low utility to users. Labeling instances of software and e-mail messages that lie in a legal grey zone can help users avoid instances of low utility, e.g. spam and spyware. A limitation of the approaches is that isolated implementations will yield unsatisfactory results in a real-world setting. A combination of the approaches, e.g. to determine the utility of software, could yield improved results.

  • 17.
    Borén, Sven
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Sustainable Personal Road Transport: The Role of Electric Vehicles2016Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Electric vehicles can play an important role in a future sustainable road transport system and many Swedish politicians would like to see them implemented faster. This is likely desirable to reach the target of a fossil independent vehicle fleet in Sweden by 2030 and a greenhouse gas neutral Swedish society no later than 2050. However, to reach both these targets, and certainly to support the full scope of sustainability, it is important to consider the whole life-cycle of the vehicles and also the interaction between the transport sector and other sectors. So far, there are no plans for transitions towards a sustainable transport system applying a sufficiently wide systems perspective, in Sweden or elsewhere. This implies a great risk for sub-optimizations.

    The overall aim of this work is to elaborate methodological support for development of sustainable personal road transport systems that is informed by a strategic sustainable development perspective.

    The Framework for Strategic Sustainable Development (FSSD) is used as a foundation for the work to ensure a sufficiently wide systems perspective and coordinated collaboration across disciplines and sectors, both in the research and its application. Maxwell's Qualitative Research Design and the Design Research Methodology are used as overall guides for the research approach. Specific research methods and techniques include literature studies, action research seminars, interviews, and measurements of energy use, costs, and noise. Moreover, a case study on the conditions for a breakthrough for electric vehicles in southeast Sweden has been used as a test and development platform.

    Specific results include a preliminary vision for electric vehicles in southeast Sweden, framed by the principled sustainability definition of the FSSD, an assessment of the current reality in relation to that vision, and proposed solutions to bridge the gap, organized into a preliminary roadmap. The studies show that electric vehicles have several sustainability advantages even when their whole life-cycle is considered, provided that they are charged with electricity from new renewable sources. Electric vehicles also imply a low total cost of ownership and could promote new local ‘green jobs’ under certain conditions. Particularly promising results are seen for electric buses in public transport. As a general result, partly based on the experiences from the specific case, a generic community planning process model is proposed and its usefulness for sustainable transport system development is discussed.

    The strategic sustainable development perspective of this thesis broadens the analysis beyond the more common focus on climate change issues and reduces the risk of sub-optimizations in community and transport system development. The generic support for multi-stakeholder collaboration could potentially also promote a more participatory democratic approach to community development, grounded in a scientific foundation. Future research will explore specific decision support systems for sustainable transport development based on the generic planning process model.

  • 18. Brandt, Patrik
    et al.
    Wennberg, Louise
    Informatisk forskning om riskanalysprocess applicerad på Apoteket AB:s kundcenterverksamhet2004Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    This licentiate thesis describes a study of how the introduction of customer-centre operations within Apoteket AB affects the execution of, and the work with, the organization's risk analysis. The two customer centres introduced into the operation today function as central nodes, to which calls are routed that previously were usually answered at one of the country's 900 pharmacies. Furthermore, the customer centres can offer additional communication channels such as fax, e-mail and the Internet via Apoteket's website, with the aim of increasing availability for customers and other stakeholders and of meeting the ways in which information is sought in today's information society. Integration with Apoteket's other systems offers short response times and gives customers the possibility to handle larger parts of their errands themselves. What is easily overlooked, however, is how information security is affected by the introduction of customer-centre operations and the accompanying integration, and which threats and risks this entails for the organization and, indirectly, also for the customers. A valuable tool used in information security work to try to foresee and capture as large a share of the threats and risks as possible is the risk analysis. Today, there is also an increased awareness among the public of risks and threats related to, among other things, the Internet, in both private and corporate/organizational use. This has meant that companies and organizations have realized the importance of conducting active information security work in order to deliver an appropriate level of security. It is of great importance that customers can feel the same trust in the company's or organization's brand as before, regardless of which organizational or technical changes have taken place internally. An important part of the process of achieving this is the risk analysis and the results it produces. In this study, we have highlighted the need to adapt the risk analysis used in the organization to the newly added customer-centre operations. 
This need is substantial, especially since the communication channels are of different natures, which means that their respective threat profiles differ. The broad spectrum of threat profiles must therefore be addressed in the risk analysis and in the work with it. Developments in the surrounding environment should also be reflected in the risk analysis, which consequently needs to be dynamic. Great importance is attached in this study to placing the risk analysis in a holistic context.

  • 19. Bratt, Cecilia
    Assessment of Eco-Labelling and Green Procurement from a Strategic Sustainability Perspective2011Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Efforts to reduce negative impacts from consumption and production include voluntary market-based initiatives, for example the concepts of eco-labelling and green procurement. These have emerged as policy instruments with great potential to steer product innovation and purchasing decisions in a sustainable direction. This potential has been recognized by the United Nations, the European Union, the Organisation for Economic Co-operation and Development and national governments through, e.g., various programmes and schemes. The aim of this thesis is to assess current criteria development processes within eco-labelling and green procurement from a strategic sustainability perspective, and to describe possible improvements that could make these instruments more supportive of sustainable product and service innovation. A previously published framework for strategic sustainable development, including a definition of sustainability and generic guidelines to inform strategies towards sustainability, is adapted and used for this purpose. Criteria development processes in two Swedish eco-labelling programmes and at a governmental expert body for green procurement are studied. This includes interviews with criteria developers, studies of process documents, and a case study at the governmental expert body for green procurement in which two criteria development processes were shadowed. The results reveal several strengths, but also gaps and thus potential for improvement. The criteria development processes and the resulting criteria mostly concern the current market supply and a selection of current environmental impacts outside the context of long-term objectives. Neither sustainability nor any other clearly defined long-term objective is agreed upon, and the criteria are not structured to support procurers, suppliers and product developers in a systematic and strategic stepwise approach towards sustainability. 
Recommended improvements include a more thorough sustainability assessment, communication of clearer objectives, broader competence in the criteria development groups and more emphasis on the dialogue and interaction between key actors. This includes an extended view on both the product concept and actors involved. Based on this, a new criteria development prototype is suggested, which aims at widening the scope from some currently known product impacts to the remaining gap to sustainability. During its further development and implementation, the criteria development prototype will be tested in successive iterations of action research together with experienced practitioners within eco-labelling and green procurement.

  • 20.
    Britto, Ricardo
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Knowledge Classification for Supporting Effort Estimation in Global Software Engineering Projects2015Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background: Global Software Engineering (GSE) has become a widely applied operational model for the development of software systems; it can increase profits and decrease time-to-market. However, there are many challenges associated with the development of software in a globally distributed fashion. There is evidence that these challenges affect many processes related to software development, such as effort estimation. To the best of our knowledge, there are no empirical studies gathering evidence on effort estimation in the GSE context. In addition, there is no common terminology for classifying GSE scenarios with a focus on effort estimation.

    Objective: The main objective of this thesis is to support effort estimation in the GSE context by providing a taxonomy to classify the existing knowledge in this field.

    Method: Systematic literature review (to identify and analyze the state of the art), survey (to identify and analyze the state of the practice), systematic mapping (to identify practices to design software engineering taxonomies), and literature survey (to complement the states of the art and practice) were the methods employed in this thesis.

    Results: The results on the states of the art and practice show that the effort estimation techniques employed in the GSE context are the same techniques used in the collocated context. It was also identified that global aspects, e.g. temporal, geographical and socio-cultural distances, are accounted for as cost drivers, although it is not clear how they are measured. As a result of the conducted mapping study, we reported a method that can be used to design new SE taxonomies. The aforementioned results were combined to extend and specialize an existing GSE taxonomy to make it suitable for effort estimation. The usage of the specialized GSE effort estimation taxonomy was illustrated by classifying eight finished GSE projects. The results show that the specialized taxonomy proposed in this thesis is comprehensive enough to classify GSE projects with a focus on effort estimation.

    Conclusions: The taxonomy presented in this thesis will help researchers and practitioners to report new research on effort estimation in the GSE context; researchers and practitioners will be able to gather evidence, compare new studies and find new gaps more easily. The findings from this thesis show that more research must be conducted on effort estimation in the GSE context. For example, the way the cost drivers are measured should be further investigated. It is also necessary to conduct further research to clarify the role and impact of sourcing strategies on the accuracy of effort estimates. Finally, we believe that it is possible to design an instrument, based on the specialized GSE effort estimation taxonomy, that helps practitioners to perform the effort estimation process in a way tailored to the specific needs of the GSE context.

  • 21. Carlsson, Patrik
    Multi-Timescale Modelling of Ethernet Traffic2003Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Ethernet is one of the most common link layer technologies, used in local area networks, wireless networks and wide area networks. There is, however, a lack of traffic models for Ethernet that are usable in performance analysis. In this thesis we describe an Ethernet traffic model that aims at matching multiple moments of the bit rate at several timescales. To match the model parameters to measured traffic, four methods have been developed and tested on real traffic traces. Once a model has been created, it can be used directly in a fluid-flow performance analysis. Our results show that, as the number of sources present on an Ethernet link grows, the model becomes better and less complex.
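
    The moment-matching idea at the core of the abstract can be illustrated with a small sketch (illustrative only; the function, the synthetic trace and all numbers below are assumptions, not taken from the thesis): it bins a packet trace at several aggregation timescales and computes the first two moments of the resulting bit-rate process, which is the kind of statistic such a model would be fitted to.

    ```python
    import random

    def bitrate_moments(packets, timescale, duration):
        """Aggregate (timestamp, bits) packets into bins of width `timescale`
        (seconds) over `duration` seconds, and return the mean and variance
        of the resulting per-bin bit rate."""
        nbins = int(duration / timescale)
        bins = [0.0] * nbins
        for t, bits in packets:
            idx = min(int(t / timescale), nbins - 1)  # clamp boundary packets
            bins[idx] += bits
        rates = [b / timescale for b in bins]  # bit/s in each bin
        mean = sum(rates) / nbins
        var = sum((r - mean) ** 2 for r in rates) / nbins
        return mean, var

    # Synthetic trace: 10 s of 1500-byte packets at random arrival times.
    random.seed(1)
    trace = [(random.uniform(0, 10), 1500 * 8) for _ in range(5000)]

    # A multi-timescale model would match moments like these simultaneously.
    for ts in (0.01, 0.1, 1.0):
        mean, var = bitrate_moments(trace, ts, 10.0)
        print(f"timescale {ts:>5} s: mean {mean:.0f} bit/s, variance {var:.3g}")
    ```

    Note that the mean rate is the same at every timescale, while the variance shrinks as the aggregation window grows; matching that decay across timescales is what distinguishes a multi-timescale model from a single-moment one.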

  • 22. Chen, Jiandan
    A Multi Sensor System for a Human Activities Space: Aspects of Planning and Quality Measurement2008Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    In our aging society, the design and implementation of a high-performance autonomous distributed vision information system for autonomous physical services becomes ever more important. In line with this development, the proposed Intelligent Vision Agent System, IVAS, is able to automatically detect and identify a target for a specific task by surveying a human activities space. The main subject of this thesis is the optimal configuration of a sensor system meant to capture the target objects and their environment within certain required specifications. The thesis thus discusses how a discrete sensor causes a spatial depth quantisation uncertainty, which significantly affects the 3D depth reconstruction accuracy. For a stereo sensor pair, the quantisation uncertainty is represented by the intervals between the iso-disparity surfaces. A mathematical geometry model is then proposed to analyse the iso-disparity surfaces and optimise the sensors’ configurations according to the required constraints. The thesis also introduces a dithering algorithm which significantly reduces the depth reconstruction uncertainty. This algorithm assures high depth reconstruction accuracy from a few images captured by low-resolution sensors. To ensure the visibility needed for surveillance, tracking, and 3D reconstruction, the thesis introduces constraints on the target space, the stereo pair characteristics, and the depth reconstruction accuracy. The target space, the space in which human activity takes place, is modelled as a tetrahedron, and a field of view in spherical coordinates is proposed. The minimum number of stereo pairs necessary to cover the entire target space, and the arrangement of the stereo pairs’ movement, is optimised through integer linear programming. In order to better understand human behaviour and perception, the proposed adaptive measurement method makes use of a fuzzily defined variable, FDV. 
The FDV approach enables an estimation of a quality index based on qualitative and quantitative factors. The suggested method uses a neural network as a tool that contains a learning function that allows the integration of the human factor into a quantitative quality index. The thesis consists of two parts, where Part I gives a brief overview of the applied theory and research methods used, and Part II contains the five papers included in the thesis.
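
    For a rectified stereo pair under the pinhole model, iso-disparity surfaces sit at depths z = f·B/d, and the gap between surfaces for consecutive integer disparities is exactly the depth quantisation uncertainty the abstract refers to. A minimal sketch (the focal length and baseline are illustrative assumptions, not the thesis's setup) shows how that gap widens as disparity shrinks, i.e. with distance:

    ```python
    def iso_disparity_depths(focal_px, baseline_m, d_max):
        """Depths of the iso-disparity surfaces z_d = f*B/d for integer
        disparities d = 1..d_max (rectified stereo, pinhole model)."""
        return {d: focal_px * baseline_m / d for d in range(1, d_max + 1)}

    def quantisation_interval(focal_px, baseline_m, d):
        """Width of the depth interval between disparities d and d+1 --
        the depth uncertainty caused by one-pixel disparity quantisation."""
        z = lambda k: focal_px * baseline_m / k
        return z(d) - z(d + 1)

    # Illustrative numbers: 800 px focal length, 0.12 m baseline.
    f, B = 800.0, 0.12
    for d in (2, 8, 32):
        print(f"disparity {d:>2}: depth {f * B / d:6.2f} m, "
              f"interval {quantisation_interval(f, B, d):.4f} m")
    ```

    The interval grows roughly quadratically with depth, which is why sensor placement and techniques such as dithering matter most for distant targets.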

  • 23. Chevul, Stefan
    On Application-Perceived Quality of Service in Wireless Networks2006Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Wireless and mobile Internet have changed the way people and businesses operate. Communication from any Internet access point, including wireless networks such as UMTS, GPRS or WLAN, has enabled organizations to have a mobile workforce. However, networked applications such as web, e-mail and streaming multimedia rely upon the ability of timely data delivery. The achievable throughput is a quality measure for the very task of a communication system, which is to transport data in time. Throughput is thus one of the most essential enablers for networked applications. While throughput is generally defined at the network or transport level, the application-perceived throughput reflects the Quality of Service from the viewpoints of the application and the user. The focus of the thesis is on the influence of the network on the application-perceived Quality of Service and thus on the user-perceived experience. An analysis of application-based active measurements mimicking the needs of streaming applications is presented. The results reveal a clear influence of the network on the application-perceived Quality of Service, seen from variations of application-perceived throughput on small time scales. The results also indicate that applications have to cope with considerably large jitter when trying to use the nominal throughputs. It was observed that the GPRS network had considerable problems in delivering packets in the downstream direction even when the nominal capacity of the link was not reached. Finally, the thesis discusses the suitability of wireless networks for different mobile services, since the influence of the network on the application-perceived Quality of Service is of great significance when it comes to customer satisfaction. Therefore, application-perceived Quality of Service in wireless networks must also be considered by the mobile application programmer during application development.

  • 24.
    Claesson, Lena
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Remote Electronic and Acoustic Laboratories in Upper Secondary Schools2014Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    During a substantial part of their time, young people of today actually live in a virtual world. The medial evolution has also influenced education, and today much research work concerns the transfer of the physical world into the virtual one. One example is laboratories in physical science that are available in virtual rooms. They enable students to sit at home in front of a computer and, on screen, watch and operate the physical equipment in the laboratory at school. It is generally agreed that laboratory lessons are necessary in subjects such as physics, chemistry and biology. Physical experiments provide a great way for students to learn more about nature and its possibilities as well as its limitations. Experimental work can be provided by laboratories in three different categories: 1) hands-on, 2) remote and 3) simulated. This thesis concerns the usage of remotely controlled laboratories in physics education at an upper secondary school. It is based on work carried out in a joint project between Katedralskolan (an upper secondary school in Lund, Sweden) and Blekinge Institute of Technology (BTH). The objective of this project is to investigate the feasibility of using the VISIR (Virtual Instruments System in Reality) technology for remotely controlled laboratories, developed at BTH, in upper secondary schools. This thesis consists of an introduction followed by three parts, where the first part concerns the introduction of the remote lab to students and its usage by students at the upper secondary school, Katedralskolan. Both first-year and third-year students carried out experiments using the remote lab. The second part concerns activities carried out by two teachers and 94 students using the remote laboratory VISIR. An integration of VISIR with the learning management system used at the school is described. 
Teaching activities carried out by teachers at Katedralskolan involving the VISIR lab are discussed, e.g., an exam including problems of experimental work using the VISIR lab and an example of a student report. Survey results on student satisfaction with the VISIR lab at BTH and the perception of it are presented, indicating that VISIR is a good learning tool. Furthermore, the survey resulted in a proposal for improvements to the VISIR lab user interface. Finally, the third part focuses on enhancements of the VISIR lab at BTH. An improved version of the VISIR user interface is presented, as well as new iPad and smartphone access to the VISIR lab. Electronic experiments for upper secondary school students are described in detail and examples of suitable configurations are given. A new VISIR acoustic lab has been implemented and initial experimentation by upper secondary school students has been carried out. The outcomes from these experiments are discussed.

  • 25.
    Constantinescu, Doru
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Measurements and Models of One-Way Transit Time in IP Routers2005Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The main goals of this thesis are towards an understanding of the delay process in best-effort Internet for both non-congested and congested networks. A novel measurement system is reported for delay measurements in IP routers, which follows the specifications of IETF RFC 2679. The system employs both passive measurements and active probing and offers the possibility to measure and analyze different delay components of a router, e.g., packet processing delay, packet transmission time and queueing delay at the output link. Dedicated application-layer software is used to generate UDP traffic with TCP-like characteristics. Pareto traffic models are used to generate self-similar traffic in the link. The reported results are in the form of several important statistics regarding processing and queueing delays of a router, router delay for a single data flow, router delay for multiple data flows, as well as end-to-end delay for a chain of routers. They confirm earlier reports that the delay in IP routers is generally influenced by traffic characteristics, link conditions and, to some extent, details in hardware implementation and different IOS releases. The delay in IP routers may also occasionally show extreme values, which are due to improper functioning of the routers. Furthermore, new results indicate that the delay in IP routers shows heavy-tailed characteristics, which can be well modeled with the help of several distributions, either in the form of a single distribution or as a mixture of distributions. There are several components contributing to the one-way transit time (OWTT) in routers, i.e., processing delay, queueing delay and service time. The obtained results have shown that, e.g., the processing delay in a router can be well modeled with the Normal distribution, and the queueing delay is well modeled with a mixture of a Normal distribution for the body probability mass and a Weibull distribution for the tail probability mass. 
Furthermore, OWTT has several component delays and it has been observed that the component delay distribution that is most dominant and heavy-tailed has a decisive influence on OWTT.
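The body/tail mixture described above can be illustrated with a small split-and-fit on synthetic data. This is only a sketch of the general technique, not the thesis's measurements or its actual estimation procedure; all parameter values and the 90th-percentile split point are invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic router delays (microseconds): a Normal "body" plus a
# heavier Weibull "tail" -- illustrative data, not real measurements.
body = rng.normal(loc=50.0, scale=5.0, size=9000)
tail = 60.0 + stats.weibull_min.rvs(c=0.7, scale=40.0, size=1000, random_state=0)
delays = np.concatenate([body, tail])

# Split body/tail at an empirical quantile and fit each part separately:
# Normal for the body probability mass, Weibull for the tail mass.
threshold = np.quantile(delays, 0.9)
mu, sigma = stats.norm.fit(delays[delays <= threshold])
c, loc, scale = stats.weibull_min.fit(delays[delays > threshold], floc=threshold)

print(f"body: Normal(mu={mu:.1f}, sigma={sigma:.1f})")
print(f"tail: Weibull(shape={c:.2f}, scale={scale:.1f}) above {threshold:.1f} us")
```

In practice the split point itself would be estimated (e.g. by tail-index diagnostics) rather than fixed at a quantile.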

  • 26. Cornelius, Per
    Subband Beamforming for Speech Enhancement within a Motorcycle helmet2005Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The increased mobility in society has led to a need for convenient mobile communication in many different types of environments. Environments such as a motorcycle helmet, engine rooms and most industrial sites share a common challenge in that they often exhibit significant acoustic background noise. Noise reduces speech intelligibility and consequently limits the potential of mobile speech communications. Existing single-channel solutions for speech enhancement may perform adequately when the level of noise is moderate. When the noise level becomes significant, additional use of the spatial domain is a potential solution for successful speech enhancement. This is achieved by including several microphones in an array placed in the vicinity of the person speaking. A beamforming algorithm is then used to combine the microphone signals such that the desired speech signal is enhanced. The interest in using microphone arrays for broadband speech and audio processing has increased in recent years. A considerable number of interesting applications have been published using beamforming techniques for hands-free voice communication in cars, hearing aids, teleconferencing and multimedia applications. Most of the proposed solutions deal exclusively with environments where the noise is moderate. This thesis is a study of noise reduction in a helmet communication system on a moving motorcycle. The environment is analyzed under different driving conditions and a speech enhancement solution is proposed that operates successfully in all driving conditions. The motorcycle environment can exhibit extremely high noise levels when driving at high speed, while it can produce low noise levels at moderate speeds. This fact implies that different solutions are required. It is demonstrated in the thesis that a cascaded combination of a calibrated subband beamforming technique together with a single-channel solution provides good results at all noise levels.
The proposed solution operates in the frequency domain, where all microphone signals are decomposed by a subband filter bank prior to the speech enhancement processing. Since the subband transformation is an important component of the overall system performance, a method for filter bank design is also provided in the thesis. The design is such that the aliasing effects in the transformations are minimized while a small delay of the total system is maintained.
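The subband structure can be sketched as follows: decompose each microphone signal with a filter bank, then combine the channels per subband. This uses a plain windowed-DFT filter bank and a two-microphone delay-and-sum combiner on synthetic signals; it is an illustration of the general idea only, not the thesis's calibrated, low-aliasing filter bank design, and every signal and parameter here is invented.

```python
import numpy as np

def stft_subbands(x, n_fft=256, hop=128):
    """Decompose a signal into subbands with a uniform windowed-DFT
    filter bank (a simple stand-in for the thesis's optimized design)."""
    win = np.hanning(n_fft)
    frames = [x[i:i + n_fft] * win
              for i in range(0, len(x) - n_fft + 1, hop)]
    return np.fft.rfft(np.array(frames), axis=1)  # shape: (frames, subbands)

# Two synthetic microphone signals: a tone plus independent noise,
# with a 2-sample propagation lag on the second microphone.
fs, n = 8000, 4096
rng = np.random.default_rng(1)
speech = np.sin(2 * np.pi * 440 * np.arange(n) / fs)
mic1 = speech + 0.5 * rng.standard_normal(n)
mic2 = np.roll(speech, 2) + 0.5 * rng.standard_normal(n)

X1, X2 = stft_subbands(mic1), stft_subbands(mic2)
freqs = np.fft.rfftfreq(256, d=1 / fs)
steer = np.exp(2j * np.pi * freqs * 2 / fs)  # compensate the 2-sample lag
Y = 0.5 * (X1 + steer * X2)                  # per-subband delay-and-sum

print(Y.shape)
```

Averaging two channels with aligned speech keeps the speech power while roughly halving the uncorrelated noise power; a real system would add the calibration and the cascaded single-channel post-filter the abstract mentions.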

  • 27. Damm, Lars-Ola
    Monitoring and Implementing Early and Cost-Effective Software Fault Detection2005Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Avoidable rework constitutes a large part of development projects, i.e. 20-80 percent depending on the maturity of the organization and the complexity of the products. High amounts of avoidable rework commonly occur when many faults are left to correct in late stages of a project. In fact, research studies indicate that the cost of rework could be decreased by up to 30-50 percent by finding more faults earlier. However, since larger software systems have an almost infinite number of usage scenarios, trying to find most faults early through, for example, formal specifications and extensive inspections is very time-consuming. Therefore, such an approach is not cost-effective in products that do not have extremely high quality requirements. For example, in market-driven development, time-to-market is at least as important as quality. Further, some areas, such as hardware-dependent aspects of a product, might not be possible to verify early through, for example, code reviews or unit tests. Therefore, in such environments, rework reduction is primarily about finding faults earlier to the extent it is cost-effective, i.e. finding the right faults in the right phase. Through a set of case studies at a department at Ericsson AB, this thesis investigates how to achieve early and cost-effective fault detection through improvements in the test process. The case studies include investigations on how to identify which improvements are most beneficial to implement, possible solutions to the identified improvement areas, and approaches for how to follow up implemented improvements. The contributions of the thesis include a framework for component-level test automation and test-driven development. Additionally, the thesis provides methods for how to use fault statistics for identifying and monitoring test process improvements.
In particular, we present results from applying methods that quantify unnecessary fault costs and pinpoint which phases and activities to focus improvements on in order to achieve earlier and more cost-effective fault detection. The goal of the methods is to make organizations strive towards finding the right fault in the right test phase, which commonly is an early test phase. The developed methods were also used for evaluating the results of implementing the above-mentioned test framework at Ericsson AB. Finally, the thesis demonstrates how the implementation of such improvements can be continuously monitored to obtain rapid feedback on the status of defined goals. This was achieved through enhancements of previously applied fault analysis methods.

  • 28.
    Dasari, Siva Krishna
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. Blekinge Institute of Technology.
    Tree Models for Design Space Exploration in Aerospace Engineering2019Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    A crucial issue in the design of aircraft components is the evaluation of a large number of potential design alternatives. This evaluation involves expensive procedures and consequently slows down the search for optimal design samples. As a result, a scarce or small number of design samples, combined with a high-dimensional parameter space and high non-linearity, poses issues for learning surrogate models. Furthermore, surrogate models have more issues in handling qualitative (discrete) data than in handling quantitative (continuous) data. These issues bring the need to investigate surrogate modelling methods for the most effective use of available data. 

     The goal of the thesis is to support engineers in the early design phase of the development of new aircraft engines, specifically of an engine component known as the Turbine Rear Structure (TRS). For this, tree-based approaches are explored for surrogate modelling, with the purpose of exploring larger search spaces and speeding up the evaluation of design alternatives. First, we have investigated the performance of tree models on the design concepts of the TRS. Second, we have presented an approach to explore the design space using a tree model, random forests. This approach includes hyperparameter tuning and the extraction of parameter importance and if-then rules from surrogate models for a better understanding of the design problem. With this approach, we have shown that the performance of tree models improves with hyperparameter tuning when using design concept data of the TRS. Third, we performed a sensitivity analysis to study thermal variations on the TRS and hence support robust design using tree models. Furthermore, the performance of tree models has been evaluated on mathematical linear and non-linear functions. The results of this study have shown that tree models fit non-linear functions well. Last, we have shown how tree models support the integration of value and sustainability parameter data (quantitative and qualitative) together with TRS design concept data in order to assess these parameters' impact on the product life cycle in the early design phase.
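The surrogate-modelling workflow described above (fit a random forest, tune its hyperparameters, extract parameter importance) can be sketched with scikit-learn. The synthetic nonlinear function below stands in for expensive design evaluations; it is not the TRS data, and the parameter grid is an invented example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for expensive design evaluations: a nonlinear
# response of three design parameters (x2 deliberately has little effect).
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = np.sin(4 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * X[:, 2]

# Hyperparameter tuning via cross-validated grid search.
grid = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 5]},
    cv=5,
)
grid.fit(X, y)
surrogate = grid.best_estimator_

# Parameter importance: which design variables drive the response.
print(dict(zip(["x0", "x1", "x2"], surrogate.feature_importances_)))
```

The cheap fitted surrogate can then be queried thousands of times to explore the design space, and if-then rules can be read off individual trees for interpretability.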

     

  • 29.
    de Petris, Linus
    Blekinge Institute of Technology, School of Planning and Media Design.
    Om glappen vi skapar och de märken som blir – materiell-diskursiva praktiker i kommunkontext2013Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The starting point for my research is a perceived gap in IT and e-government development. Policies and strategies contain far-reaching visions of how IT can and will change everything from our everyday work and private lives to societal structures and power relations. To meet this, large investments are made every year in municipal IT. Yet the experience is that "nothing happens". The purpose of the research is to present understandings of abstract and complex things such as e-government, innovation (change work), design and infrastructure. The research presents different perspectives on these things and challenges normative understandings of them. One example is how infrastructure, normatively understood as abstract and passive, also turns out to be tangibly concrete and performative. The aim is that these understandings should then form the basis for practising an IT and e-government development that does not feel "gapping" or stagnant. The research presented and tested in this thesis is situated in a municipal context and builds on participatory action research. Participatory action research means that I have tried to practise research reflexively in order to create knowledge and change together with others, directly in the contexts in which I have worked. The action research means that I do not research about, but with, various municipal practices. This type of research is not tied to a specific method but needs to be adapted and developed in its context. This includes a recurring challenging of existing organizational practice and taken-for-granted assumptions, not least my own. The licentiate thesis, which is "a step along the way", is more about understandings than explanations. The thesis is written as a form of compilation thesis, based on four articles and a partly free-standing introductory essay. My writing, and thereby the text that constitutes the thesis, is "essayistic". 
The essay form has made it possible for me to use the writing of the thesis as a continued testing of the different perspectives that the articles present.

  • 30. Diestelkamp, Wolfgang
    On Design Methodology for Flexible Systems2002Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    In software architecture, two seemingly contradicting goals are to be considered when designing systems, namely maintainability and performance. It seems to be commonly accepted that systems can be designed either for flexibility, and thus be easily maintained, or for performance, but generally not both at the same time. In this thesis we describe flexibility and the resulting maintainability from different aspects and show how it can eventually be achieved together with acceptable performance.

  • 31. Duong, Quang Trung
    On Cooperative Communications and Its Application to Mobile Multimedia2010Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    With the rapid growth of multimedia services, future generations of wireless communications require higher data rates and a more reliable transmission link while keeping satisfactory quality of service. In this respect, multiple-input multiple-output (MIMO) antenna systems have been considered as an efficient approach to address these demands by offering significant multiplexing and diversity gains over single-antenna systems without increasing requirements on radio resources such as bandwidth and power. Although MIMO systems can unfold their huge benefit in cellular base stations, they may face limitations when it comes to their deployment in mobile handsets. In particular, the typically small size of mobile handsets makes it impractical to deploy multiple antennas. To overcome this drawback, the concept of cooperative communications has recently been proposed and gained large interest in the research community. The key idea is to form a virtual MIMO antenna array by utilizing a third terminal, a so-called relay node, which assists the direct communication. After receiving the source’s message, the relay processes and forwards it to the destination. With this approach, the benefits of MIMO systems can be attained in a distributed fashion. Furthermore, cooperative communications can efficiently combat the severity of fading and shadowing effects through the assistance of relay terminals. It has been shown that using the relay can extend the coverage of wireless networks. In this thesis, we focus on the performance evaluation of such cooperative communication systems and their application to mobile multimedia. The thesis is divided into five parts. In particular, the first part proposes a hybrid decode-amplify-forward (HDAF) relaying protocol which can significantly improve the performance of cooperative communication systems compared to the two conventional schemes of decode-and-forward (DF) and amplify-and-forward (AF). 
It is interesting to see that the performance gain of HDAF over DF and AF strictly depends on the relative channel conditions between the two hops. The second part extends HDAF to the case of multiple relays. It is important to note that the gains saturate as the number of relays becomes large. This observation motivates us to use a small number of relays to reduce network overhead as well as system complexity, while the obtained gains remain close to those of the large-number case. In the third part, we analyze the performance of DF relaying networks with best relay selection over Nakagami-m fading channels. In the fourth part, besides the diversity gain, we show that a spatial multiplexing gain can be achieved by cooperative communications. We analyze the performance of cooperative multiplexing systems in terms of symbol error rate and ergodic capacity over composite fading channels. Finally, in the fifth part, we exploit the benefit of both diversity and multiplexing gains by proposing an unequal error protection transmission scheme for mobile multimedia services using cooperative communications.
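The relay-selection behaviour described above can be illustrated with a small Monte Carlo experiment: two-hop DF outage over Rayleigh fading for a growing number of candidate relays. This is only an illustrative sketch with invented parameters; the thesis's analysis is analytical and covers Nakagami-m channels and the HDAF protocol, neither of which is modelled here.

```python
import numpy as np

rng = np.random.default_rng(0)
trials, mean_snr, rate = 200_000, 10.0, 1.0  # trials, mean SNR, rate (bit/s/Hz)

def df_outage(num_relays):
    """Outage probability of two-hop DF with best relay selection.

    Per-relay end-to-end SNR is limited by the weaker hop; selection
    picks the strongest relay. Rayleigh fading => exponential SNRs.
    """
    h_sr = rng.exponential(mean_snr, size=(trials, num_relays))
    h_rd = rng.exponential(mean_snr, size=(trials, num_relays))
    snr_best = np.minimum(h_sr, h_rd).max(axis=1)
    capacity = 0.5 * np.log2(1 + snr_best)  # factor 0.5: two time slots
    return float(np.mean(capacity < rate))

for k in (1, 2, 4, 8):
    print(k, df_outage(k))
```

The outage drops steeply from one relay to a few, then the incremental gain shrinks, which mirrors the saturation effect the abstract reports for large relay numbers.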

  • 32. Dzamashvili-Fogelström, Nina
    Understanding and supporting requirements engineering decisions in market-driven software product development2010Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Context: Requirements engineering (RE) for software products offered to a mass market is concerned with deciding which of the diverse and large amounts of potential requirements to implement into future releases of a product. While being the key for achieving success, these decisions are very complex. Therefore, the need for decision support is well acknowledged. However, despite a growing body of research in this area, software companies are still experiencing problems in making informed decisions on which requirements to include in the software, or knowing how to maximize the potential ROI of a software release. Objective: The purpose of this thesis is to provide an increased understanding of how better decisions regarding the content of software products can be achieved. The research addresses two currently unresolved areas: balancing investments in different requirement types (commercial requirements, internal quality aspects and innovations) and identifying reasonable requirement analysis effort for informed requirements selection decisions. In order to address these areas the thesis focuses on investigating: 1) how uncertainty in the value proposition of a requirement is influencing the balance between investments in different requirement types; and 2) challenges and opportunities introduced by agile practices to RE decisions. Method: The presented research has an exploratory character and consists of empirical studies conducted both in industrial and academic settings. Results: The results include findings from an academic experiment and an industrial case study indicating that commercial requirements will be preferred over innovations and internal quality aspects. This is because innovations and internal quality aspects are associated with higher uncertainty in their value offering compared to commercial requirements and are thereby perceived to have a higher level of business risk. 
The thesis also offers findings from an industrial case study, showing a misalignment between agile principles and the ability to take informed release planning decisions. Further, a framework (NORM) for finding an appropriate balance between the information needs of RE decisions and requirements analysis effort is suggested. Conclusions: Uncertainty associated with the value proposition of different requirement types influences requirements selection decisions, resulting in a dominance of commercial requirements. Thus, in order to achieve a better balance between investments in commercial requirements, internal quality and innovation, it is important that uncertainty in the value offering of requirements is explicitly managed by methods providing support for RE decisions in a market-driven context. Agile methods provide opportunities to minimize overhead caused by excessive analysis of requirements; however, adopting agile approaches in their current form poses challenges for performing product management and taking informed RE decisions in a market-driven context. Therefore, a balance between agility and the information needs of RE decisions must be found. In combination, the thesis results offer new insight and form a ground for defining improved approaches for supporting requirement selection decisions.

  • 33. Ecuru, Julius
    Fostering Growth in Uganda's Innovation System2011Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Research and innovation are key drivers of economic growth and competitiveness of countries. Recently, research and innovation-related initiatives have arisen in Uganda, pointing to an evolving innovation system in the country and to the need to deepen the understanding of the transformations taking place therein. This thesis provides evidence of this evolving innovation system and makes recommendations to foster growth in it. A participatory research approach was employed, using a combination of qualitative and quantitative tools including key informant interviews and reviews of key policy documents, organizational reports and publications. Findings show that the role of research and innovation in driving economic growth and development was recognised in Uganda as early as the 1950s and 60s, but practical measures on how to integrate them into the national development planning process were lacking. It was not until the 1990s and 2000s that a substantial number of research and innovation initiatives started to emerge. These initiatives ranged from increased support to research and science policy development to supporting innovative business clusters. Arguably, gains from these and other efforts would be enhanced if the government adopted a dual funding strategy for research and innovation, involving annual competitive grants on the one hand and increased core support to universities and research institutes on the other; if public organizations created enabling conditions within them for creativity and enterprise development; and if the quality of education were improved at all levels to maintain a constant supply of a skilled scientific workforce. Ultimately, these efforts require inclusive innovation policies which promote linkages and interactions between actors engaged in innovation processes both in the country and abroad.

  • 34. Eivazzadeh, Shahryar
    Health Information Systems Evaluation2015Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background

    Health information systems have emerged as a major component in our response to the trends of rising demands in health care. The insight gained from the evaluation of those systems can critically influence the shaping of this response. Summative or formative evaluation of health information systems assesses their quality, acceptance, and usefulness, creates insight for improvement, discriminates between options, and refines future development strategies. But the evaluation of health information systems can be challenging due to the propagation of their impacts through multiple socio-technological layers before reaching the ultimate recipients, the heterogeneity and fast evolution of the systems, and the complexity of health care settings and systems.

    Aim

    This thesis tries to explain the challenges of evaluating health information systems, narrowing down on how to determine evaluation aspects, and to propose relevant solutions. The thesis pursues solutions that mitigate heterogeneity and incomparability, recruit or extend available evaluation models, embrace a wide context of application, and promote automation.

    Method

    The literature on health information systems evaluation, methods of dealing with heterogeneity in other disciplines of information systems, and ontology engineering were surveyed. Based on the literature survey, the UVON method, based on ontology engineering, was first developed in Study 1. The method was applied in FI-STAR, a European Union project in e-health with 7 use-cases, for summative evaluation of the individual and whole e-health applications. Study 2 extended the UVON method for a formative evaluation during the design phase.

    Results

    The application of the UVON method resulted in evaluation aspects that were delivered to the seven use-cases of the FI-STAR project in the form of questionnaires. The resulting evaluation aspects were considered sensible and showed a confirming overlap with another widely used method in this field (MAST). No significant negative feedback from the FI-STAR use-case owners (n=7) or the respondents (n=87 patients and n=30 health professionals) was received or observed.

    Conclusion

    In the evaluation of health information systems -- possibly also in other similarly characterized systems -- ontology engineering methods, such as the proposed UVON method, can be applied to create a flexible degree of unification across a heterogeneous set of evaluation aspects, import evaluation aspects from other evaluation methods, and prioritize between quality aspects in the design phase. Ontologies, through their semantic network structures, can capture the extracted knowledge required for evaluation, facilitate computation of that knowledge, promote automation of evaluation, and accommodate further extensions of the related evaluation methods by adding new features to their network structure.

     

  • 35. Ekdahl, Peter
    Educative Moments. Rhetorics and Realities2002Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The concepts learning processes, gestalting and digital technology are closely associated with most people’s personal experiences in life and thus constitute significant ideas about their hopes, dreams and life content. The concepts are context-dependent and ought therefore to be studied from several interdisciplinary perspectives. To this end and to limit the scope of my thesis, I have chosen to study them from two vantage points – from the perspective of the Blekinge region of Sweden and from a third-world perspective. The reason I have chosen these two angles is that studying these concepts on the local level provides the necessary closeness, while considering them in a third-world perspective provides the necessary distance. In order to illustrate the complexity of the concepts, I have chosen to study both the rhetorical level – the narratives, the dreams and the hopes – and people’s actual experiences, i.e. the realities, hence the dialectical title of the thesis. “The educative moment” is the rare moment when rhetorics and realities coincide. When traditions and changes together cause renewal on an individual or collective level, the conditions are created for human beings to be able to see themselves and their relations in a new light.

  • 36. Ekelin, Annelie
    Working with the Fogbow: Design and Reconfiguration of services and Participation in E-Government2003Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis is about the metaphors of the rainbow and the fogbow, investigations and evaluations, public Internet monitors, writing women, reflections and discussions about politics, design and democracy. It is also about the ongoing re-structuring of participation in service design within the development of E-Government. The aim behind the drive towards E-Government is to modernise administration and make it more efficient. The transformation and modernisation of public services are proclaimed to bring about a change in services based on a 'citizen-centred approach.' In such a process, communication between citizens and public authorities should play an essential role. Themes such as accountability, accessibility and participation all form part of the reconfiguration, and at the same time these themes are shaped by the transformation. The papers in this thesis discuss, in different ways, how this reconfiguration is enacted in practice. Theories and methodologies from feminist theories, participatory design and informatics are used in order to develop broader and more complex understandings of the ongoing development within E-Government.

    Introduction to the papers

    Paper I: Everyday Dialogue and Design for Co-Operative Use: An Evaluation of the Public Internet Monitor Project. Accessibility is a central issue in the achievement of democracy, i.e. with respect to the opportunity for and right to 'access' to new technology and information – an argument also used when justifying the Public Internet Monitor Project. 'Access' in this context refers not only to purely physical access to new technology and information, it is also about the opportunity to take part in community business on several different levels. The present sub-report presents the project and its background. 
The paper also discusses the way in which the Public Internet Monitor Project as a whole has contributed to the development of a social interface or contact surface between citizens and public authorities, as well as how it has stimulated processes of change within public administration and in contacts between public authorities and citizens. Among the questions raised during the evaluation are: how local networks and activities can be stimulated by the citizen monitor, and how the user's ideas and experience can be utilised in local adaptations so that they become an essential part of a continuous development of services and technology. The paper also describes the linked chains of responsibility exemplified in the excerpts from the interviews. These also include final users as a means of creating a personalised service adapted to local praxis and user environments. The question is posed: "is it possible to talk in terms of interactivity on several different levels, not only in the sense of transmitting information or communicating, but also as a means of creating a relation-based interactivity?"

    Paper II: Reconfiguration of Citizenship: Rights and Duties in Development of Public Services. This paper presents the case of the cleaner in the library and some examples of feedback failures. Access to information, technology, and to some degree, participation in the development of new services, is a central issue in the prevailing eGovernment discourse. This vision also comprises the idea of the active, contributing citizen and considers the development of local public participation as a process of co-construction of citizenship and services engaging several actors on different levels.
At the same time, access must be seen as a contemporaneous process of inclusion and exclusion, a defining and drawing up of the boundaries of a new electronically mediated membership, where access is becoming a prerequisite for activating citizenship, transforming "the right to have access" into a "duty to participate", not just for citizens but for the employees who must manage the reconfiguration of citizenship and relations. The foundations for participation, however, turn out to be relatively restricted in practice. The original title of the paper was: Co-Construction of Citizenship: Rights and duties in development of public services.

    Paper III: Consulting the Citizens – Relationship-based Interaction in the Development of E-Government. This paper investigates current practices for involving citizens in the development of web-based services in public administration and tries to track their motives. With respect to democratisation, I argue that there is a large potential in adopting participatory design methods for establishing relation-based interaction between administration and citizens. The paper presents an analysis of E-Government initiatives. More particularly, it explores the discourse of the materials surrounding these initiatives, particularly with respect to value systems derived from the marketing perspective contra democratic values. It demonstrates that conventional images of democracy have only a background role to play in such efforts.

    Paper IV: Mapping Out and Constructing Needs in the Development of Online Public Services. This paper is based on a study concerning experiences of, access to and requests for public services on-line, within the RISI+ Project. The paper presents a pilot study of the setting up of public services in the local context of the county of Blekinge, in southeast Sweden. 
The study was conducted as a peer evaluation of a selection of methods, or types of needs analysis, used by different actors and producers of public services in order to gain a picture of various needs among users. One part of this study focuses on the views expressed by service providers about the dialogue between themselves and citizens on the provision of public services. This is compared with the practical use or, in some cases, lack of use, of explicit techniques, such as questionnaires, larger surveys and work carried out with the help of focus groups. A basic question is: 'what role does citizen involvement play in the analysis of needs and services and in the choice of design?'. Parts of this report were presented in a poster display at the NordiCHI 2000 conference, "Design versus design", in Stockholm in October 2000 and, in a different version, as a work-in-progress report at the PDC 2000 (Participatory Design) Conference, "Bringing in more voices", in New York in November.

    Paper V: Making E-Government Happen: Everyday Co-Development of Services, Citizenship and Technology. This paper describes the use of a metaphorical figure used in different contexts as part of a discussion of working relationships of the co-development of services, citizenship and technology change. The paper discusses the challenge of developing a supportive infrastructure for the ongoing local adaptation and development of public services as citizens use them. Developing supportive structures for co-operation in the design task involves incorporating ways of including the general public, mapping out networks, developing tailorable software and cultivating shop-floor management.
If continuous joint co-development of services is made a central part of the co-development of services, citizenship and technology, this also blurs the boundaries between governmental and municipal authorities, private sector employees and other actors within, for example, the voluntary sector - but above all, continuous joint co-development blurs the boundary between citizens and local authorities. The citizens become key figures in the 'web of connections' that makes up the design, content and use of new technologies. In the discourse on participation in E-Government, few reflections are made concerning the basic issue of the democratic values that could be gained by early involvement of local employees and citizens in developmental work or technology-based changes. Making more deliberate use of participatory design methods for incorporating multiple perspectives in service design, as well as technology production and use, could be a way to stimulate a broader, more inclusive and sustainable participation in the local development of E-Government.

    Paper VI: Discourses and Cracks - A Case Study of Information Technology and Writing Women in a Regional Context. This is the first of the papers I wrote, where empirical material from a local IT project is discussed and mirrored against the dominating discourses of information technology. Paper VI discusses information technology as a political and practical discourse, which is in part shaped by the repetition of an exalted rhetoric. This repetitive discursive model can be distinguished in global, regional and local contexts and reflects an optimistic belief in technology as an independent power that automatically furthers democratic development. The second part of the paper presents empirical material and experiences from the Women Writing on the Net project (this was included in the framework of the DIALOGUE project, which was partially funded by ISPO/EC). 
The aim of the project was to create a virtual space for women on the Internet, and to explore the writing process in terms of aims, tools and methods. The method of approach incorporates reflections and discussions about empowerment, democracy and the representation of women. This contributed to a more complex understanding of the values of the predominant IT discourses, and revealed the "cracks" in, and the possibilities of feminist redefinitions of, these values.

  • 37.
    Elfsberg, Jenny
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Insert innovation: Strengthening the innovative capability of a large, mature firm2018Licentiate thesis, comprehensive summary (Other academic)
  • 38. Engelke, Ulrich
    Perceptual Quality Metric Design for Wireless Image and Video Communication2008Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The evolution of advanced radio transmission technologies for third generation mobile radio systems has paved the way for the delivery of mobile multimedia services. In particular, wireless image and video applications are among the most popular services offered on modern mobile devices to support communication beyond the traditional voice services. The large amount of data necessary to represent the visual content and the scarce bandwidth of the wireless channel impose new challenges for the network operator to deliver high quality image and video services. Link layer metrics have conventionally been used to monitor the received signal quality but were found to not accurately reflect the visual quality as it is perceived by the end-user. These metrics thus need to be replaced by suitable metrics that measure the overall impairments induced during image or video communication and accurately relate them to subjectively perceived quality. In this thesis, we focus on objective metrics that are able to quantify the end-to-end visual quality in wireless image and video communication. Such metrics may then be utilised to support the efficient use of link adaptation and resource management techniques and thus guarantee a certain quality of service to the user. The thesis is divided into four parts. The first part contributes an extensive survey and classification of contemporary image and video quality metrics that may be applicable in a communication context. The second part then discusses the development of the Normalised Hybrid Image Quality Metric (NHIQM) that we propose for prediction of visual quality degradations induced during wireless communication. The metric is based on a set of structural features, which are deployed to quantify artifacts that may occur in a wireless communication system and are also well aligned with characteristics of the human visual system (HVS).
In the third part, three metric designs are discussed that utilise the same structural feature set as a basis for quality prediction. Incorporating further HVS characteristics into the metric design then further improves the visual quality prediction performance. The design and validation of all proposed metrics is supported by subjective quality experiments that we conducted in two independent laboratories. Comparison to other state-of-the-art visual quality metrics reveals the ability of the proposed metrics to accurately predict visual quality in a wireless communication system. The last part contributes an application of NHIQM to filter design. In particular, the filtering performance of a de-blocking de-ringing post filter for H.263 video sequences is analysed with regard to the visual quality of the filtered sequence when applying appropriate filter parameter combinations.

  • 39. Englund, Thomas
    Dynamic characteristics of automobile exhaust system components2003Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Demands on emission control, and low vibration and noise levels have made the design of automobile exhaust systems a much more complex task over the last few decades. This, combined with increasing competition in the automobile industry, has rendered physical prototype testing impractical as the main support for design decisions. The aim of this thesis is to provide a deeper understanding of the dynamic characteristics of automobile exhaust system components to form a basis for improved design and the development of computationally inexpensive theoretical component models. Modelling, simulation and experimental investigation of a typical exhaust system are performed to gain such an understanding and evaluate ideas of component modelling. Modern cars often have a gas-tight bellows-type flexible joint between the manifold and the catalytic converter. This joint is given special attention since it is the most complex component from a dynamics point of view and because it is important for reducing transmission of engine movements to the exhaust system. The joint is non-linear if the bellows consists of multiple plies or if it includes an inside liner. The first non-linearity is shown to be weak and may therefore be neglected. The non-linearity due to friction in the liner is, however, highly significant and gives the joint complex dynamic characteristics. This is important to know and to consider in exhaust system design, and it proves the necessity of including a model of the liner in the theoretical joint model when this type of liner is present in the real joint to be simulated. It is known from practice and from introductory investigations that the system as a whole also sometimes shows complex dynamic behaviour. This can be understood from the non-linear characteristics of the flexible joint shown in this work. An approach to the modelling of the combined bellows and liner joint is suggested and experimentally verified.
It is shown that the exhaust system is essentially linear downstream of this joint. Highly simplified finite element models of the components within this part are suggested. These models incorporate adjustable flexibility in their connection to the exhaust pipes, and a procedure is developed for automatic updating of these parameters to obtain better correlation with experimental results. The agreement between the simulation results of the updated models and the experimental results is very good, which verifies the usability of these component models. A major conclusion is that in future studies of how engine vibrations affect the exhaust system, it may be considered as a linear system if the flexible joint consists of a bellows. If the joint also includes a liner, the system may be considered as a linear sub-system that is excited via a non-linear joint.

  • 40. Eriksson, Jeanette
    Bridging the Gap between Development and Use: Support of Tailorability in Software Evolution2005Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The intention of tailorable systems is to make it possible for end users to evolve an application to better fit altered requirements and tasks, and to make the system more endurable. This thesis discusses tailorable systems in the context of a rapidly changing business environment. The objective was to determine what is necessary for a tailorable business system to continuously adapt to expanding requirements and thereby live up to the intention of the system. The thesis includes five different studies, of which one is a literature study. The other four studies were conducted in three projects: one technical project exploring the possibility of using the Metaobject Protocol in tailorable systems, one project in an explorative environment concerned with physical interfaces, and one project, which also embraced user participation and user evaluation, concerning the possibility for end users to manage system infrastructure. The projects began with field studies (including participant observations and interviews) and workshops with users and developers. In each project, based on the outcome, an end-user tailorable prototype was developed. The prototypes were used for evaluating possibilities and problems with tailorable systems. Taken together the evaluations revealed what was required to make a tailorable system work as intended in a rapidly changing business environment. It could be concluded that tailoring is a good way to evolve a system to meet altered needs, because people who already possess the required domain knowledge can make changes quickly. Tailoring is not however enough, because the tailoring capabilities are always limited, meaning that tailoring cannot support completely unanticipated changes. In such cases the tailoring capabilities must be extended. Since such changes are only concerned with the system itself, and not the business task, it is hard to motivate even skilled users to make these types of changes.
Tailoring activities must therefore be coordinated with software evolution activities performed by professional developers. This allows the system to adapt continuously to a rapidly changing business environment and thereby live up to the intention of the system. The final conclusion is that there is a need for close collaboration between end users, tailors and developers to make tailorable information systems adaptable to rapid changes in the business environment as well as being endurable. The collaboration has to be supported in the structure of the system by providing support for the work of users, tailors and developers.

  • 41.
    Erlandsson, Fredrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    On social interaction metrics: social network crawling based on interestingness2014Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

With the high use of online social networks we are entering the era of big data. With limited resources it is important to evaluate and prioritize interesting data. This thesis addresses the following aspects of social network analysis: efficient data collection, social interaction evaluation and user privacy concerns. It is possible to collect data from online social networks via their open APIs. However, a systematic and efficient collection of online social networks data is still challenging. To improve the quality of the data collection process, prioritizing methods are statistically evaluated. Results suggest that the collection time can be reduced by up to 48% by prioritizing the collection of posts. Evaluation of social interactions also requires data that covers all the interactions in a given domain. This has previously been hard to do, but the proposed crawler is capable of extracting all social interactions from a given page. With the extracted data it is for instance possible to illustrate indirect interactions between different users that do not necessarily have to be connected. Methods using the same data to identify and cluster different opinions in online communities have been developed. These methods are evaluated with the tool Linguistic Inquiry and Word Count. The privacy of the content produced, and of the users’ private information provided on social networks, is important to protect. Users must be aware of the consequences of posting in online social networks in terms of privacy. Methods to protect user privacy are presented. The proposed crawler in this thesis has, over the period of 20 months, collected over 38 million posts from public pages on Facebook, covering 4 billion likes and 340 million comments from over 280 million users. The performed data collection yielded one of the largest research datasets of social interactions on Facebook today, enabling qualitative research in the form of social network analysis.

  • 42.
    Erman, David
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    BitTorrent Traffic Measurements and Models2005Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The Internet has experienced two major revolutions. The first was the emergence of the World Wide Web, which catapulted the Internet from being a scientific and academic network to becoming part of the societal infrastructure. The second revolution was the appearance of the Peer-to-Peer (P2P) applications, spearheaded by Napster. The popularity of P2P networking has led to a dramatic increase of the volume and complexity of the traffic generated by P2P applications. P2P traffic has recently been shown to amount to almost 80% of the total traffic in a high speed IP backbone link. One of the major contributors to this massive volume of traffic is BitTorrent, a P2P replication system. Studies have shown that BitTorrent traffic more than doubled during the first quarter of 2004, and still amounts to 60% of all P2P traffic in 2005. This thesis reports on measurement, modelling and analysis of BitTorrent traffic collected at Blekinge Institute of Technology (BTH) as well as at a local ISP. An application layer measurement infrastructure for P2P measurements developed at BTH is presented. Furthermore, a dedicated fitness assessment method to avoid issues with large sample spaces is described. New results regarding BitTorrent session and message characteristics are reported and models for several important characteristics are provided. Results show that several BitTorrent metrics such as session durations and sizes exhibit heavy-tail behaviour. Additionally, previously reported results on peer reactivity to new content are corroborated.
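The heavy-tail behaviour reported for session durations and sizes is commonly diagnosed by estimating a tail index from the largest observations. As an illustrative sketch only (not the dedicated fitness assessment method described in the thesis), the classical Hill estimator applied to synthetic Pareto-distributed session durations looks like this; all names and values below are assumptions for illustration:

```python
import math
import random

def hill_estimator(samples, k):
    """Hill estimator of the tail index alpha, computed from the k largest
    order statistics. A standard diagnostic for heavy-tailed data such as
    P2P session durations or transfer sizes."""
    ordered = sorted(samples, reverse=True)
    top = ordered[:k + 1]
    # Mean log-excess of the k largest samples over the (k+1)-th largest.
    log_excesses = [math.log(x / top[k]) for x in top[:k]]
    return k / sum(log_excesses)

# Synthetic Pareto(alpha=1.5) durations as a stand-in for measured sessions.
random.seed(0)
alpha = 1.5
durations = [random.paretovariate(alpha) for _ in range(10000)]
est = hill_estimator(durations, k=500)
print(round(est, 2))  # typically near 1.5 for Pareto(1.5) samples
```

A tail index below 2 would indicate infinite variance, which is one reason heavy-tailed session models matter for capacity planning.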

  • 43.
    Erman, Maria
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Applications of Soft Computing Techniques for Wireless Communications2019Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

This thesis presents methods and applications of Fuzzy Logic and Rough Sets in the domain of Telecommunications at both the network and physical layers. Specifically, the use of a new class of functions, the truncated π functions, for classifying IP traffic by matching datagram size histograms is explored. Furthermore, work on adapting the payoff matrix in multiplayer games, using fuzzy entries instead of crisp values that are hard to quantify, is presented.

Additionally, applications of fuzzy logic in wireless communications are presented, comprising a comprehensive review of current trends and applications, followed by work directed towards using it for spectrum sensing and power control in cognitive radio networks.

    This licentiate thesis represents parts of my work in the fields of Fuzzy Systems and Wireless Communications. The work was done in collaboration between the Departments of Applied Signal Processing and Mathematics at Blekinge Institute of Technology.
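The truncated π functions themselves are defined in the thesis; as background, the classical Zadeh π-shaped membership function from which such variants derive can be sketched as follows. The datagram-size example values are illustrative assumptions, not taken from the thesis:

```python
def s_curve(x, a, c):
    """Zadeh S-function: rises smoothly from 0 at a to 1 at c."""
    b = (a + c) / 2.0
    if x <= a:
        return 0.0
    if x >= c:
        return 1.0
    if x <= b:
        return 2.0 * ((x - a) / (c - a)) ** 2
    return 1.0 - 2.0 * ((x - c) / (c - a)) ** 2

def pi_function(x, center, width):
    """Classical pi-shaped membership: rises to 1 at `center` and falls
    back to 0 at `center` +/- `width`, built from two S-curves."""
    if x <= center:
        return s_curve(x, center - width, center)
    return 1.0 - s_curve(x, center, center + width)

# Fuzzy membership of a datagram size (bytes) in a hypothetical "small
# packet" bin centred at 576 bytes.
print(pi_function(576, center=576, width=200))  # → 1.0 at the center
```

Matching a datagram size histogram against a set of such membership functions, one per traffic class, then reduces classification to comparing membership degrees.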

  • 44. Fotrousi, Farnaz
    Quality-Impact Assessment of Software Products and Services in a Future Internet Platform2015Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The idea of a Future Internet platform is to deliver reusable and common functionalities that facilitate building a wide range of software products and services. The Future Internet platform, introduced by the Future Internet Public Private Partnership (FI-PPP) project, makes the common functionalities available through so-called Enablers that can be instantly integrated into software products and services with less cost and complexity than development from scratch.

    Quality assessment of software products and services and gaining insights into whether the quality fulfills users’ expectations within the platform are challenging. The challenges are due to the propagation of quality in the heterogeneous composite software that uses Enablers and infrastructure developed by third parties. The practical problem is how to assess the quality of such composite software as well as the impacts of the quality on users’ Quality of Experience (QoE).

    The research objective is to study an analytics-driven Quality-Impact approach identifying how software quality analytics together with their impact on QoE of users can be used for the assessment of software products and services in a Future Internet platform.

The research was conducted with one systematic mapping study, two solution proposals, and one empirical study. The systematic mapping study contributes a map overviewing important analytics for managing a software ecosystem. The thesis also proposes a solution to introduce a holistic software-human analytics approach in a Future Internet platform. As the core of the solution, it proposes a Quality-Impact inquiry approach exemplified with a real practice. In the early validation of the proposals, a mixed qualitative-quantitative empirical study was conducted with the aim of designing a tool for the inquiry of user feedback. This research studies the effect of the instrumented feedback tool on the QoE of a software product.

The findings of the licentiate thesis show that satisfaction, performance, and freedom-from-risk analytics are important groups of analytics for assessing software products and services. The proposed holistic solution builds on these results by describing how to measure the analytics and how to assess them practically using a composition model during the lifecycle of products and services in a Future Internet platform. As the core of the holistic approach, the Quality-Impact assessment approach could elicit relationships between software quality and the impacts of that quality on stakeholders. Moreover, the early validation of the Quality-Impact approach parameterized suitable characteristics of a feedback tool. We found that disturbing feedback tools have negligible impacts on the perceived QoE of software products.

    The Quality-Impact approach is helpful to acquire insight into the success of software products and services contributing to the health and sustainability of the platform. This approach was adopted as a part of the validation of FI-PPP project. Future works will address the validation of the Quality-Impact approach in the FI-PPP or other real practices.

  • 45. Fredin, Johan
    Modelling, Simulation and Optimisation of a Machine Tool2009Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

To be competitive in today’s global market it is of great importance that product development is done in an effective and efficient way. To enhance functionality, modern products are often so-called mechatronic systems. This puts even higher demands on the product development work due to the complexity of such products. Simulation and optimisation have been proven to be efficient tools to support the product development process. The aim of this thesis is to study how the properties of mechatronic products can be efficiently and systematically predicted, described, assessed and improved in product development. An industrial case study of a water jet cutting machine investigates how simulation models and optimisation strategies can be efficiently developed and used to enhance functionality, flexibility and performance of mechatronic products. The knowledge gained from the case study is shown to be useful for companies developing machine tools. Most likely it is also useful for developers of other mechatronic products. The thesis shows that with the presented optimisation strategies, comprising a mix of different computerised optimisation algorithms and more classical engineering work, design problems with a large amount of design variables can be solved efficiently. A specific result is a validated simulation model for simulation and optimisation of a water jet cutting machine. As all mechatronic disciplines of the machine tool are considered simultaneously, synergetic effects can be utilised. Optimisation studies show a significant potential for improving manufacturing accuracy, for manufacturing speed and for a more light-weight design. Carrying out simulation and optimisation has also provided a great amount of information about the studied system, potentially useful in future product development work. By reducing the number of physical prototypes through simulation and optimisation, the resource consumption during product development is reduced.
Also, with more optimised products the resource consumption can be significantly reduced throughout the whole use phase. These benefits support the competitiveness of the product developing company as well as a sustainable development of society as a whole.

  • 46.
    García-Martín, Eva
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Extraction and Energy Efficient Processing of Streaming Data2017Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The interest in machine learning algorithms is increasing, in parallel with the advancements in hardware and software required to mine large-scale datasets. Machine learning algorithms account for a significant amount of energy consumed in data centers, which impacts the global energy consumption. However, machine learning algorithms are optimized towards predictive performance and scalability. Algorithms with low energy consumption are necessary for embedded systems and other resource constrained devices; and desirable for platforms that require many computations, such as data centers. Data stream mining investigates how to process potentially infinite streams of data without the need to store all the data. This ability is particularly useful for companies that are generating data at a high rate, such as social networks.

This thesis investigates algorithms in the data stream mining domain from an energy efficiency perspective. The thesis comprises two parts. The first part explores how to extract and analyze data from Twitter, with a pilot study that investigates a correlation between hashtags and followers. The second and main part investigates how energy is consumed and optimized in an online learning algorithm suitable for data stream mining tasks.

    The second part of the thesis focuses on analyzing, understanding, and reformulating the Very Fast Decision Tree (VFDT) algorithm, the original Hoeffding tree algorithm, into an energy efficient version. It presents three key contributions. First, it shows how energy varies in the VFDT from a high-level view by tuning different parameters. Second, it presents a methodology to identify energy bottlenecks in machine learning algorithms, by portraying the functions of the VFDT that consume the largest amount of energy. Third, it introduces dynamic parameter adaptation for Hoeffding trees, a method to dynamically adapt the parameters of Hoeffding trees to reduce their energy consumption. The results show an average energy reduction of 23% on the VFDT algorithm.
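The VFDT and other Hoeffding trees decide when a leaf has seen enough samples to split by means of the Hoeffding bound, and tuning parameters such as the confidence δ (as the thesis does when studying energy behaviour) acts through this bound. A minimal sketch of the standard formula follows; the parameter values are illustrative assumptions, not those used in the thesis:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding bound epsilon: with probability 1 - delta, the observed
    mean of n i.i.d. samples lies within epsilon of the true mean."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# A Hoeffding tree splits a leaf only when the difference in heuristic gain
# between the two best attributes exceeds epsilon; a larger delta or more
# observed samples shrink epsilon and allow earlier splits.
for n in (200, 1000, 5000):
    eps = hoeffding_bound(value_range=1.0, delta=1e-7, n=n)
    print(n, round(eps, 4))
```

Because each evaluation of candidate splits costs computation, parameters that control how often this bound is checked (such as the grace period) are natural knobs for trading accuracy against energy consumption.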

  • 47.
    Ghazi, Nauman
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Testing of Heterogeneous Systems2014Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Context: A system of systems often exhibits heterogeneity, for instance in implementation, hardware, process and verification. We define a heterogeneous system as a system comprised of multiple systems (system of systems) where at least one subsystem exhibits heterogeneity with respect to the other systems. The system of systems approach taken in the development of heterogeneous systems gives rise to various challenges due to continuous change in configurations and multiple interactions between the functionally independent subsystems. The challenges posed to testing of heterogeneous systems are mainly related to interoperability, conformance and large regression test suites. Furthermore, the inherent complexities of heterogeneous systems also pose challenges to the specification, selection and execution of tests. Objective: The main objective of this licentiate thesis is to provide an insight into the state of the art in testing heterogeneous systems. Moreover, we also aimed to investigate different test techniques used to test heterogeneous systems in industrial settings and their usefulness, as well as to identify and prioritize different information sources that can help practitioners to define a generic search space for the test case selection process. Method: The findings presented in this thesis are obtained through a controlled experiment, a systematic literature review (SLR), a case study and an exploratory survey. The purpose of the systematic literature review was to investigate the existing state of the art in testing heterogeneous systems and to identify research gaps. The results from the SLR further laid down the foundation of action research conducted through an exploratory survey to compare different test techniques. We also conducted an industrial case study to investigate the relevant data sources for search space initiation to prioritize and specify test cases in the context of heterogeneous systems.
Results: Based on our literature review, we found that testing of heterogeneous systems is considered a problem of integration and system testing. It has been observed that multiple interactions between the system and subsystems result in a testing challenge, especially when the configurations change continuously. It is also observed that the current literature targets the problem of testing heterogeneous systems with multiple test objectives, resulting in different test methods being employed to address domain-specific testing challenges. Using the exploratory survey, we found three test techniques to be most relevant in the context of testing heterogeneous systems. However, the technique most frequently mentioned by the practitioners is manual exploratory testing, which is not a much researched topic in the context of heterogeneous systems. Moreover, multiple information sources for the test selection process are identified through the case study and the survey. Conclusion: Companies engaged in the development of heterogeneous systems encounter huge challenges due to multiple interactions between the system and subsystems. However, the conclusions we draw from the research studies included herein show a gap between literature and industry. Search-based testing is widely discussed in the literature but is the least used test technique in industrial practice. Moreover, for the test selection process there are no frameworks that take into account all the information sources that we investigated. Therefore, to fill this gap there is a need for an optimized test selection process based on the information sources. There is also a need to study the different test techniques identified through our SLR and survey and compare these techniques on real heterogeneous systems.

  • 48. Giger, Peter
    Participation Literacy: Part I: Constructing the Web 2.0 Concept2006Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

The licentiate thesis is a piece of academic work under the theme of /Participation Literacy/. The thesis concerns the Web 2.0 concept construction. Web 2.0 is a new mind-set on the Internet. The main characteristics include ”Web as a Platform”, Collective Intelligence, Folksonomy and interfaces built with lightweight technologies such as Ajax. Web 2.0 is not only a technique, but also an ideology – an ideology of participation. A Web 2.0 service is completely web based and generally draws on open access. It includes tools for people to interact within areas such as encyclopaedias, bookmarks, photos, books or research articles. All Web 2.0 services are web communities. A web community is a group of individuals, linked together by a network of social relations with some degree of continuity. Community members learn from each other and the knowledge base of the community grows for every interaction. The core values of Web 2.0 are democracy and participation. The licentiate thesis is divided into four main parts and two appendixes. The four parts constitute a foreword, a reading guide, a conceptual and empirical introductory discussion to the Web 2.0 concept, and finally a series of constructions based on the Web 2.0 concept and the cyborg figure. Appendix I is a short conference paper called Technologically Navigating Cyborgs. Appendix II is a very short piece of fiction, written in Swedish. These appendixes comprise a background to the focus on the Web 2.0 and the cyborg concept.

  • 49. Gorschek, Tony
    Software Process Assessment & Improvement in Industrial Requirements Engineering2004Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

Requirements Engineering (RE) is a crucial part of any product management and product development activity, and as such deficiencies in the RE process may have severe consequences. There are reports from industry that point towards inadequate requirements being one of the leading sources of project failure. Software Process Improvement (SPI) is generally seen as the main tool to address process deficiencies in general and within RE. Assessments lead to establishing plans for improvements that are subsequently implemented and evaluated, and then the SPI cycle starts again, in an optimal case being incremental and continuous. Most well known SPI frameworks, e.g. CMM, CMMI, SPICE and QIP, are based on these general principles. There are however several factors that can have a negative impact on SPI efforts in general, and in the case of SPI targeted at RE in particular. Time and cost are two fundamental factors that can effectively “raise the bar” for SPI efforts being initiated at all. This is particularly the case for Small and Medium sized Enterprises (SMEs) with limited resources, and a limited ability to wait for the return on their investment. Other issues include commitment and involvement in the SPI work by the ones affected by the changes, coverage of the RE area in SPI frameworks, and the ability to focus improvements to areas where they are needed the most. The research presented in this thesis is based on actual needs identified in industry, and all of the proposed solutions have also been validated in industry to address issues of applicability and usability. In general, the goal of the research is to “lower the bar”, i.e. enabling SMEs to initiate and perform SPI activities.
It is accomplished through the presentation and validation of two assessment methods that target RE, one aimed at both fast and low-cost benchmarking of current practices, and the other designed to produce tangible improvement proposals that can be used as input to an improvement activity, i.e. producing a relatively accurate assessment while taking limited time and resources into account. Further, to offer a structured way in which SMEs can focus their SPI efforts, a framework is introduced that can be used to package improvement proposals with regard to their relative priority, taking dependencies into account. This enables SMEs to choose what to do first based on their needs, as well as a way to control time to return on their investment by controlling the size of the undertaking. As a result of industry validation of the assessment method and packaging framework, several improvement proposals were identified and prioritized/packaged. As a part of a process improvement effort (based on an improvement proposal package) an RE model was developed that was appropriate for SMEs faced with a market-driven, product-centered development situation. The model, called the Requirements Abstraction Model (RAM), addresses the structuring and specification of requirements. The main feature of the model is that it not only offers a structured way in which requirements can be specified, but it also takes a requirement’s abstraction level into account, using abstraction for the work-up instead of putting all requirements in one repository independent of abstraction level. The RAM was developed to support primarily the product management effort, recognizing that RE from this perspective is not project initiated but rather project initiating. The model assists product managers in taking requirements on varying abstraction levels and refining them to the point of being good enough to offer decision support for management, and at the same time being good enough for project initiation.
The main contribution of the thesis is to present SMEs with “tools” that help them commit to and perform SPI activities. Moreover, the thesis introduces RAM, a model developed based on needs identified in industry and subsequently piloted in industry to assure usability.

  • 50.
    Gould, Rachael
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Integrating sustainability into concept selection decision-making2015Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The audience for this research is fellow researchers and others helping product developers to start including sustainability when they are selecting product concepts.

    The aims of the research were to understand the needs of product developers integrating sustainability into concept selection and what might be done to help them.

    The research approach was to iterate between the three studies of design research methodology. The first study focused on understanding the challenges that product developers face when integrating sustainability into concept selection. The aim of the second study was to identify potential support to help product developers deal with these challenges. The third study tried out the potential support to see whether it actually helps product developers address the challenges they face. These studies were executed through reviewing literature and exploring two cases.

    The results led to a focus on supporting the decision-making process and on supporting analysis with respect to social sustainability. Selecting concepts is a complex decision made under challenging conditions. Bringing in the complex, new and unfamiliar aspects of sustainability can make good decision-making even more challenging. When integrating sustainability, two particular barriers to good concept selection decision-making are errors due to illusory correlation and confirmation bias.

    Despite the challenges, how good you are at making decisions matters, and a good decision-making process drives good decisions. This is especially relevant when bringing in complex and unfamiliar aspects such as sustainability. A likely candidate for helping product developers achieve a good decision-making process when integrating sustainability is active, value-focused decision support: in other words, structuring the process into bite-sized steps and using particular techniques to avoid bias. At each step, decision-makers’ focus is anchored by the things that stakeholders value as important. Further research is required to investigate the details of how to employ these process-support approaches in the particular context of integrating sustainability into concept selection decision-making.

    In addition to a process, complicated selection decisions demand analysis. Support for analysing concepts with respect to social sustainability was identified as a gap. We explored a potential approach that might contribute to this analysis, but found that it was not useful for the particular decision at hand. This opened up some interesting questions for further research.
