251 - 300 of 4798
  • 251. Arvidsson, Åke
    et al.
    Hederstierna, Anders
    Hellmer, Stefan
    Simple and Accurate Forecasting of the Market for Cellular Mobile Services, 2007. In: Managing Traffic Performance in Converged Networks, Berlin: Springer, 2007. Chapter in book (Refereed)
    Abstract [en]

    We consider the problems of explaining and forecasting the penetration and the traffic in cellular mobile networks. To this end, we create two regression models, viz. one to predict the penetration from service charges and network effects and another one to predict the traffic from service charges and diffusion and adoption effects. The results of the models can also be combined to compute the likely evolutions of essential characteristics such as Minutes of Use (MoU), Average Revenue per User (ARPU) and total revenue. Applying the models to 26 markets throughout the world we show that they perform very well. Noting the significant qualitative differences between these markets, we conclude that the model has some universality in that the results are comparable for all of them.
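    To make the modelling idea concrete, here is a minimal sketch of fitting such a regression (invented data and variable names; the chapter's actual model specification is not reproduced here):

    ```python
    # Sketch: a linear regression predicting market penetration from service
    # charges and a network-effect term (lagged penetration). All numbers and
    # variable names are hypothetical; the chapter's model may differ.
    import numpy as np

    charge = np.array([1.00, 0.90, 0.80, 0.70, 0.60])    # average service charge
    prev_pen = np.array([0.05, 0.12, 0.22, 0.35, 0.50])  # last year's penetration
    penetration = np.array([0.12, 0.22, 0.35, 0.50, 0.63])

    # Design matrix: intercept, price effect, network effect.
    X = np.column_stack([np.ones_like(charge), charge, prev_pen])
    beta, *_ = np.linalg.lstsq(X, penetration, rcond=None)

    # Forecast next year's penetration for a planned charge of 0.55.
    forecast = np.array([1.0, 0.55, penetration[-1]]) @ beta
    print(beta, forecast)
    ```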

  • 252.
    Arvola Bjelkesten, Kim
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Feasibility of Point Grid Room First Structure Generation: A bottom-up approach, 2017. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Context. Procedural generation becomes increasingly important for video games in an age where the scope of the content required demands both a lot of time and work. One of the fronts of this field is structure generation, where algorithms create models for the game developers to use. Objectives. This study aims to explore the feasibility of the bottom-up approach within the field of structure generation for video games. Methods. An algorithm using the bottom-up approach, PGRFSG, is developed, and a user study is used to validate the results. Each participant evaluates five structures, giving them a score based on whether they belong in a video game. Results. The participants' evaluations show that among the structures generated were some that definitely belonged in a video game world. Two of the five structures got a high score, though for one structure that was deemed not to be the case. Conclusions. A conclusion can be made that the PGRFSG algorithm creates structures that belong in a video game world and that the bottom-up approach is a suitable one for structure generation, based on the results presented.

  • 253.
    Aryal, Dhiraj
    et al.
    Blekinge Institute of Technology, School of Computing.
    Shakya, Anup
    Blekinge Institute of Technology, School of Computing.
    A Taxonomy of SQL Injection Defense Techniques, 2011. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Context: SQL injection attack (SQLIA) poses a serious security threat to web applications by allowing attackers to gain unhindered access to the underlying databases containing potentially sensitive information. A lot of methods and techniques have been proposed by different researchers and practitioners to mitigate the SQL injection problem. However, deploying those methods and techniques without a clear understanding can induce a false sense of security. Classification of such techniques provides great assistance in getting rid of this false sense of security. Objectives: This paper is focused on classifying such techniques by building a taxonomy of SQL injection defense techniques. Methods: A systematic literature review (SLR) is conducted using five reputed and familiar e-databases: IEEE, ACM, Engineering Village (Inspec/Compendex), ISI Web of Science and Scopus. Results: 61 defense techniques are found, and based on these techniques a taxonomy of SQL injection defense techniques is built. Our taxonomy consists of various dimensions which can be grouped under two higher-order terms: detection method and evaluation criteria. Conclusion: The taxonomy provides a basis for comparison among different defense techniques. Organizations can use our taxonomy to choose suitable ones depending on their available resources and environments. Moreover, this classification can lead to a number of future research directions in the field of SQL injection.

  • 254.
    Asghar, Gulfam
    et al.
    Blekinge Institute of Technology, School of Computing.
    Azmi, Qanit Jawed
    Blekinge Institute of Technology, School of Computing.
    Security Issues of SIP, 2010. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Voice over IP (VoIP) services based on the Session Initiation Protocol (SIP) have gained much attention compared to other protocols like H.323 or MGCP over the last decade. SIP is the favorite signaling protocol for current and future IP telephony services, and it is also becoming a real competitor to traditional telephony services. However, the open architecture of SIP leaves the provided services vulnerable to different types of security threats, similar in nature to those currently existing on the Internet. For this reason, there is an obvious need to provide security mechanisms for SIP-based VoIP implementations. In this research, we will discuss the security threats to SIP and highlight the related open issues. Although there are many threats to SIP security, we will focus mainly on session hijacking and DoS attacks. We will demonstrate these types of attacks by introducing a model/practical test environment. We will also analyze the effect and performance of some of the proposed solutions, namely the use of Network Address Translation (NAT), IPSec, Virtual Private Networks (VPNs) and Firewalls (IDS/IPS), with the help of a test scenario.

  • 255.
    Asghari, Negin
    Blekinge Institute of Technology, School of Computing.
    Evaluating GQM+Strategies Framework for Planning Measurement System, 2012. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Context. Most organizations are aware of the significance of software measurement programs to help organizations assess and improve the ways they develop software. Measurement plays a vital role in improving software processes and products. However, the number of failing measurement programs is high and the reasons vary. A recent approach for planning measurement programs is GQM+Strategies, which makes an important extension to existing approaches: it links measurements and improvement activities to strategic goals and to ways to achieve these goals. However, concrete guidance on how to collect the information needed to use GQM+Strategies is not yet provided in the literature. Objectives. The contribution of this research is to propose and assess an elicitation approach (the Goal Strategy Elicitation (GSE) approach) for the information needed to apply GQM+Strategies in an organization, which also leads to a partial evaluation of GQM+Strategies as such. In this thesis, the initial focus is placed on eliciting the goals and strategies in the most efficient way. Methods. The primary research approach used is action research, which allows a new method or technique to be assessed flexibly in an iterative manner, where the feedback from one iteration is taken into the next, thus improving the proposed method or technique. Complementary to that, we used a literature review with the primary focus of positioning the work, exploring GQM+Strategies, and determining which elicitation approaches for the support of measurement programs have been proposed. Results. The Goal Strategy Elicitation (GSE) approach, a tool for eliciting goals and strategies within a software organization to contribute to planning a measurement program, has been developed. The iterations showed that the elicitation approach should not be too structured (e.g. template/notation based), but rather should support the stakeholders in expressing their thoughts relatively freely. Hence, the end result was an interview guide, not based on notations (as in the first iteration), asking questions in a way that interviewees can express themselves easily without having to, e.g., distinguish definitions of goals from strategies. Conclusions. We conclude that the GSE approach is a strong tool for a software organization to elicit the goals and strategies that support GQM+Strategies. The GSE approach evolved in each iteration; the latest iteration together with the guideline is still used within the studied company for eliciting goals and strategies, and the organization acknowledged that it will continue to do so. Moreover, we conclude that there is a need for further empirical validation of the GSE approach in full-scale industry trials.

  • 256.
    Ashfaq, Rana Aamir Raza
    et al.
    Blekinge Institute of Technology, School of Computing.
    Khan, Mohammad Qasim
    Blekinge Institute of Technology, School of Computing.
    Analyzing Common Criteria Shortcomings to Improve its Efficacy, 2009. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Information security has become a key concern for organizations conducting business in the current electronic era. Rapid technological development continuously creates novel security threats, making IT an uncertain infrastructure. Security is thus an important factor for vendors as well as for consumers. To fulfill security needs, IT companies have to adopt standards to assure certain security levels in their products. Common Criteria (CC) is one of the standards that maintain and control the security of IT products. Many other standards are also available to assure product security, but like those standards, CC has its own pros and cons. It does not impose predefined security rules that a product should exhibit, but rather provides a language for security evaluation. CC has certain advantages due to its ability to address three dimensions: a) it provides an opportunity for users to specify their security requirements, b) it offers an implementation guide for developers, and c) it provides comprehensive criteria to evaluate the security requirements. On the downside, it requires a considerable amount of resources and is quite time consuming. Another drawback is that the security requirements it evaluates must be defined before the project starts, which is in direct conflict with the rapidly changing security threat environment. In this research thesis we will analyze the core issues and find the major causes of the criticism. Many IT users in the USA and the UK have reservations about CC evaluation because of its limitations. We will analyze the shortcomings of CC and document them, which will be useful for researchers wanting an overview of the shortcomings associated with CC. This study can potentially strengthen CC usage with a more effective and responsive evaluation methodology for the IT community.

  • 257.
    Ashraf, Imran
    et al.
    Blekinge Institute of Technology, School of Computing.
    Khokhar, Amir Shahzed
    Blekinge Institute of Technology, School of Computing.
    Principles for Distributed Databases in Telecom Environment, 2010. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Centralized databases are becoming a bottleneck for organizations that are physically distributed and access data remotely. Data management is easy in centralized databases. However, it carries a high communication cost and, most importantly, a high response time. The concept of distributing the data over various locations is very attractive for such organizations. In such cases the database is divided into fragments and distributed to the locations where it is needed. This kind of distribution provides local control of data, and data access is also very fast in such databases. However, concurrency control, query optimization and data allocation are factors that affect the response time and must be investigated prior to implementing distributed databases. This thesis uses a mixed-method approach to meet its objective. In the quantitative part, we performed an experiment to compare the response time of two databases: centralized and fragmented/distributed. The experiment was performed at Ericsson. A literature review was also done to find other important response-time-related issues like query optimization, concurrency control and data allocation. The literature review revealed that these factors can further improve the response time in a distributed environment. Results of the experiment showed a substantial decrease in response time due to fragmentation and distribution.

  • 258.
    Asif, Sajjad
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Investigating Web Size Metrics for Early Web Cost Estimation, 2018. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context. Web engineering is a new research field which utilizes engineering principles to produce quality web applications. Web applications have become more complex with the passage of time, and it is quite difficult to analyze the web metrics for estimation due to the wide range of web applications. Correct estimates for web development effort play a very important role in the success of large-scale web development projects.

    Objectives. In this study I investigated size metrics and cost drivers used by web companies for early web cost estimation. I also aim to obtain validation through industrial interviews and a web quote form. This form is designed based on the most frequently occurring metrics after analyzing different companies. Secondly, this research aims to revisit previous work done by Mendes (a senior researcher and contributor in this research area) to validate whether early web cost estimation trends are the same or have changed. The ultimate goal is to help companies in web cost estimation.

    Methods. The first research question is answered by conducting an online survey of 212 web companies and finding their web predictor forms (quote forms). All companies included in the survey used web forms to give quotes on web development projects based on gathered size and cost measures. The second research question is answered by finding the most frequently occurring size metrics from the results of Survey 1. The list of size metrics is validated by two methods: (i) industrial interviews conducted with 15 web companies to validate the results of the first survey, and (ii) a quote form designed using the validated results from the industrial interviews and sent to web companies around the world to seek data on real web projects. Data gathered from the web projects are analyzed using a CBR tool, and the results are validated against the industrial interview results along with Survey 1. The final results are compared with the earlier research to answer the third research question: whether size metrics have changed. All research findings are contributed to the Tukutuku research benchmark project.

    Results. “Number of pages/features” and “responsive implementation” are the top web size metrics for early web cost estimation.

    Conclusions. This research investigated metrics which can be used for web cost estimation at the early stage of web application development. This is the stage where the application is not built yet, but requirements are being collected and an expected cost estimation is being evaluated. A list of new metric variables that can be added to the Tukutuku project is presented.

  • 259.
    Asim, Muhammad Ahsan
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Network Testing in a Testbed Simulator using Combinatorial Structures, 2008. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    This report covers one of the most demanding issues for network users, i.e., network testing. Network testing in this study is about performance evaluation of networks, gradually applying traffic load to determine the queuing delay for different traffic. Testing of such operations is becoming complex and necessary due to the use of real-time applications such as voice and video traffic, in parallel with the elastic data of ordinary applications over WAN links. Large volumes of elastic data occupy almost 80% of resources and cause delay for time-sensitive traffic. Performance parameters like service outage, delay, packet loss and jitter are tested to assure the reliability of the Quality of Service (QoS) provided in Service Level Agreements (SLAs). Normally these network services are tested after deployment of the physical networks. In this case customers often have to experience unavailability (outage) of network services due to increased levels of load and stress. From a user-centric point of view these outages are violations and must be avoided on the net-centric end. In order to meet these challenges, network SLAs are tested on simulators in a lab environment. This study provides a solution to this problem in the form of a testbed simulator named the Combinatorial TestBed Simulator (CTBS). A prototype of this simulator was developed for conducting the experiment. It provides a systematic approach of combinatorial structures for finding traffic patterns that exceed the queuing delay limit committed in SLAs. Combinatorics is a branch of mathematics that deals with discrete and normally finite elements. In the design of CTBS, the technique of combinatorics is used to generate a variety of test data that cannot be generated manually for testing the given network scenario. To validate the design of CTBS, results obtained by pilot runs are compared with results calculated using a timeline. After validation of the CTBS design, the actual experiment was conducted to determine the set of traffic patterns that exceed the threshold value of queuing delay for Voice over Internet Protocol (VoIP) traffic.
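    As an illustration of the combinatorial idea, the following sketch enumerates traffic-mix test patterns the way a testbed could; the traffic classes and load levels are hypothetical, not taken from CTBS:

    ```python
    # Sketch: enumerating combinations of per-class traffic loads, the kind of
    # systematic test-data generation a combinatorial testbed could use.
    from itertools import product

    classes = ["voice", "video", "elastic"]
    load_levels = [10, 40, 80]  # offered load per class, percent of link capacity

    def patterns():
        """Yield every assignment of a load level to each traffic class."""
        for combo in product(load_levels, repeat=len(classes)):
            yield dict(zip(classes, combo))

    # Example use: keep only patterns whose total load could stress queuing delay.
    stress_patterns = [p for p in patterns() if sum(p.values()) >= 100]
    print(len(stress_patterns), stress_patterns[0])
    ```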

  • 260.
    Ask, Anna Vikström
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Reasons for fire fighting in projects, 2003. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    This work is a study examining the causes of fire fighting in software projects. Fire fighting is the practice of reactive management, i.e., focus is put on solving the problem of the moment. The study in the thesis is performed in two parts: one part is a literature study examining what academia considers to be the reasons for fire fighting and how to minimise the problem. The other part of the thesis is an interview series performed in industry with the purpose of finding what practitioners consider the causes of the fire fighting phenomenon. The interview series indicates that the main causes are problems related to requirements and problems caused by persons with key knowledge leaving the project.

  • 261.
    Asklund, Ulf
    et al.
    Lund University, SWE.
    Höst, Martin
    Lund University, SWE.
    Wnuk, Krzysztof
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Experiences from Monitoring Effect of Architectural Changes, 2016. In: Software Quality: The Future of Systems- and Software Development / [ed] Winkler, Dietmar; Biffl, Stefan; Bergsmann, Johannes, 2016, p. 97-108. Conference paper (Refereed)
    Abstract [en]

    A common situation is that an initial architecture has been sufficient in the initial phases of a project, but when the size and complexity of the product increase, the architecture must be changed. In this paper experiences are presented from changing an architecture into independent units, providing basic reuse of main functionality although giving higher priority to independence than to reuse. An objective was also to introduce metrics in order to monitor the architectural changes. The change was studied in a case study through weekly meetings with the team, collected metrics, and questionnaires. The new architecture was well received by the development team, who found it to be less fragile. Concerning the metrics for monitoring, it was concluded that a high abstraction level was useful for the purpose.

  • 262.
    Askwall, Niklas
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    Utvärderingsmetod Säkerhetskultur: Ett första steg i en valideringsprocess [Evaluation Method for Security Culture: A first step in a validation process], 2013. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    Companies today invest a great deal of money in securing their physical and logical assets with technical protection mechanisms. However, all security in some way depends on the judgment and knowledge of the individual. How can one determine that the organization can trust the individual's judgment and knowledge? How can one determine whether an organization has a good culture around security? By evaluating the security culture, organizations can obtain a broader basis for their risk management work and a better ability to handle whatever threatens the organization's assets. Current research in the field of security culture disagrees both on what constitutes a good security culture and, above all, on how the culture should be evaluated. This research effort is thus an attempt to develop an intuitive evaluation method that organizations can use to evaluate their security culture. The evaluation method resembles a gap analysis in which an organization's desired culture is established and data is collected through a questionnaire survey. The data is compiled and used to create an index for the current culture in comparison with the desired culture. In this initial attempt, the reliability of the survey is tested using Cronbach's alpha, and the validity is tested through a form of confirmatory factor analysis. The results show how an index representing an organization's security culture is created. Good reliability of the evaluation method can be demonstrated, and the author finds good arguments for the usefulness of such a method in proactive security work. However, circumstances have made it very difficult to demonstrate good validity in this initial study.
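    For reference, the reliability coefficient mentioned above is the standard Cronbach's alpha for k questionnaire items:

    ```latex
    % Cronbach's alpha: \sigma^2_{Y_i} is the variance of item i and
    % \sigma^2_X the variance of the total score across respondents.
    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)
    ```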

  • 263.
    Asl, Babak Ghafary
    Blekinge Institute of Technology, School of Engineering.
    A Computer Aided Detection System for Cerebral Microbleeds in Brain MRI, 2012. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    Advances in MR technology have improved the potential for visualization of small lesions in brain images. This has resulted in the opportunity to detect cerebral microbleeds (CMBs), small hemorrhages in the brain that are known to be associated with risk of ischemic stroke and intracerebral bleeding. Currently, no computerized method is available for fully- or semi-automated detection of CMBs. In this paper, we propose a CAD system for the detection of CMBs to speed up visual analysis in population-based studies. Our method consists of four steps: (i) skull-stripping, (ii) initial candidate selection, (iii) reduction of false-positives using a two-layer classification and (iv) determining the anatomical location of CMBs. The training and test sets consist of 156 subjects (448 CMBs) and 81 subjects (183 CMBs), respectively. Geometrical, intensity-based and local image descriptor features were used in the classification steps. The sensitivity for CMB detection was 90% with, on average, 4 false-positives per subject.
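    A minimal sketch of the two-layer false-positive reduction in step (iii), assuming scikit-learn and synthetic features; the actual features, classifiers and thresholds in the thesis may differ:

    ```python
    # Sketch: stage 1 is a cheap, high-sensitivity filter; stage 2 is a stronger
    # classifier applied only to the surviving candidates. Data is synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))      # candidate features (geometry, intensity, ...)
    y = rng.integers(0, 2, size=500)   # 1 = true CMB, 0 = false positive

    # Layer 1: low threshold so almost all true CMBs pass through.
    layer1 = LogisticRegression().fit(X, y)
    keep = layer1.predict_proba(X)[:, 1] > 0.2

    # Layer 2: stronger classifier on the surviving candidates only.
    layer2 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[keep], y[keep])
    final = layer2.predict(X[keep])
    print(final.sum(), "candidates kept as CMBs")
    ```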

  • 264.
    Aslam, Khurum
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Khurum, Mahvish
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    A Model for Early Requirements Triage and Selection Utilizing Product Strategies, 2007. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    In market-driven product development, large numbers of requirements flow in continuously. It is critical for product management to select the requirements aligned with overall business goals and discard others as early as possible. It has been suggested in the literature to utilize product strategies for early requirements triage and selection. However, no explicit method/model/framework has been suggested for how to do it. This thesis presents a model for early requirements triage and selection utilizing product strategies, based on a literature study and interviews with people at two organizations about their requirements triage and selection processes and product strategy formulation. The model is validated statically within the same two organizations.

  • 265. Aspvall, Bengt
    et al.
    Halldorsson, MM
    Manne, F
    Approximations for the general block distribution of a matrix, 2001. In: Theoretical Computer Science, ISSN 0304-3975, E-ISSN 1879-2294, Vol. 262, no 1-2, p. 145-160. Article in journal (Refereed)
    Abstract [en]

    The general block distribution of a matrix is a rectilinear partition of the matrix into orthogonal blocks such that the maximum sum of the elements within a single block is minimized. This corresponds to partitioning the matrix onto parallel processors so as to minimize processor load while maintaining regular communication patterns. Applications of the problem include various parallel sparse matrix computations, compilers for high-performance languages, particle-in-cell computations, video and image compression, and simulations associated with a communication network. We analyze the performance guarantee of a natural and practical heuristic based on iterative refinement, which has previously been shown to give good empirical results. When p² is the number of blocks, we show that the tight performance ratio is Θ(√p). When the matrix has rows of large cost, the details of the objective function of the algorithm are shown to be important, since a naive implementation can lead to an Ω(p) performance ratio. Extensions to more general cost functions, higher-dimensional arrays, and randomized initial configurations are also considered. © 2001 Elsevier Science B.V. All rights reserved.
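    To make the objective function concrete, the following sketch evaluates the cost that the iterative-refinement heuristic tries to minimize; this is only the cost function, not the paper's algorithm:

    ```python
    # Sketch: the cost of a general block distribution is the maximum block sum
    # over the rectilinear partition defined by row and column cut points.
    import numpy as np

    def max_block_sum(matrix, row_cuts, col_cuts):
        """Cost of the rectilinear partition defined by the cut points."""
        cost = 0.0
        for r0, r1 in zip([0] + row_cuts, row_cuts + [matrix.shape[0]]):
            for c0, c1 in zip([0] + col_cuts, col_cuts + [matrix.shape[1]]):
                cost = max(cost, matrix[r0:r1, c0:c1].sum())
        return cost

    M = np.arange(36.0).reshape(6, 6)
    print(max_block_sum(M, row_cuts=[2, 4], col_cuts=[3]))  # 3 x 2 = 6 blocks
    ```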

  • 266. Aspvall, Bengt
    et al.
    Pettersson, Eva
    Från datorernas värld [From the World of Computers], 2007. In: Nämnaren, ISSN 0348-2723, Vol. 34, no 2, p. 44-48. Article in journal (Refereed)
  • 267. Astor, Philipp
    et al.
    Adam, Marc
    Jerčić, Petar
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Schaaff, Kristina
    Weinhardt, Christof
    Integrating biosignals into information systems: A NeuroIS tool for improving emotion regulation, 2013. In: Journal of Management Information Systems, ISSN 0742-1222, E-ISSN 1557-928X, Vol. 30, no 3, p. 247-277. Article in journal (Refereed)
    Abstract [en]

    Traders and investors are aware that emotional processes can have material consequences on their financial decision performance. However, typical learning approaches for debiasing fail to overcome emotionally driven financial dispositions, mostly because of subjects' limited capacity for self-monitoring. Our research aims at improving decision makers' performance by (1) boosting their awareness of their emotional state and (2) improving their skills for effective emotion regulation. To that end, we designed and implemented a serious game-based NeuroIS tool that continuously displays the player's individual emotional state, via biofeedback, and adapts the difficulty of the decision environment to this emotional state. The design artifact was then evaluated in two laboratory experiments. Taken together, our study demonstrates how information systems design science research can contribute to improving financial decision making by integrating physiological data into information technology artifacts. Moreover, we provide specific design guidelines for how biofeedback can be integrated into information systems.

  • 268.
    Ataeian, Seyed Mohsen
    et al.
    Blekinge Institute of Technology, School of Computing.
    Darbandi, Mehrnaz Jaberi
    Blekinge Institute of Technology, School of Computing.
    Analysis of Quality of Experience by applying Fuzzy logic: A study on response time, 2011. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    To be successful in today's competitive market, service providers should look at users' satisfaction as a critical key. In order to gain a better understanding of customers' expectations, a proper evaluation which considers the intrinsic characteristics of perceived quality of service is needed. Due to the subjective nature of quality, the vagueness of human judgment and the uncertainty about the degree of users' linguistic satisfaction, fuzziness is associated with quality of experience. Considering the capability of fuzzy logic in dealing with imprecision and qualitative knowledge, it would be wise to apply it as a powerful mathematical tool for analyzing the quality of experience (QoE). This thesis proposes a fuzzy procedure to evaluate the quality of experience. In our proposed methodology, we provide a fuzzy relationship between QoE and Quality of Service (QoS) parameters. To identify this fuzzy relationship, a new term called the Fuzzified Opinion Score (FOS), representing a fuzzy quality scale, is introduced. A fuzzy data mining method is applied to construct the required number of fuzzy sets. Then, the appropriate membership functions describing the fuzzy sets are modeled and compared with each other. The proposed methodology will assist service providers in better decision-making and resource management.
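    A small sketch of the membership-function idea follows; the triangular shapes and breakpoints are assumptions, since the thesis derives its sets with a fuzzy data mining method:

    ```python
    # Sketch: a triangular membership function and a tiny fuzzy quality scale,
    # mapping a response-time measurement to fuzzy satisfaction grades.
    def triangular(x, a, b, c):
        """Membership degree of x in a triangular fuzzy set (a, b, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Hypothetical fuzzy sets over response time in milliseconds.
    sets = {
        "good":       lambda t: triangular(t, 0, 100, 400),
        "acceptable": lambda t: triangular(t, 200, 500, 900),
        "poor":       lambda t: triangular(t, 600, 1200, 2000),
    }

    t = 450  # measured response time
    memberships = {name: round(mu(t), 2) for name, mu in sets.items()}
    print(memberships)  # degrees to which 450 ms is good/acceptable/poor
    ```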

  • 269.
    Atilmis, Birkan
    et al.
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Hoff, Linda
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    IPv6: Inte längre frågan OM och inte så mycket NÄR utan snarare HUR! [IPv6: No longer a question of IF, and not so much WHEN, but rather HOW!], 2001. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    Problem area: Today the Internet has become every man's property. Unfortunately, this entails quite a few problems. The clearest one we face today is that IP addresses are running out. To get rid of the problem, various temporary solutions ("patching techniques") are used, but a permanent solution is also being developed, namely IPv6 (Internet Protocol version 6). The new protocol solves the address shortage but also has many other functions, such as security and better routing solutions. We therefore asked ourselves why no transition has taken place despite these advantages. Research questions: Where in the transition between IPv4 and IPv6 do we stand today? Why has the transition from IPv4 to IPv6 not already happened? What are the main reasons, and what other possible ones are there? Conclusion: The work shows that the time for a transition has not yet come. The main reasons are that products are lacking, as is a general incentive for migration.

  • 270.
    Auer, Florian
    et al.
    University of Innsbruck, AUT.
    Felderer, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Lenarduzzi, Valentina
    Tampere University of Technology, FIN.
    Towards defining a microservice migration framework, 2018. In: ACM International Conference Proceeding Series, Association for Computing Machinery, 2018, Vol. Part F147763. Conference paper (Refereed)
    Abstract [en]

    Microservices are becoming more and more popular. As a result, some companies have started to believe that microservices are the solution to all of their problems and rush to adopt microservices without sufficient knowledge of the impacts. Most of the time they expect to decrease their maintenance effort or to ease the deployment process. However, re-architecting a system to microservices is not always beneficial. In this work we propose a work plan to identify a decision framework that supports practitioners in understanding possible migration-based benefits and issues. This will lead to more reasoned decisions and mitigate the risk of migration. © 2018 Copyright held by the owner/author(s).

  • 271.
    Augustsson, Christopher
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Dynamic vs Static user-interface: Which one is easier to learn? And will it make you more efficient?, 2019. Independent thesis Basic level (university diploma), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Excel offers great flexibility and allows non-programmers to create complex functionality, but at the same time it can become very nested, with cells pointing to other cells, especially if there have been many changes over an extended period. This has happened to ICS, a small company whose focus is on calibration, among an array of different services relating to material testing. The system they have for field calibrations today has become overly complicated and hard to maintain, and it consists of multiple Excel spreadsheets. The conclusion has been that a new system needs to be developed, but the question of how remains. By creating a prototype using modern web technologies, this study has evaluated whether a web application can meet the specific functional requirements ICS has and whether it is a suitable solution for a new system. The prototype was put under manual user tests, and the results show that the prototype meets all the requirements, meaning that a web application could work as a replacement. During the user tests, this study has also evaluated the differences in learnability and efficiency between the static user interface of the current Excel-based system and the dynamic user interface of the web-based prototype. The users performed a calibration with both systems, and parameters such as time to completion and number of errors made were recorded. By comparing the test results from both systems, this study has concluded that a dynamic user interface is more likely to improve learnability for novice users, but has a low impact on efficiency for expert users.

  • 272. Aurum, Aybüke
    et al.
    Jeffery, Ross; Wohlin, Claes; Handzic, Meliha
    Managing Software Engineering Knowledge, 2003. Collection (editor) (Other academic)
  • 273. Aurum, Aybüke
    et al.
    Petersson, Håkan
    Wohlin, Claes
    State-of-the-art: Software Inspections after 25 Years, 2002. In: Software testing, verification & reliability, ISSN 0960-0833, E-ISSN 1099-1689, Vol. 12, no 3, p. 133-154. Article in journal (Refereed)
    Abstract [en]

    Software inspections, which were originally developed by Michael Fagan in 1976, are an important means to verify and achieve sufficient quality in many software projects today. Since Fagan's initial work, the importance of software inspections has been long recognized by software developers and many organizations. Various proposals have been made by researchers in the hope of improving Fagan's inspection method. The proposals include structural changes to the process and several types of support for the inspection process. Most of the proposals have been empirically investigated in different studies. This is a review paper focusing on the software inspection process in the light of Fagan's inspection method and it summarizes and reviews other types of software inspection processes that have emerged in the last 25 years. This paper also addresses important issues related to the inspection process and examines experimental studies and their findings that are of interest with the purpose of identifying future avenues of research in software inspection.

  • 274. Aurum, Aybüke
    et al.
    Wohlin, Claes
    A Value-Based Approach in Requirements Engineering: Explaining Some of the Fundamental Concepts, 2007. Conference paper (Refereed)
  • 275. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Aligning Requirements with Business Objectives: A Framework for Requirements Engineering Decisions, 2005. Conference paper (Refereed)
    Abstract [en]

    As software development continues to increase in complexity, involving far-reaching consequences, there is a need for decision support to improve the decision making process in requirements engineering (RE) activities. This research begins with a detailed investigation of the complexity of decision making during RE activities on organizational, product and project levels. Secondly, it presents a conceptual model which describes the RE decision making environment in terms of stakeholders, information requirements, decision types and business objectives. The purpose of this model is to facilitate the development of decision support systems in RE and to help further structure and analyse the decision making process in RE.

  • 276. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Applying Decision-Making Models in Requirements Engineering, 2002. Conference paper (Refereed)
  • 277. Aurum, Aybüke
    et al.
    Wohlin, Claes
    The Fundamental Nature of Requirements Engineering Activities as a Decision-Making Process, 2003. In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, Vol. 45, no 14, p. 945-954. Article in journal (Refereed)
    Abstract [en]

    The requirements engineering (RE) process is a decision-rich complex problem solving activity. This paper examines the elements of organization-oriented macro decisions as well as process-oriented micro decisions in the RE process and illustrates how to integrate classical decision-making models with RE process models. This integration helps in formulating a common vocabulary and model to improve the manageability of the RE process, and contributes towards the learning process by validating and verifying the consistency of decision-making in RE activities.

  • 278. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Petersson, Håkan
    Increasing the Understanding of Effectiveness in Software Inspections Using Published Data Sets, 2005. In: Journal of Research and Practice in Information Technology, ISSN 1443-458X, Vol. 37, no 3, p. 51-64. Article in journal (Refereed)
    Abstract [en]

    Since its introduction into software engineering, software inspection has been viewed as a cost-effective way of increasing software quality. Despite this, many questions remain unanswered regarding, for example, ideal team size or cost effectiveness. This paper addresses some of these questions by performing an analysis using 30 published data sets from empirical experiments on software inspections. The main question concerns determining a suitable team size for software inspections. The effectiveness of different team sizes is also studied. Furthermore, the differences in mean effectiveness between different team sizes are investigated based on the inspection environmental context, document types and reading technique. It is concluded that it is possible to choose a suitable team size based on the effectiveness of inspections. This can be used as a tool to assist in the planning of inspections. A particularly interesting result is that the variation in effectiveness between different teams is considerably higher for certain types of documents than for others. Our findings contain important information for anyone planning, controlling or managing software inspections.

  • 279. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Porter, A.
    Aligning Software Project Decisions: A Case Study, 2006. In: International Journal of Software Engineering and Knowledge Engineering, ISSN 0218-1940, Vol. 16, no 6, p. 795-818. Article in journal (Refereed)
  • 280. Avdonina, Elena D.
    et al.
    Ibragimov, Nail H.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Heat conduction in anisotropic media: Nonlinear self-adjointness and conservation laws, 2012. In: Discontinuity, Nonlinearity and Complexity, ISSN 2164-6376, Vol. 1, no 3, p. 237-251. Article in journal (Refereed)
    Abstract [en]

    Nonlinear self-adjointness of the anisotropic nonlinear heat equation is investigated. Mathematical models of heat conduction in anisotropic media with a source are considered and a class of self-adjoint models is identified. Conservation laws corresponding to the symmetries of the equations in question are computed.
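    For orientation, a generic anisotropic nonlinear heat equation with a source term can be written as follows; this fixes notation only and is not necessarily the exact class of equations studied in the paper:

    ```latex
    % Direction-dependent conductivities f_1, f_2, f_3 and a source q(u);
    % a generic form of anisotropic nonlinear heat conduction (assumed here).
    u_t = \big(f_1(u)\,u_x\big)_x + \big(f_2(u)\,u_y\big)_y + \big(f_3(u)\,u_z\big)_z + q(u)
    ```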

  • 281. Avdonina, Elena D.
    et al.
    Ibragimov, Nail H.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Khamitova, Raisa
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Exact solutions of gasdynamic equations obtained by the method of conservation laws, 2013. In: Communications in nonlinear science & numerical simulation, ISSN 1007-5704, E-ISSN 1878-7274, Vol. 18. Article in journal (Refereed)
    Abstract [en]

    In the present paper, the recent method of conservation laws for constructing exact solutions for systems of nonlinear partial differential equations is applied to the gasdynamic equations describing one-dimensional and three-dimensional polytropic flows. In the one-dimensional case singular solutions are constructed in closed form. In the three-dimensional case several conservation laws are used simultaneously. It is shown that the method of conservation laws leads to particular solutions different from group invariant solutions.

  • 282. Avritzer, Alberto
    et al.
    Beecham, Sarah
    Britto, Ricardo
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Kroll, Josiane
    Menaché, Daniel
    Noll, John
    Paasivaara, Maria
    Extending Survivability Models for Global Software Development with Media Synchronicity Theory, 2015. In: Proceedings of the IEEE 10th International Conference on Global Software Engineering, IEEE Communications Society, 2015, p. 23-32. Conference paper (Refereed)
    Abstract [en]

    In this paper we propose a new framework to assess the survivability of software projects, accounting for media capability details as introduced in Media Synchronicity Theory (MST). Specifically, we add to our global engineering framework an assessment of the impact of inadequate conveyance and convergence, available in the communication infrastructure selected for the project, on the system's ability to recover from project disasters. We propose an analytical model to assess how the project recovers from project disasters related to process and communication failures. Our model is based on media synchronicity theory to account for how information exchange impacts recovery. Then, using the proposed model, we evaluate how different interventions impact communication effectiveness. Finally, we parameterize and instantiate the proposed survivability model based on a data-gathering campaign comprising thirty surveys collected from senior global software development experts at ICGSE'2014 and GSD'2015.

  • 283.
    Avutu, Neeraj
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Performance Evaluation of MongoDB on Amazon Web Service and OpenStack, 2018. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context

    MongoDB is an open-source, scalable NoSQL database that distributes data over many commodity servers. It avoids a single point of failure by copying and storing data in different locations. MongoDB uses a master-slave design rather than the ring topology used by Cassandra. Virtualization is the technique used for running multiple virtual machines on a single host and utilizing them; it is the fundamental technology which allows cloud computing to provide resource sharing among users.

    Objectives

    To study MongoDB and virtualization on AWS and OpenStack. Experiments were conducted to identify the CPU utilization when MongoDB instances are deployed on AWS and on a physical server arrangement, and to understand the effect of replication in MongoDB instances on throughput, CPU utilization and latency.

    Methods

    Initially, a literature review is conducted to design the experiment around the stated problems. A three-node MongoDB cluster runs on Amazon EC2 and OpenStack Nova with Ubuntu 16.04 LTS as the operating system. Latency, throughput and CPU utilization were measured using this setup. This procedure was repeated for a five-node MongoDB cluster and a three-node production cluster with six types of YCSB workloads.

    Results

    Virtualization overhead has been identified in terms of CPU utilization, and the effects of virtualization on MongoDB are reported in terms of CPU utilization, latency and throughput.

    Conclusions

    It is concluded that latency decreases and throughput increases as the number of nodes grows. An increase in latency was observed due to replication.
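    A minimal sketch of the kind of latency/throughput measurement described above, using pymongo against a placeholder host; the thesis used YCSB workloads rather than a hand-rolled loop:

    ```python
    # Sketch: timing inserts against a MongoDB node. Host, database and
    # collection names are placeholders, not the thesis setup.
    import time
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    coll = client.benchmark.records

    n, latencies = 1000, []
    start = time.perf_counter()
    for i in range(n):
        t0 = time.perf_counter()
        coll.insert_one({"_id": i, "payload": "x" * 100})
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    print(f"throughput: {n / elapsed:.0f} ops/s")
    print(f"avg latency: {1000 * sum(latencies) / n:.2f} ms")
    ```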

  • 284.
    Awan, Nasir Majeed
    et al.
    Blekinge Institute of Technology, School of Computing.
    Alvi, Adnan Khadem
    Blekinge Institute of Technology, School of Computing.
    Predicting software test effort in iterative development using a dynamic Bayesian network, 2010. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    It is important to manage iterative projects in a way that maximizes quality and minimizes cost. To achieve high quality, accurate project estimates are of high importance. It is challenging to predict the effort that is required to perform test activities in iterative development. If testers put extra effort into testing, the schedule might be delayed; however, if testers spend less effort, quality could be affected. Currently there is no model for test effort prediction in iterative development to overcome such challenges. This paper introduces and validates a dynamic Bayesian network to predict test effort in iterative software development. In this research work, the proposed framework is evaluated in a number of ways. First, the framework behavior is observed by considering different parameters and performing initial validation. Second, the framework is validated by incorporating data from two industrial projects. The accuracy of the results has been verified through different prediction accuracy measurements and statistical tests. The results from the verification confirmed that the framework has the ability to predict test effort in iterative projects accurately.
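    To illustrate the underlying mechanics, here is one filtering step of a small discrete dynamic Bayesian network; the states, priors and probability tables are invented, and the thesis defines its own network:

    ```python
    # Sketch: belief about a hidden "test effort level" is propagated from one
    # iteration to the next (predict), then corrected by an observation (update).
    states = ["low", "high"]
    prior = {"low": 0.6, "high": 0.4}  # belief before this iteration

    transition = {  # P(effort_t | effort_{t-1})
        "low":  {"low": 0.7, "high": 0.3},
        "high": {"low": 0.2, "high": 0.8},
    }
    emission = {  # P(observed defect inflow | effort_t)
        "low":  {"few": 0.8, "many": 0.2},
        "high": {"few": 0.3, "many": 0.7},
    }

    def filter_step(belief, observation):
        """Predict with the transition model, then correct with the observation."""
        predicted = {s: sum(belief[p] * transition[p][s] for p in states) for s in states}
        unnorm = {s: predicted[s] * emission[s][observation] for s in states}
        z = sum(unnorm.values())
        return {s: v / z for s, v in unnorm.items()}

    print(filter_step(prior, "many"))  # posterior belief after seeing many defects
    ```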

  • 285.
    Awan, Rashid
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Requirements Engineering Process Maturity Model for Market Driven Projects: The REPM-M Model, 2005. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    Several software projects are over budget or have to face failures during operations. One big reason for this is that a software company develops the wrong software due to misinterpretation of requirements. Requirements engineering is one of the well-known disciplines within software engineering that deal with this problem. RE is the process of eliciting, analyzing and specifying requirements so that there won't be any ambiguity between the development company and the customers. Another emerging discipline within requirements engineering is requirements engineering for market-driven projects. It deals with the requirements engineering of a product targeting a mass market. In this thesis, a maturity model is developed which can be used to assess the maturity of the requirements engineering process for market-driven projects. The objective of this model is to provide a quick assessment tool through which a company would be able to know what the strengths and weaknesses of its requirements engineering process are.

  • 286.
    Awan, Zafar Iqbal
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Azim, Abdul
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Network Emulation, Pattern Based Traffic Shaping and KauNET Evaluation, 2008. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Quality of Service is a major factor for a successful business in modern and future network services. A minimum level of service is assured, involving Quality of Experience for modern real-time communication, to sustain user satisfaction with perceived service quality. Traffic engineering can be applied to provide better services and to maintain or enhance user satisfaction through reactive and preventive traffic control mechanisms. Preventive traffic control can be more effective for managing network resources, through admission control, scheduling, policing and traffic shaping mechanisms, maintaining a minimum level before the service gets worse and affects user perception. Accuracy, dynamicity, uniformity and reproducibility are objectives of extensive research in network traffic. Real-time tests, simulation and network emulation are applied to test uniformity, accuracy, reproducibility and dynamicity. Network emulation is performed over an experimental network to test real-time applications, protocols and traffic parameters. DummyNet is a network emulator and traffic shaper which allows nondeterministic placement of packet losses, delays and bandwidth changes. The KauNet shaper is a network emulator which creates traffic patterns and applies these patterns for exact deterministic placement of bit errors, packet losses, delay changes and bandwidth changes. An evaluation of KauNet with different patterns for packet losses, delay changes and bandwidth changes in an emulated environment is part of this work. The main motivation for this work is to check the possibility of delaying and dropping the packets of a transfer/session in the same way as has happened before (during the observation period). This goal is achieved to some extent using KauNet, but some issues with pattern repetitions still need to be solved to get better results. The idea of history- and trace-based traffic shaping using KauNet is presented to make this possibility a reality.
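    A toy sketch of deterministic, pattern-driven loss placement of the kind KauNet provides, as opposed to DummyNet's probabilistic dropping; the pattern format here is an illustration only, not KauNet's actual format:

    ```python
    # Sketch: drop exactly the packets whose position appears in the pattern,
    # so a recorded trace can be replayed identically in every run.
    def apply_loss_pattern(packets, drop_positions):
        """Keep every packet except those at the 1-based positions in the pattern."""
        return [p for i, p in enumerate(packets, start=1) if i not in drop_positions]

    packets = [f"pkt-{i}" for i in range(1, 11)]
    pattern = {3, 4, 9}  # e.g. loss positions derived from an observed transfer
    print(apply_loss_pattern(packets, pattern))
    ```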

  • 287.
    Awomewe, Alaba-Femi
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Monitoring the volatility in a process which reflects trading in the financial market, 2007. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    Recently, the financial market has become an area of increased research interest for mathematicians and statisticians. The Black and Scholes breakthrough in this area triggered a lot of new research activity. Commonly the research concerns the log returns of assets (shares, bonds, foreign exchange, options). The variation in the log returns is called volatility, and it is widely studied because of its relevance for applications in the financial world. Volatility is mostly used for measuring risk and also for forecasting future prices. In this research work a process of trading activities is considered. It is assumed that at a random time-point a parameter change in the laws of the trading occurs, indicating changed trading behaviour. For inferential matters about the process it is of vital importance to be able to state quickly and accurately that such a change has occurred. The methods used to this end are called stopping rules, which signal an alarm as soon as some statistic based on on-line observations goes beyond some boundary. The model considered for this process of log returns is the family of Autoregressive Conditional Heteroskedastic (ARCH) models. It is widely accepted that these describe many phenomena in the financial market well. In this work statements about this process will be derived, and the stopping rule will be defined and evaluated and its properties discussed.
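    For reference, the simplest member of the ARCH family, ARCH(1), can be written as follows; the thesis may use a higher-order model:

    ```latex
    % The log return r_t has conditional variance driven by the previous squared
    % return. A change in (\alpha_0, \alpha_1) at a random time-point is what a
    % stopping rule is designed to detect.
    r_t = \sigma_t \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0,1),
    \qquad \sigma_t^2 = \alpha_0 + \alpha_1 r_{t-1}^2,
    \quad \alpha_0 > 0,\; \alpha_1 \ge 0
    ```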

  • 288.
    Axelsson, Arvid
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Light Field Coding Using Panoramic Projection, 2014. Student thesis
    Abstract [en]

    A new generation of 3d displays provides depth perception without the need for glasses and allows the viewer to see content from many different directions. Providing video for these displays requires capturing the scene by several cameras at different viewpoints, the data from which together forms light field video. Encoding such video with existing video coding requires a large amount of data, and it increases quickly with a higher number of views, which this application needs. One such coding is the multiview extension of High Efficiency Video Coding (MV-HEVC), which encodes a number of similar video streams as different layers. A new coding scheme for light field video, called Panoramic Light Field (PLF), is implemented and evaluated in this thesis. The main idea behind the coding is to project all points in a scene that are visible from any of the viewpoints to a single, global view, similar to how texture mapping maps a texture onto a 3d model in computer graphics. Whereas objects ordinarily shift position in the frame as the camera position changes, this is not the case when using this projection. A visible point in space is projected to the same image pixel regardless of viewpoint, resulting in large similarities between images from different viewpoints. The similarity between the layers in light field video helps to achieve more efficient compression when the projection is combined with existing multiview coding. In order to evaluate the scheme, 3d content was created and software was developed to encode it using PLF. Video using this coding is compared to existing technology: a straightforward encoding of the views using MV-HEVC. The results show that the PLF coding performs better on the sample content at lower quality levels, while it is worse at higher bitrates due to quality loss from the projection procedure. It is concluded that PLF is a promising technology and suggestions are given for future research that may improve its performance further.
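    A sketch of the projection idea follows: mapping a world-space point to a pixel in a single global panorama, so that every viewpoint shares the mapping. An equirectangular mapping is assumed here; the thesis's PLF projection may differ in detail:

    ```python
    # Sketch: project a 3d point to (u, v) in a width x height panorama centred
    # at `center`. Because the mapping ignores the camera position, a visible
    # point lands on the same pixel in every view, which is the inter-view
    # similarity the codec exploits.
    import math

    def panoramic_pixel(point, center, width, height):
        x, y, z = (p - c for p, c in zip(point, center))
        lon = math.atan2(x, z)                 # longitude in [-pi, pi)
        lat = math.atan2(y, math.hypot(x, z))  # latitude in [-pi/2, pi/2]
        u = int((lon / (2 * math.pi) + 0.5) * (width - 1))
        v = int((0.5 - lat / math.pi) * (height - 1))
        return u, v

    print(panoramic_pixel((1.0, 0.5, 2.0), (0.0, 0.0, 0.0), 1024, 512))
    ```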

  • 289.
    Axelsson, Clara
    et al.
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Gustafsson, Charlotte
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Constituting Sesame: a minor field study of a cross-cultural cooperation, 2002. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [en]

    Internationally, universities are undergoing renewal because of technological and social changes that both increase the importance of open and flexible learning and make it practicable. Net-based education makes it possible to create courses where students can collaborate and share knowledge globally. Collaborative Learning in Virtual Communities, with the working title Sesame, is a cooperative project between Blekinge Institute of Technology (BTH) and the University of Pretoria (UP) that intends to provide such a course. The aim of this collaboration is to initiate research in, as well as to test and evaluate, methods for net-based collaborative learning to see how this can provide new perspectives for students and lecturers in both countries. One key concept in the project is "internationalisation at home", which means interaction and knowledge sharing between people from different countries and cultures, without them having to physically leave their country. In this thesis we describe the phase of constituting Sesame, focusing on the cooperation between the involved parties from the two countries.

  • 290. Axelsson, Edgar
    et al.
    Fathallah, Ahmed
    Rin Tohsaka – a Discord Bot for Community Management2018Independent thesis Basic level (Higher Education Diploma (Fine Arts)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    A Bachelor's-level thesis in Media Technology that aims to improve an existing concept of community development. The thesis shows how the definition of community has evolved and changed its meaning with the help of modern technology. New definitions are not always positive: new problems arise within the community regarding management, ethics and maintenance. The thesis analyses these problems and aims to solve them to a modern standard with the help of rhizomatic and participatory design. With the community shifting towards an online definition, the thesis uses the latest tools available in web technology to build a product that uses situated knowledge as a mindset and combines participatory design and rhizomatic design, in order to solve the ongoing problems that online communities face with maintainability and ethics.

  • 291.
    Axelsson, Elinor
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Ip-telefoni med Skype som ett alternativ till PSTN för privatanvändare2007Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    This work is a practical and theoretical test of IP telephony with Skype, forming the basis for a comparison with telephony over PSTN (Public Switched Telephone Network), the common telephone standard most of us use today. The aim of the work is to make it easier for private users in Sweden to choose between PSTN and IP telephony. The work is intended to answer the following questions. - How easy is it to get started with IP telephony via Skype? - How is the quality of IP telephony calls compared to PSTN? - Do all the services available with PSTN also work with IP telephony? - What are the usability and the availability of help and support like for IP telephony? - Is it cheaper to call with IP telephony and, if so, under what conditions? A set of practical and theoretical investigations has been carried out to assess IP telephony with Skype in the following areas: installation, functionality, quality, usability, costs, availability and security. For the usability study, a test group of 10 people was used to evaluate the usability of the system. A practical test of the Skype client's functionality and quality was performed through a number of test calls. The availability, costs and security of the Skype solution have been studied in relevant literature and through facts on the Internet. The results of the study show that the Skype solution works as well as PSTN in terms of functionality and quality, but a certain level of computer experience is required to install and use the solution, which has a somewhat negative impact on usability. In terms of price, it only pays off for those who make many international calls; for other users it is usually considerably more expensive than PSTN telephony. Skype itself clearly states that it does not guarantee emergency call functionality, which is a major drawback for anyone wanting to replace their PSTN telephone with the Skype solution. For the reasons above, IP telephony with Skype is not a good alternative to PSTN for most users.

  • 292.
    Axelsson, Jonas
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Comparison of user accuracy and speed when performing 3D game target practice using a computer monitor and virtual reality headset2017Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Consumer grade Virtual Reality (VR)-headsets are on the rise, and with them comes an increasing number of digital games which support VR. How players perceive the gameplay and how well they perform at the game's tasks can be key factors in designing new games.

    This master’s thesis aims to evaluate whether a user can perform a game task, specifically target practice, in less time and/or more accurately when using a VR-headset as opposed to a computer screen and mouse. To gather statistics and measure the differences, an experiment was conducted using a test application developed alongside this report. The experiment recorded accuracy scores and time taken in tests performed by 35 test participants using both a VR-headset and a computer screen.

    The resulting data sets are presented in the results chapter of this report. A Kolmogorov-Smirnov normality test and Student’s paired-samples t-test were performed on the data to establish statistical significance; a sketch of such an analysis is given after this abstract. After analysis, the results are reviewed and discussed, and conclusions are drawn.

    This study concludes that when performing the experiment, the use of a VR-headset decreased the users' accuracy and, to a lesser extent, also increased the time the users took to hit all targets. An argument was made that most users' longer previous experience with a computer screen and mouse gave this method an unfair advantage. With equally long training, VR use might score similar results.
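    As a hedged illustration of the analysis described above (not the thesis's actual code; the participant scores below are made up and truncated for brevity), the two tests map directly onto scipy.stats: a normality check of the paired differences followed by a paired-samples t-test.

    ```python
    from scipy import stats

    # Hypothetical paired accuracy scores for the same participants.
    vr_scores = [0.71, 0.64, 0.69, 0.75, 0.62]       # VR-headset condition
    monitor_scores = [0.82, 0.78, 0.80, 0.85, 0.74]  # monitor-and-mouse condition

    diffs = [v - m for v, m in zip(vr_scores, monitor_scores)]

    # Kolmogorov-Smirnov test of the differences against a fitted normal
    # distribution (a common normality check before a paired t-test).
    mean = sum(diffs) / len(diffs)
    std = (sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)) ** 0.5
    ks_stat, ks_p = stats.kstest(diffs, "norm", args=(mean, std))

    # Student's paired-samples t-test on the two conditions.
    t_stat, t_p = stats.ttest_rel(vr_scores, monitor_scores)
    print(ks_p, t_p)  # large ks_p supports normality; small t_p indicates a condition effect
    ```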

  • 293.
    Axelsson, Mattias
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Sonesson, Johan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Business Process Performance Measurement for Rollout Success2004Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Business process improvement for increased product quality is of continuous importance in the software industry. Quality managers in this sector need effective, hands-on tools for decision-making in engineering projects and for rapidly spotting key improvement areas. Measurement programs are a widespread approach for introducing quality improvement in software processes, yet employing all-embracing state-of-the-art quality assurance models is labor-intensive. Unfortunately, these models do not primarily focus on measures, revealing a need for an instant and straightforward technique for identifying and defining measures in projects that lack the resources or need for entire measurement programs. This thesis explores and compares prevailing quality assurance models that use measures, resulting in the Measurement Discovery Process, constructed from selected parts of the PSM and GQM techniques. The composed process is applied to an industrial project with the given prerequisites, providing a set of measures that are subsequently evaluated. In addition, the application provides a foundation for analysis of the Measurement Discovery Process. The application and analysis of the process show its general applicability to projects with similar constraints, as well as the importance of formal target processes and exhaustive project domain knowledge among measurement implementers. Even though the Measurement Discovery Process is subject to future refinement, it is clearly a step towards rapid delivery of tangible business performance indicators for process improvement.

  • 294.
    Axelsson, Mattis
    et al.
    Blekinge Institute of Technology, School of Planning and Media Design.
    Larsson, Sara
    Blekinge Institute of Technology, School of Planning and Media Design.
    Utvecklande AI: En studie i hur man skapar ett system för lärande AI2013Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    AI has become more important in today's games and is under growing pressure to act human-like and intelligent. This thesis examines which methods are preferable when creating an AI that can learn from its previous experiences. Some of the methods examined are tree structures, Artificial Neural Networks and GoCap. By creating an application with one of the methods, and surveying how the AI in the application was perceived, we obtained results showing whether the method was functional. From this we discuss whether the other methods would have been more effective, how we could have improved the AI, and what the future holds for game AI.

  • 295.
    Axelsson, Olof
    Blekinge Institute of Technology, School of Planning and Media Design.
    Försäljningsmetoder för spel på Android Market2012Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    Smartphones running Android as their operating system have become an increasingly large part of our everyday lives in recent years. This bachelor's thesis investigates which sales method a developer should use to maximise profit when selling a game on the Android market. The sales methods studied are advertising, paid apps and in-app billing. From statistics collected by Google and from surveys, an understanding of which method is preferable has been developed. The results show that all of them work despite their mutual differences. Suggestions for how a mix of the sales methods can be used, and how to apply the right method to the right game, are discussed and presented.

  • 296. Axelsson, Stefan
    The Normalised Compression Distance as a File Fragment Classifier2010In: Digital Investigation. The International Journal of Digital Forensics and Incident Response, ISSN 1742-2876, E-ISSN 1873-202X, Vol. 7, no Suppl 1, p. S24-S31Article in journal (Refereed)
    Abstract [en]

    We have applied the generalised and universal distance measure NCD—Normalised Compression Distance—to the problem of determining the type of file fragments. To enable later comparison of the results, the algorithm was applied to fragments of a publicly available corpus of files. The NCD algorithm in conjunction with k-nearest-neighbour (k ranging from one to ten) as the classification algorithm was applied to a random selection of circa 3000 512-byte file fragments from 28 different file types. This procedure was then repeated ten times. While the overall accuracy of the n-valued classification only improved the prior probability from approximately 3.5% to circa 32%–36%, the classifier reached accuracies of circa 70% for the most successful file types. A prototype file fragment classifier was then developed and evaluated on a new set of data (from the same corpus). Circa 3000 fragments were selected at random and the experiment was repeated five times. This prototype classifier remained successful at classifying individual file types, with accuracies ranging from only slightly lower than 70% for the best class down to accuracies similar to those in the prior experiment.
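    The distance used here is the standard construction NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(.) is the length of the compressed input. A minimal sketch, assuming zlib as the compressor (the abstract does not mandate a specific one):

    ```python
    import zlib

    def c(data: bytes) -> int:
        # Compressed length; the choice of compressor is a parameter of NCD.
        return len(zlib.compress(data, 9))

    def ncd(x: bytes, y: bytes) -> float:
        # Normalised Compression Distance: near 0 for very similar inputs,
        # around 1 for unrelated ones.
        cx, cy = c(x), c(y)
        return (c(x + y) - min(cx, cy)) / max(cx, cy)

    print(ncd(b"A" * 512, b"A" * 512))             # close to 0
    print(ncd(b"A" * 512, bytes(range(256)) * 2))  # closer to 1
    ```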

  • 297. Axelsson, Stefan
    Using Normalized Compression Distance for Classifying File Fragments2010Conference paper (Refereed)
    Abstract [en]

    We have applied the generalised and universal distance measure NCD (Normalised Compression Distance) to the problem of determining the types of file fragments by example. A corpus of files that can be redistributed to other researchers in the field was developed, and the NCD algorithm, using k-nearest-neighbour as the classification algorithm, was applied to a random selection of file fragments. The experiment covered circa 2000 fragments from 17 different file types. While the overall accuracy of the n-valued classification only improved the prior probability of the class from approximately 6% to circa 50% overall, the classifier reached accuracies of 85%-100% for the most successful file types.
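    The classification step named in this and the preceding entry is plain k-nearest-neighbour over NCD distances. A hedged sketch of that step, with illustrative names and a zlib-based NCD (the training set would hold labelled fragments from the corpus):

    ```python
    import zlib
    from collections import Counter

    def ncd(x: bytes, y: bytes) -> float:
        # Normalised Compression Distance with zlib as the compressor.
        cx, cy = len(zlib.compress(x, 9)), len(zlib.compress(y, 9))
        return (len(zlib.compress(x + y, 9)) - min(cx, cy)) / max(cx, cy)

    def knn_file_type(fragment: bytes, training: list[tuple[bytes, str]], k: int = 10) -> str:
        # Rank labelled training fragments by NCD to the unknown fragment
        # and take a majority vote among the k nearest.
        nearest = sorted(training, key=lambda item: ncd(fragment, item[0]))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]
    ```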

  • 298. Axelsson, Stefan
    et al.
    Baca, Dejan
    Feldt, Robert
    Sidlauskas, Darius
    Kacan, Denis
    Detecting Defects with an Interactive Code Review Tool Based on Visualisation and Machine Learning2009Conference paper (Refereed)
    Abstract [en]

    Code review is often suggested as a means of improving code quality. Since humans are poor at repetitive tasks, some form of tool support is valuable. To that end, we developed a prototype tool to illustrate the novel idea of applying machine learning (based on Normalised Compression Distance) to the problem of static analysis of source code. Since this tool learns by example, it is trivially programmer-adaptable. As machine learning algorithms are notoriously difficult to understand operationally (they are opaque), we applied information visualisation to the results of the learner. In order to validate the approach, we applied the prototype to source code from the open-source project Samba and from an industrial telecom software system. Our results showed that the tool did indeed correctly find and classify problematic sections of code based on training examples.

  • 299.
    Axelsson, Stefan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Bajwa, Kamran Ali
    Srikanth, Mandhapati Venkata
    Blekinge Institute of Technology, School of Computing.
    File Fragment Analysis Using Normalized Compression Distance2013Conference paper (Refereed)
    Abstract [en]

    The first step when recovering deleted files using file carving is to identify the file type of a block, also called file fragment analysis. Several researchers have demonstrated the applicability of Kolmogorov complexity methods, such as the normalized compression distance (NCD), to this problem. NCD methods compare the results of compressing a pair of data blocks with the compressed concatenation of the pair. One required parameter is the compression algorithm to be used. Prior research has identified the NCD compressor properties that yield good performance. However, no studies have focused on the applicability of these properties to file fragment analysis. This paper describes the results of experiments on a large corpus of files and file types with different block lengths. The experimental results demonstrate that, in the case of file fragment analysis, compressors with the desired properties do not perform statistically better than compressors with less computational complexity.
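    The compressor can be treated as a plug-in parameter of the NCD computation; the following hedged sketch (the compressor set and block are illustrative, not the paper's exact setup) shows how compressors of differing computational complexity slot into the same formula.

    ```python
    import bz2
    import lzma
    import zlib

    # Compressed-length functions of increasing computational cost.
    COMPRESSORS = {
        "zlib": lambda d: len(zlib.compress(d, 9)),
        "bz2":  lambda d: len(bz2.compress(d, 9)),
        "lzma": lambda d: len(lzma.compress(d)),
    }

    def ncd_with(c, x: bytes, y: bytes) -> float:
        # NCD where c() returns the compressed length under a chosen compressor.
        cx, cy = c(x), c(y)
        return (c(x + y) - min(cx, cy)) / max(cx, cy)

    block = bytes(range(256)) * 2  # a 512-byte block, as in typical file carving
    for name, c in COMPRESSORS.items():
        print(name, ncd_with(c, block, block))
    ```

    The paper's finding, restated, is that the computationally heavier choices bring no statistically significant accuracy gain for short fragments.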

  • 300.
    Ayalew, Tigist
    et al.
    Blekinge Institute of Technology, School of Computing.
    Kidane, Tigist
    Blekinge Institute of Technology, School of Computing.
    Identification and Evaluation of Security Activities in Agile Projects: A Systematic Literature Review and Survey Study2012Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Context: Today’s software development industry requires high-speed software delivery from the development team. In order to do this, organizations transition from conventional software development methods to agile development methods while preserving customer satisfaction. Even though this approach is becoming a popular development method, it has some disadvantages from a security point of view, because the method imposes several constraints, such as the lack of a complete overview of the product, a higher development pace and a lack of documentation. Although a security engineering (SE) process is necessary in order to build secure software, no SE process has been developed specifically for the agile model. As a result, SE processes that are commonly used in the waterfall model are being used in agile models. However, there is a clash or disparity between the established waterfall SE processes and the ideas and methodologies proposed by the agile manifesto. This means that, while agile models work with short development increments that adapt easily to change, the existing SE processes work in a plan-driven development setting and try to reduce defects in a program before threats occur, through heavy and inflexible processes. This study aims at bridging the gap between agile models and security by providing an insightful understanding of the SE processes that are used in the current agile industry. Objectives: The objectives of this thesis are to identify and evaluate security activities from high-profile waterfall SE processes that are used in the current agile industry, and then to suggest the most compatible and beneficial security activities for the agile model based on the study results. Methods: The study involved two approaches: a systematic literature review and a survey. The systematic literature review has two main aims: the first is to gain a comprehensive understanding of security in an agile process model; the second is to identify high-profile SE processes that are commonly used in the waterfall model. Moreover, it helped to compare the thesis results with previous work in the area. A survey was conducted to identify and evaluate waterfall security activities that are used in current agile industry projects. The evaluation criteria were based on the integration cost of each security activity and the benefit it provides to agile projects. Results: The results of the systematic review are organized in tabular form for clear understanding and easy analysis. High-profile SE processes and their activities were obtained, and these results were used as input for the survey study. From the survey study, security activities that are used in the current agile industry were identified. Furthermore, the identified security activities were evaluated in terms of benefit and cost, and the security activities that are most compatible and beneficial for the agile process model were investigated. Conclusions: To develop secure software in the agile model, there is a need for an SE process or practice that can address security issues in every phase of the agile project lifecycle. This can be done either by integrating the most compatible and beneficial security activities from waterfall SE processes into the agile process, or by creating a new SE process. In this thesis it has been found that, of the investigated high-profile waterfall SE processes, none was fully compatible and beneficial for agile projects.
