  • 151.
    Arredal, Martin
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Eye Tracking’s Impact on Player Performance and Experience in a 2D Space Shooter Video Game. 2018. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Background. Although a growing market, most of the commercially available games today that feature eye tracking support are rendered in a 3D perspective. Games rendered in 2D have seen little support for eye trackers from developers. By comparing the differences in player performance and experience between an eye tracker and a computer mouse when playing a classic 2D genre, the space shooter, this thesis aims to make an argument for the implementation of eye tracking in 2D video games.

    Objectives. Create a 2D space shooter video game where movement is handled through a keyboard while the input method for aiming alternates between a computer mouse and an eye tracker.

    Methods. Using a Tobii EyeX eye tracker, an experiment was conducted with fifteen participants. To measure their performance, three variables were used: accuracy, completion time and collisions. The participants played two modes of a 2D space shooter video game in a controlled environment. Depending on which mode was played, the input method for aiming was either an eye tracker or a computer mouse. Movement was handled using a keyboard in both modes. When the modes had been completed, a questionnaire was presented in which the participants rated their experience of playing the game with each input method.

    Results. The computer mouse performed better on two of the three performance variables. On average the computer mouse gave better accuracy and completion time but more collisions. However, the data gathered from the questionnaire show that the participants on average had a better experience when playing with the eye tracker.

    Conclusions. The results from the experiment show better performance for participants using the computer mouse, but participants felt more immersed with the eye tracker and gave it a better score in all experience categories. With these results, this study hopes to encourage developers to implement eye tracking as an interaction method for 2D video games. However, future work is necessary to determine whether the experience and performance increase or decrease as playtime gets longer.

  • 152.
    Aruchamy, Logabharathi
    Blekinge Institute of Technology, School of Computing.
    Analysis of Radio Access Network Buffer Filling Based on Real Network Data. 2012. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    The 3G and 4G networks have drastically improved availability and quality in data transmission for bandwidth-hungry services such as video streaming and location-based services. As 3G networks are very widely deployed, there are increased capacity requirements and transport channel allocations to simultaneous users within a particular cell. For this reason, adequate resources are not always available, which in turn degrades both service quality and user-experienced quality. This research aims at understanding the characteristics of buffer filling during dedicated channel (DCH) transmission under fixed bit-rate assumptions on a per-user level, taking different services into consideration. Furthermore, the resource utilisation in terms of empty-buffer durations and user throughput achieved during dedicated channel transmission is also analysed for different data services existing in the mobile networks. The traces are collected from a real network, and the characteristics of the traffic are analysed prior to understanding its buffer filling in the Radio Network Controller (RNC) during downlink data transmission. Furthermore, the buffer is modelled under a series of assumptions on channel bit-rates, and simulations are performed for a single-user scenario, for different services, with the obtained traces as input to the buffer. This research is helpful in understanding the RNC buffer filling for different services, in turn yielding insight into the existing transport channel switching scenario. By analysing the buffer filling for different services and the transport channel utilisation, we learn that most of the data services show low DCH utilisation of approximately 20% and are found to have an empty buffer for 80% of the total DCH session duration, causing sub-optimal radio resource utilisation.
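
    The simulation approach described above (a per-user RNC buffer drained at an assumed fixed DCH bit-rate) can be illustrated with a minimal Python sketch. It is not from the thesis; the arrival trace, bit-rate and time step are invented for illustration only.

    def simulate_buffer(arrivals_bits, dch_rate_bps, step_s=0.1):
        """Single-user downlink buffer: fill with arriving bits, drain at the DCH rate."""
        buffer_bits = 0.0
        occupancy, empty_steps = [], 0
        for arrived in arrivals_bits:                      # bits arriving in this time step
            buffer_bits += arrived
            buffer_bits = max(0.0, buffer_bits - dch_rate_bps * step_s)
            occupancy.append(buffer_bits)
            empty_steps += buffer_bits == 0.0
        return occupancy, empty_steps / len(arrivals_bits)

    # Toy bursty trace (hypothetical numbers), drained over a 384 kbit/s DCH.
    trace = [0, 0, 120_000, 300_000, 0, 0, 0, 0, 50_000, 0] * 10
    occupancy, empty_fraction = simulate_buffer(trace, dch_rate_bps=384_000)
    print(f"empty-buffer fraction: {empty_fraction:.0%}")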

  • 153. Arvidsson, Åke
    et al.
    Hederstierna, Anders
    Hellmer, Stefan
    Simple and Accurate Forecasting of the Market for Cellular Mobile Services. 2007. In: Managing Traffic Performance in Converged Networks, Berlin: Springer, 2007. Chapter in book (Refereed).
    Abstract [en]

    We consider the problems of explaining and forecasting the penetration and the traffic in cellular mobile networks. To this end, we create two regression models, viz. one to predict the penetration from service charges and network effects and another one to predict the traffic from service charges and diffusion and adoption effects. The results of the models can also be combined to compute the likely evolutions of essential characteristics such as Minutes of Use (MoU), Average Revenue per User (ARPU) and total revenue. Applying the models to 26 markets throughout the world we show that they perform very well. Noting the significant qualitative differences between these markets, we conclude that the model has some universality in that the results are comparable for all of them.
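
    As a rough illustration of the two-regression idea described above (not the chapter's actual model or data), one could fit a penetration model on service charge plus a lagged network-effect term, and a traffic model on charge plus penetration; all column names and figures below are invented.

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    df = pd.DataFrame({
        "charge":      [1.0, 0.9, 0.8, 0.7, 0.6, 0.5],      # normalised service charge
        "penetration": [0.10, 0.18, 0.30, 0.45, 0.62, 0.75],
        "minutes":     [40, 55, 75, 95, 120, 150],           # minutes of use per subscriber
    })
    df["network_effect"] = df["penetration"].shift(1).fillna(0.0)  # lagged penetration

    penetration_model = LinearRegression().fit(df[["charge", "network_effect"]], df["penetration"])
    traffic_model = LinearRegression().fit(df[["charge", "penetration"]], df["minutes"])
    print(penetration_model.coef_, traffic_model.coef_)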

  • 154.
    Aryal, Dhiraj
    et al.
    Blekinge Institute of Technology, School of Computing.
    Shakya, Anup
    Blekinge Institute of Technology, School of Computing.
    A Taxonomy of SQL Injection Defense Techniques. 2011. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Context: SQL injection attacks (SQLIA) pose a serious threat to web applications by allowing attackers to gain unhindered access to the underlying databases containing potentially sensitive information. A lot of methods and techniques have been proposed by different researchers and practitioners to mitigate the SQL injection problem. However, deploying those methods and techniques without a clear understanding can induce a false sense of security. Classification of such techniques would provide great assistance in getting rid of such a false sense of security. Objectives: This paper is focused on classifying such techniques by building a taxonomy of SQL injection defense techniques. Methods: A systematic literature review (SLR) is conducted using five reputed and familiar e-databases: IEEE, ACM, Engineering Village (Inspec/Compendex), ISI Web of Science and Scopus. Results: 61 defense techniques are found and, based on these techniques, a taxonomy of SQL injection defense techniques is built. Our taxonomy consists of various dimensions which can be grouped under two higher-order terms: detection method and evaluation criteria. Conclusion: The taxonomy provides a basis for comparison among different defense techniques. Organizations can use our taxonomy to choose suitable ones depending on their available resources and environments. Moreover, this classification can lead towards a number of future research directions in the field of SQL injection.
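
    For context, one widely cited defense technique that such taxonomies cover is the parameterized query. The following self-contained Python/sqlite3 example is added here only as an illustration and is not taken from the thesis.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

    user_input = "alice' OR '1'='1"   # classic injection payload
    # Vulnerable: string concatenation lets the payload rewrite the query.
    vulnerable_query = f"SELECT * FROM users WHERE name = '{user_input}'"
    print(conn.execute(vulnerable_query).fetchall())   # [('alice', 'secret')] -- injection succeeds
    # Safe: a bound parameter is treated purely as data, never as SQL.
    rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
    print(rows)                                        # [] -- the payload matches no user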

  • 155.
    Asghar, Gulfam
    et al.
    Blekinge Institute of Technology, School of Computing.
    Azmi, Qanit Jawed
    Blekinge Institute of Technology, School of Computing.
    Security Issues of SIP. 2010. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Voice over IP (VoIP) services based on the Session Initiation Protocol (SIP) have gained much attention compared to other protocols like H.323 or MGCP over the last decade. SIP is the most favoured signaling protocol for current and future IP telephony services, and it is also becoming a real competitor to traditional telephony services. However, the open architecture of SIP leaves the provided services vulnerable to different types of security threats, similar in nature to those currently existing on the Internet. For this reason, there is an obvious need to provide some kind of security mechanisms for SIP-based VoIP implementations. In this research, we discuss the security threats to SIP and highlight the related open issues. Although there are many threats to SIP security, we focus mainly on session hijacking and DoS attacks. We demonstrate these types of attacks by introducing a model/practical test environment. We also analyze the effect and performance of some of the proposed solutions, that is, the use of Network Address Translation (NAT), IPSec, Virtual Private Networks (VPNs) and Firewalls (IDS/IPS), with the help of a test scenario.

  • 156.
    Ashfaq, Rana Aamir Raza
    et al.
    Blekinge Institute of Technology, School of Computing.
    Khan, Mohammad Qasim
    Blekinge Institute of Technology, School of Computing.
    Analyzing Common Criteria Shortcomings to Improve its Efficacy. 2009. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Information security has become a key concern for organizations conducting business in the current electronic era. Rapid technological development continuously creates novel security threats, making IT an uncertain infrastructure, so security is an important factor for vendors as well as for consumers. To fulfill security needs, IT companies have to adopt standards that assure certain security levels in their products. The Common Criteria (CC) is one of the standards for maintaining and controlling the security of IT products. Many other standards are also available to assure security in products, and like them, CC has its own pros and cons. It does not impose predefined security rules that a product should exhibit, but rather provides a language for security evaluation. CC has certain advantages due to its ability to address three dimensions: a) it provides an opportunity for users to specify their security requirements, b) it serves as an implementation guide for developers, and c) it provides comprehensive criteria to evaluate the security requirements. On the downside, it requires a considerable amount of resources and is quite time consuming. Another drawback is that the security requirements it evaluates must be defined before the project starts, which is in direct conflict with the rapidly changing security threat environment. In this research thesis we analyze the core issues and find the major causes of the criticism. Many IT users in the USA and UK have reservations about CC evaluation because of its limitations. We analyze and document the CC shortcomings, which will be useful for researchers wanting an overview of the shortcomings associated with CC. This study can potentially strengthen CC usage with a more effective and responsive evaluation methodology for the IT community.

  • 157.
    Ashraf, Imran
    et al.
    Blekinge Institute of Technology, School of Computing.
    Khokhar, Amir Shahzed
    Blekinge Institute of Technology, School of Computing.
    Principles for Distributed Databases in Telecom Environment. 2010. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Centralized databases are becoming a bottleneck for organizations that are physically distributed and access data remotely. Data management is easy in centralized databases, but it carries a high communication cost and, most importantly, a high response time. The concept of distributing data over various locations is very attractive for such organizations. In such cases the database is split into fragments that are distributed to the locations where they are needed. This kind of distribution provides local control of data, and data access is also very fast in such databases. However, concurrency control, query optimization and data allocation are factors that affect the response time and must be investigated prior to implementing distributed databases. This thesis uses a mixed-method approach to meet its objective. In the quantitative part, we performed an experiment to compare the response time of two databases: centralized and fragmented/distributed. The experiment was performed at Ericsson. A literature review was also conducted to find other important response-time-related issues such as query optimization, concurrency control and data allocation. The literature review revealed that these factors can further improve the response time in a distributed environment. The results of the experiment showed a substantial decrease in response time due to fragmentation and distribution.

  • 158.
    Asim, Muhammad Ahsan
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Network Testing in a Testbed Simulator using Combinatorial Structures. 2008. Independent thesis Advanced level (degree of Master (One Year)). Student thesis.
    Abstract [en]

    This report covers one of the most demanding issues for network users, i.e. network testing. Network testing in this study concerns performance evaluation of networks by gradually applying traffic load to determine the queuing delay for different traffic types. Testing of such operations is becoming complex and necessary due to the use of real-time applications such as voice and video traffic, in parallel with the elastic data of ordinary applications, over WAN links. Large volumes of elastic data occupy almost 80% of the resources and cause delay for time-sensitive traffic. Performance parameters like service outage, delay, packet loss and jitter are tested to assure the reliability of the Quality of Service (QoS) committed in Service Level Agreements (SLAs). Normally these network services are tested after deployment of the physical networks. In this case customers often have to experience unavailability (outages) of network services due to increased levels of load and stress. From a user-centric point of view these outages are violations and must be avoided at the net-centric end. In order to meet these challenges, network SLAs are tested on simulators in a lab environment. This study provides a solution to this problem in the form of a testbed simulator named the Combinatorial TestBed Simulator (CTBS). A prototype of this simulator was developed for conducting the experiment. It provides a systematic approach of combinatorial structures for finding traffic patterns that exceed the queuing delay limit committed in SLAs. Combinatorics is a branch of mathematics that deals with discrete and normally finite elements. In the design of CTBS, combinatorial techniques are used to generate a variety of test data that cannot be generated manually for testing the given network scenario. To validate the design of CTBS, results obtained from pilot runs are compared with results calculated using a timeline. After validation of the CTBS design, the actual experiment is conducted to determine the set of traffic patterns that exceeds the threshold value of queuing delay for Voice over Internet Protocol (VoIP) traffic.
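
    The combinatorial generation of traffic patterns described above can be sketched in a few lines of Python. The sketch below is illustrative only; the load dimensions, the delay model and the SLA threshold are assumptions, not values from the thesis.

    from itertools import product

    voice_calls   = [0, 10, 20]        # concurrent VoIP flows (hypothetical)
    video_streams = [0, 2, 4]          # concurrent video flows (hypothetical)
    data_load     = [0.2, 0.5, 0.8]    # elastic-data utilisation of the WAN link

    def estimated_delay_ms(voice, video, data):
        # Placeholder delay model; a real testbed would measure this per pattern.
        return 5 + 2.0 * voice + 8.0 * video + 40.0 * data

    SLA_LIMIT_MS = 60
    violating = [pattern for pattern in product(voice_calls, video_streams, data_load)
                 if estimated_delay_ms(*pattern) > SLA_LIMIT_MS]
    print(f"{len(violating)} of {3**3} generated patterns exceed the SLA delay limit")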

  • 159.
    Askwall, Niklas
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    Utvärderingsmetod Säkerhetskultur: Ett första steg i en valideringsprocess. 2013. Independent thesis Basic level (degree of Bachelor). Student thesis.
    Abstract [sv]

    Companies today invest a great deal of money in securing their physical and logical assets using technical protection mechanisms. However, all security ultimately depends in some way on the judgment and knowledge of the individual. How can it be determined that the organization can trust the individual's judgment and knowledge? How can it be determined whether an organization has a good security culture? By evaluating the security culture, organizations can obtain a broader basis for their risk management work and a better ability to handle whatever threatens the assets of the business. The research that exists today in the field of security culture disagrees both on what constitutes a good security culture and, above all, on how the culture should be evaluated. This research effort is thus an attempt to develop an intuitive evaluation method that organizations can use to evaluate their security culture. The evaluation method resembles a gap analysis in which an organization's desired culture is established and data are collected through a questionnaire survey. The collected data are compiled and used to create an index for the prevailing culture in comparison with the desired culture. In this initial attempt, the reliability of the survey is tested with Cronbach's alpha and the validity is tested through a form of confirmatory factor analysis. The results show how an index representing an organization's security culture is created. Good reliability of the evaluation method can be demonstrated, and the author finds good arguments for the usefulness of such a method in proactive security work. However, circumstances have made it very difficult to demonstrate good validity in this initial study.
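
    Since the abstract mentions testing reliability with Cronbach's alpha, a minimal illustration of that statistic is added below; the Likert-scale scores are made up and are not the thesis data.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x questions matrix of Likert scores."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")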

  • 160.
    Asl, Babak Ghafary
    Blekinge Institute of Technology, School of Engineering.
    A Computer Aided Detection System for Cerebral Microbleeds in Brain MRI. 2012. Independent thesis Advanced level (degree of Master (One Year)). Student thesis.
    Abstract [en]

    Advances in MR technology have improved the potential for visualization of small lesions in brain images. This has resulted in the opportunity to detect cerebral microbleeds (CMBs), small hemorrhages in the brain that are known to be associated with risk of ischemic stroke and intracerebral bleeding. Currently, no computerized method is available for fully or semi-automated detection of CMBs. In this paper, we propose a CAD system for the detection of CMBs to speed up visual analysis in population-based studies. Our method consists of four steps: (i) skull-stripping, (ii) initial candidate selection, (iii) reduction of false positives using a two-layer classification, and (iv) determining the anatomical location of CMBs. The training and test sets consist of 156 subjects (448 CMBs) and 81 subjects (183 CMBs), respectively. Geometrical, intensity-based and local image descriptor features were used in the classification steps. The sensitivity for CMB detection was 90%, with, on average, 4 false positives per subject.

  • 161. Aspvall, Bengt
    et al.
    Pettersson, Eva
    Från datorernas värld. 2007. In: Nämnaren, ISSN 0348-2723, Vol. 34, no 2, p. 44-48. Article in journal (Refereed).
  • 162. Astor, Philipp
    et al.
    Adam, Marc
    Jerčić, Petar
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Schaaff, Kristina
    Weinhardt, Christof
    Integrating biosignals into information systems: A NeuroIS tool for improving emotion regulation. 2013. In: Journal of Management Information Systems, ISSN 0742-1222, E-ISSN 1557-928X, Vol. 30, no 3, p. 247-277. Article in journal (Refereed).
    Abstract [en]

    Traders and investors are aware that emotional processes can have material consequences on their financial decision performance. However, typical learning approaches for debiasing fail to overcome emotionally driven financial dispositions, mostly because of subjects' limited capacity for self-monitoring. Our research aims at improving decision makers' performance by (1) boosting their awareness of their emotional state and (2) improving their skills for effective emotion regulation. To that end, we designed and implemented a serious game-based NeuroIS tool that continuously displays the player's individual emotional state, via biofeedback, and adapts the difficulty of the decision environment to this emotional state. The design artifact was then evaluated in two laboratory experiments. Taken together, our study demonstrates how information systems design science research can contribute to improving financial decision making by integrating physiological data into information technology artifacts. Moreover, we provide specific design guidelines for how biofeedback can be integrated into information systems.

  • 163.
    Ataeian, Seyed Mohsen
    et al.
    Blekinge Institute of Technology, School of Computing.
    Darbandi, Mehrnaz Jaberi
    Blekinge Institute of Technology, School of Computing.
    Analysis of Quality of Experience by applying Fuzzy logic: A study on response time. 2011. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    To be successful in today's competitive market, service providers should regard user satisfaction as a critical key. In order to gain a better understanding of customers' expectations, a proper evaluation which considers intrinsic characteristics of perceived quality of service is needed. Due to the subjective nature of quality, the vagueness of human judgment and the uncertainty about the degree of users' linguistic satisfaction, fuzziness is associated with quality of experience. Considering the capability of fuzzy logic in dealing with imprecision and qualitative knowledge, it is natural to apply it as a powerful mathematical tool for analyzing the quality of experience (QoE). This thesis proposes a fuzzy procedure to evaluate the quality of experience. In our proposed methodology, we provide a fuzzy relationship between QoE and Quality of Service (QoS) parameters. To identify this fuzzy relationship, a new term called the Fuzzified Opinion Score (FOS), representing a fuzzy quality scale, is introduced. A fuzzy data mining method is applied to construct the required number of fuzzy sets. Then, the appropriate membership functions describing the fuzzy sets are modeled and compared with each other. The proposed methodology will assist service providers in better decision-making and resource management.
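
    A minimal sketch of the kind of fuzzification the abstract describes (mapping a QoS parameter such as response time onto a fuzzy opinion score) is given below; the set boundaries, scores and defuzzification are illustrative assumptions, not the thesis' FOS construction.

    def triangular(x, a, b, c):
        """Membership of x in a triangular fuzzy set with corners a <= b <= c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify_response_time(rt_ms):
        # Set boundaries (in ms) are assumptions for illustration only.
        return {
            "fast":   triangular(rt_ms, -1, 0, 300),
            "medium": triangular(rt_ms, 150, 400, 700),
            "slow":   triangular(rt_ms, 500, 1000, 10_000),
        }

    def opinion_score(rt_ms):
        m = fuzzify_response_time(rt_ms)
        # Weighted-average defuzzification against nominal scores 5 / 3 / 1.
        return (5 * m["fast"] + 3 * m["medium"] + 1 * m["slow"]) / (sum(m.values()) or 1)

    print(round(opinion_score(200), 2))   # 4.25: mostly "fast", partly "medium"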

  • 164.
    Atilmis, Birkan
    et al.
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Hoff, Linda
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    IPv6: Inte längre frågan OM och inte så mycket NÄR utan snarare HUR! 2001. Independent thesis Basic level (degree of Bachelor). Student thesis.
    Abstract [sv]

    Problem area: Today the Internet has become everyone's property. Unfortunately, this brings quite a few problems. The most obvious one we face today is that the IP addresses are running out. To get rid of the problem, various temporary solutions ("patching techniques") are used, but a permanent solution is also being developed, namely IPv6 (Internet Protocol version 6). The new protocol solves the address shortage but also has many other features, such as security and better routing solutions. We therefore asked ourselves why no transition has taken place despite these advantages. Research questions: Where in the transition between IPv4 and IPv6 do we stand today? Why has the transition between IPv4 and IPv6 not already happened? What are the biggest reasons, and what other possible reasons are there? Conclusion: The work shows that the time for a transition has not yet come. The main reasons are the lack of products and the lack of a general incentive for migration.

  • 165.
    Avutu, Neeraj
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Performance Evaluation of MongoDB on Amazon Web Service and OpenStack. 2018. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Context

    MongoDB is an open-source, scalable, NoSQL database that distributes data over many commodity servers. It provides no single point of failure by copying and storing the data in different locations. MongoDB uses a master-slave design rather than the ring topology used by Cassandra. Virtualization is the technique used for running multiple virtual machines on a single host and utilizing those virtual machines. It is the fundamental technology that allows cloud computing to provide resource sharing among users.

    Objectives

    Studying and identifying MongoDB and virtualization on AWS and OpenStack. Experiments were conducted to identify the CPU utilization associated with MongoDB instances deployed on AWS and on a physical server arrangement, and to understand the effect of replication in the MongoDB instances on throughput, CPU utilization and latency.

    Methods

    Initially, a literature review was conducted to design the experiment around the mentioned problems. A three-node MongoDB cluster was run on Amazon EC2 and OpenStack Nova with Ubuntu 16.04 LTS as the operating system. Latency, throughput and CPU utilization were measured using this setup. This procedure was repeated for a five-node MongoDB cluster and a three-node production cluster with the six YCSB workload types.
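
    For context, a minimal client-side latency measurement against a MongoDB endpoint might look like the sketch below (using pymongo). This is not the thesis setup: the actual experiments used YCSB workloads, and the connection URI, document shape and operation counts here are placeholders.

    import time
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # hypothetical endpoint
    coll = client.bench.usertable
    coll.drop()                                         # start from an empty collection

    def timed(op):
        t0 = time.perf_counter()
        op()
        return time.perf_counter() - t0

    writes = [timed(lambda i=i: coll.insert_one({"_id": i, "field0": "x" * 100}))
              for i in range(1000)]
    reads = [timed(lambda i=i: coll.find_one({"_id": i})) for i in range(1000)]

    print(f"avg write {sum(writes) / len(writes) * 1e3:.2f} ms, "
          f"avg read {sum(reads) / len(reads) * 1e3:.2f} ms")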

    Results

    Virtualization overhead has been identified in terms of CPU utilization, and the effects of virtualization on MongoDB are reported in terms of CPU utilization, latency and throughput.

    Conclusions

    It is concluded that latency decreases and throughput increases as the number of nodes increases. Due to replication, an increase in latency was observed.

  • 166.
    Awan, Zafar Iqbal
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Azim, Abdul
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Network Emulation, Pattern Based Traffic Shaping and KauNET Evaluation. 2008. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Quality of Service is a major factor for a successful business in modern and future network services. A minimum level of service is assured, including Quality of Experience for modern real-time communication, to establish user satisfaction with the perceived service quality. Traffic engineering can be applied to provide better services, maintaining or enhancing user satisfaction through reactive and preventive traffic control mechanisms. Preventive traffic control can be more effective in managing network resources through admission control, scheduling, policing and traffic shaping mechanisms, maintaining a minimum service level before it gets worse and affects user perception. Accuracy, dynamicity, uniformity and reproducibility are objectives of extensive research in network traffic. Real-time tests, simulation and network emulation are applied to test uniformity, accuracy, reproducibility and dynamicity. Network emulation is performed over an experimental network to test real-time applications, protocols and traffic parameters. DummyNet is a network emulator and traffic shaper which allows non-deterministic placement of packet losses, delays and bandwidth changes. The KauNet shaper is a network emulator which creates traffic patterns and applies these patterns for exact deterministic placement of bit errors, packet losses, delay changes and bandwidth changes. An evaluation of KauNet with different patterns for packet losses, delay changes and bandwidth changes in an emulated environment is part of this work. The main motivation for this work is to check the possibility of delaying and dropping the packets of a transfer/session in the same way as happened before (during the observation period). This goal is achieved to some extent using KauNet, but some issues with pattern repetition still need to be solved to get better results. The idea of history- and trace-based traffic shaping using KauNet is presented to make this possibility a reality.

  • 167.
    Axelsson, Arvid
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Light Field Coding Using Panoramic Projection. 2014. Student thesis.
    Abstract [en]

    A new generation of 3D displays provides depth perception without the need for glasses and allows the viewer to see content from many different directions. Providing video for these displays requires capturing the scene by several cameras at different viewpoints, the data from which together form light field video. Encoding such video with existing video coding requires a large amount of data, and it increases quickly with a higher number of views, which this application needs. One such coding is the multiview extension of High Efficiency Video Coding (MV-HEVC), which encodes a number of similar video streams as different layers. A new coding scheme for light field video, called Panoramic Light Field (PLF), is implemented and evaluated in this thesis. The main idea behind the coding is to project all points in a scene that are visible from any of the viewpoints onto a single, global view, similar to how texture mapping maps a texture onto a 3D model in computer graphics. Whereas objects ordinarily shift position in the frame as the camera position changes, this is not the case when using this projection. A visible point in space is projected to the same image pixel regardless of viewpoint, resulting in large similarities between images from different viewpoints. The similarity between the layers in light field video helps to achieve more efficient compression when the projection is combined with existing multiview coding. In order to evaluate the scheme, 3D content was created and software was developed to encode it using PLF. Video using this coding is compared to existing technology: a straightforward encoding of the views using MV-HEVC. The results show that the PLF coding performs better on the sample content at lower quality levels, while it is worse at higher bitrates due to quality loss from the projection procedure. It is concluded that PLF is a promising technology, and suggestions are given for future research that may improve its performance further.

  • 168.
    Axelsson, Elinor
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Ip-telefoni med Skype som ett alternativ till PSTN för privatanvändare. 2007. Independent thesis Basic level (degree of Bachelor). Student thesis.
    Abstract [en]

    This work is a practical and theoretical test of IP telephony with Skype, which forms the basis for a comparison with telephony over PSTN (Public Switched Telephone Network), the common telephone standard most of us use today. The purpose of the work is to facilitate the choice between PSTN and IP telephony for private users in Sweden. The work is intended to answer the following questions: How easy is it to get started with IP telephony via Skype? How is the quality of IP telephony calls compared to PSTN? Do all the services available with PSTN also work with IP telephony? How are the usability and the availability of help and support with IP telephony? Is it cheaper to call with IP telephony, and if so, under what conditions? In this work a set of practical and theoretical investigations was carried out in order to assess IP telephony with Skype in the following assessment areas: installation, functionality, quality, usability, costs, availability and security. For the usability investigation, a test group of 10 people was used to evaluate the usability of the system. A practical test of the Skype client's functionality and quality was performed through a number of test calls. The availability, costs and security of the Skype solution were studied in relevant literature and through facts on the Internet. The results of the investigation show that the Skype solution works as well as PSTN with respect to functionality and quality, but a certain level of computer experience is required to install and use the solution, which has a somewhat negative impact on usability. Price-wise it only pays off for those who make many international calls; for other users it is usually considerably more expensive than telephony with PSTN. Skype itself clearly states that it does not guarantee the functionality of emergency calls, which is a major drawback if one wants to replace one's PSTN telephone with the Skype solution. Because of the above arguments, IP telephony with Skype is, for most users, not a good alternative to PSTN.

  • 169.
    Axelsson, Mattis
    et al.
    Blekinge Institute of Technology, School of Planning and Media Design.
    Larsson, Sara
    Blekinge Institute of Technology, School of Planning and Media Design.
    Utvecklande AI: En studie i hur man skapar ett system för lärande AI. 2013. Independent thesis Basic level (degree of Bachelor). Student thesis.
    Abstract [en]

    AI is something that has become more important in today’s games and is under increasing pressure to act human-like and intelligent. This thesis examines which methods are preferred when creating an AI that can learn from its previous experiences. Some of the methods that are examined are tree structures, Artificial Neural Networks and GoCap. By creating an application with one of the methods and conducting a survey of how the AI in the application was perceived, we obtained results showing whether the method was functional. From this we discuss whether the other methods would have been more effective, how we could have improved the AI and what the future holds for game AI.

  • 170. Axelsson, Stefan
    The Normalised Compression Distance as a File Fragment Classifier. 2010. In: Digital Investigation. The International Journal of Digital Forensics and Incident Response, ISSN 1742-2876, E-ISSN 1873-202X, Vol. 7, no Suppl 1, p. S24-S31. Article in journal (Refereed).
    Abstract [en]

    We have applied the generalised and universal distance measure NCD—Normalised Compression Distance—to the problem of determining the type of file fragments. To enable later comparison of the results, the algorithm was applied to fragments of a publicly available corpus of files. The NCD algorithm, in conjunction with the k-nearest-neighbour algorithm (k ranging from one to ten) as the classification algorithm, was applied to a random selection of circa 3000 512-byte file fragments from 28 different file types. This procedure was then repeated ten times. While the overall accuracy of the n-valued classification only improved the prior probability from approximately 3.5% to circa 32%–36%, the classifier reached accuracies of circa 70% for the most successful file types. A prototype of a file fragment classifier was then developed and evaluated on a new set of data (from the same corpus). Circa 3000 fragments were selected at random and the experiment was repeated five times. This prototype classifier remained successful at classifying individual file types, with accuracies ranging from only slightly lower than 70% for the best class down to accuracies similar to those in the prior experiment.
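
    The NCD measure itself is compact: NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)) for a compressor C. The sketch below (not the paper's code) uses zlib as C and compares a query fragment against two toy reference fragments; a k-nearest-neighbour vote over such distances gives the kind of classifier described above.

    import os
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)) with C = zlib."""
        cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
        cxy = len(zlib.compress(x + y))
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Toy 512-byte stand-ins for "text-like" and "random-like" file types.
    text_like = (b"the quick brown fox jumps over the lazy dog " * 12)[:512]
    rand_like = os.urandom(512)
    query     = (b"the quick brown fox jumps over the lazy dog " * 12)[:512]

    # A 1-NN decision: the query is assigned the type with the smallest distance.
    print(ncd(query, text_like), ncd(query, rand_like))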

  • 171. Axelsson, Stefan
    et al.
    Baca, Dejan
    Feldt, Robert
    Sidlauskas, Darius
    Kacan, Denis
    Detecting Defects with an Interactive Code Review Tool Based on Visualisation and Machine Learning. 2009. Conference paper (Refereed).
    Abstract [en]

    Code review is often suggested as a means of improving code quality. Since humans are poor at repetitive tasks, some form of tool support is valuable. To that end we developed a prototype tool to illustrate the novel idea of applying machine learning (based on Normalised Compression Distance) to the problem of static analysis of source code. Since this tool learns by example, it is trivially programmer-adaptable. As machine learning algorithms are notoriously difficult to understand operationally (they are opaque), we applied information visualisation to the results of the learner. In order to validate the approach we applied the prototype to source code from the open-source project Samba and from an industrial telecom software system. Our results showed that the tool did indeed correctly find and classify problematic sections of code based on training examples.

  • 172.
    Ayalew, Tigist
    et al.
    Blekinge Institute of Technology, School of Computing.
    Kidane, Tigist
    Blekinge Institute of Technology, School of Computing.
    Identification and Evaluation of Security Activities in Agile Projects: A Systematic Literature Review and Survey Study. 2012. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Context: Today’s software development industry requires high-speed software delivery from the development team. In order to achieve this, organizations make the transformation from their conventional software development method to an agile development method while preserving customer satisfaction. Even though this approach is becoming a popular development method, from a security point of view it has some disadvantages, because the method imposes several constraints such as the lack of a complete overview of the product, a higher development pace and a lack of documentation. Although a security engineering (SE) process is necessary in order to build secure software, no SE process has been developed specifically for the agile model. As a result, SE processes that are commonly used in the waterfall model are being used in agile models. However, there is a clash or disparity between the established waterfall SE processes and the ideas and methodologies proposed by the agile manifesto. This means that, while agile models work with short development increments that adapt easily to change, the existing SE processes work in a plan-driven development setting and try to reduce defects found in a program before the occurrence of threats through heavy and inflexible processes. This study aims at bridging the gap between the agile model and security by providing an insightful understanding of the SE processes that are used in the current agile industry. Objectives: The objectives of this thesis are to identify and evaluate security activities from high-profile waterfall SE processes that are used in the current agile industry, and then to suggest the most compatible and beneficial security activities for the agile model based on the study results. Methods: The study involved two approaches: a systematic literature review and a survey. The systematic literature review has two main aims. The first aim is to gain a comprehensive understanding of security in an agile process model; the second is to identify high-profile SE processes that are commonly used in the waterfall model. Moreover, it helped to compare the thesis results with other previous work in the area. A survey was conducted to identify and evaluate waterfall security activities that are used in current agile industry projects. The evaluation criteria were based on the integration cost of a security activity and the benefit it provides to agile projects. Results: The results of the systematic review are organized in tabular form for clear understanding and easy analysis. High-profile SE processes and their activities are obtained. These results are used as input for the survey study. From the survey study, security activities that are used in the current agile industry are identified. Furthermore, the identified security activities are evaluated in terms of benefit and cost. As a result, the security activities that are most compatible and beneficial for the agile process model are identified. Conclusions: To develop secure software in the agile model, there is a need for an SE process or practice that can address security issues in every phase of the agile project lifecycle. This can be done either by integrating the most compatible and beneficial security activities from waterfall SE processes into the agile process or by creating a new SE process. In this thesis, it was found that, of the investigated high-profile waterfall SE processes, none was fully compatible and beneficial for agile projects.

  • 173. Ayani, Rassul
    et al.
    Ismailov, Yuri
    Liljenstam, Michael
    Popescu, Adrian
    Rajaei, Hassan
    Rönngren, Robert
    Modeling and Simulation of a High Speed LAN. 1995. In: Simulation (San Diego, Calif.), ISSN 0037-5497, E-ISSN 1741-3133, Vol. 64, no 1, p. 7-14. Article in journal (Refereed).
    Abstract [en]

    Simulation is a tool that can be used to assess functionality and performance of communication networks and protocols. However, efficient simulation of complex communication systems is not a trivial task. In this paper, we discuss modeling and simulation of bus-based communication networks and present the results of modeling and simulation of a multigigabit/s LAN. We used parallel simulation techniques to reduce the simulation time of the LAN and implemented both an optimistic and a conservative parallel simulation scheme. Our experimental results on a shared memory multiprocessor indicate that the conservative parallel simulation scheme is superior to the optimistic one for this specific application. The parallel simulator based on the conservative scheme demonstrates a linear speedup for large networks.

  • 174.
    Ayichiluhm, Theodros
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mohan, Vivek
    Blekinge Institute of Technology, School of Computing.
    IPv6 Monitoring and Flow Detection. 2013. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    IPv6 privacy extensions, implemented in major operating systems, hide the user’s identity by using a temporary, randomly generated IPv6 address rather than the former EUI-64 format, where the MAC address is part of the IPv6 address. This solution for privacy has created a problem for network administrators wanting to trace an IPv6 address back to a specific MAC address, since the temporary IP address once used by the node is removed from the interface after a period of time. An IPv6 Ethernet testbed was set up to investigate the IPv6 implementation dynamics of the Windows 7 and Ubuntu 10.04 operating systems. The testbed was extended to investigate the effects of temporary IPv6 addresses, due to IPv6 privacy extensions, on the ongoing sessions of different applications, including ping, File Transfer Protocol (FTP) and video streaming (HTTP and RTP). On the basis of the knowledge obtained from these investigations, this work proposes Internet Protocol version 6 Host Tracking (IPv6HoT), a web-based IPv6-to-MAC mapping solution. IPv6HoT uses the Simple Network Management Protocol (SNMP) to forward the IPv6 neighbor table from routers to Network Management Stations (NMS). This thesis work provides guidelines for configuring IPv6 privacy extensions in Ubuntu 10.04 and Windows 7; the implementation differences between these two operating systems are also presented. The results show that temporary IPv6 addressing has a definite effect on the ongoing sessions of video streaming and FTP applications. Applications running as a server on a temporary IPv6 address encountered more frequent session interruptions than applications running as a server on a public IPv6 address. When temporary IPv6 addresses were configured to host FTP and video streaming applications, their ongoing sessions were permanently interrupted. It was also observed that LFTP, an FTP client application, resumes an interrupted session.

  • 175.
    Ayoubi, Tarek
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Distributed Data Management Supporting Healthcare Workflow from Patients’ Point of View. 2007. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    A patient’s mobility throughout his or her lifetime leaves a trail of information scattered across laboratories, clinical institutes, primary care units and other hospitals. Hence, the medical history of a patient is valuable when the patient is referred to specialised healthcare units or undergoes home care/personal care in elderly-stage cases. Despite the rhetoric about patient-centred care, few attempts have been made to measure and improve in this arena. In this thesis, we describe and implement a high-level view of patient-centric information management, deploying at a preliminary stage the use of agent technologies and grid computing. We thus develop and propose an infrastructure that allows us to monitor and survey the patient from the doctor’s point of view, and investigate a Persona, on the patient’s side, that functions and collaborates among different medical information structures. The Persona attempts to interconnect all the major agents (human and software) and realize a distributed grid info-structure that directly affects the patient, thereby revealing an adequate and cost-effective solution for the most critical information needs. The results of the literature survey, consolidating healthcare information management with emerging intelligent multi-agent system (MAS) technologies and grid computing, are intended to provide a solid basis for further advancements and assessments in this field by bridging and proposing a framework between the home-care sector and a flexible agent architecture throughout the healthcare domain.

  • 176.
    Ayub, Yasir
    et al.
    Blekinge Institute of Technology, School of Computing.
    Faruki, Usman
    Blekinge Institute of Technology, School of Computing.
    Container Terminal Operations Modeling through Multi agent based Simulation. 2009. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    This thesis proposes a multi-agent based hierarchical model for the operations of container terminals. We have divided our model into four key agents that are involved in the respective sub-processes. The proposed agent allocation policies are recommended for different situations that may occur at a container terminal. A software prototype was developed which implements the hierarchical model. This web-based application is used to simulate the various marine-side operations in a case study of a container terminal in Sweden, adopting a multi-agent based simulation technique. Due to the increased use of container transportation, container terminals are experiencing difficulties in managing their operations. The software provides a decision support capability that lets terminal managers schedule and manage the operations effectively, while also visually presenting the time it takes to complete a process and its associated cost. Terminal managers need to implement certain policies to improve the management and operations of the container terminal. The policies are evaluated and tested under various cases to provide a comparative overview. The results of the simulation experiments indicate that the waiting time for arriving vessels decreases when in a queue with more than three vessels arriving on the same day.

  • 177.
    Azam, Muhammad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Ahmad, Luqman
    Blekinge Institute of Technology, School of Computing.
    A Comparative Evaluation of Usability for the iPhone and iPad. 2011. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Many everyday systems and products seem to be designed with little regard to usability. This leads to frustration, wasted time and errors. The usability of a product is therefore important for its survival in the market. In many previous studies the usability evaluation of the iPhone and the iPad was carried out individually, and very little work has been done on comparative usability evaluation. No study had been conducted on comparative usability evaluation measuring the performance of the iPhone versus the iPad in a controlled environment. In this research work, the authors performed a comparative usability evaluation and measured the performance of the iPhone and iPad on selected applications, considering young users as well as elderly users. Another objective of this study was to identify usability issues in the performance of the iPhone and iPad. Survey and experiment techniques were used to achieve the defined objectives. The survey questionnaire consisted of 42 statements that covered different usability aspects. The objectives of the survey study were to validate the issues identified in the literature study, identify new issues and measure the significant differences in user opinions of the iPhone and iPad. The experiment studies helped to measure the performance differences between the devices for the three user groups (novice users, experienced users, elderly users) and among the groups for each device. A further objective was to measure the satisfaction level of the participating users with the iPhone and iPad. The experiment was performed in a controlled environment. In total six tasks (two tasks per application) were defined, and each participant performed the same tasks on both devices. Generally, the authors found that the participants performed better on the iPad, with lower error rates, as compared to the iPhone.

  • 178.
    Azam, Muhammad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Hussain, Izhar
    Blekinge Institute of Technology, School of Computing.
    The Role of Interoperability in eHealth. 2009. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    The lack of interoperability in systems and services has long been recognized as one of the major challenges to the wider implementation of eHealth applications. The opportunities and positive benefits of achieving interoperability are considerable, whereas various barriers and challenges act as impediments. The purpose of this study was to investigate interoperability among different health care organizations. The knowledge from this study should help health care organizations understand interoperability problems. In the first phase, a literature review identified interoperability challenges in Sweden and other EU countries. On the basis of the findings, interviews were conducted to learn about strategies and planning for interoperability in health care organizations. After analysis of the interviews, questionnaires were used to gather the opinions of different medical IT administrators and health professionals. From the analysis of the interviews and questionnaires, the authors find that adopting a common eHealth standard, common systems, a common medical language, and ensuring the security of patients’ health record information could be implemented in health organizations in Sweden and other EU countries.

  • 179.
    Azhar, Muhammad Saad Bin
    et al.
    Blekinge Institute of Technology, School of Computing.
    Aslam, Ammad
    Blekinge Institute of Technology, School of Computing.
    Multiple Coordinated Information Visualization Techniques in Control Room Environment. 2009. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis.
    Abstract [en]

    Presenting large amounts of multivariate data is not a simple problem. When multiple correlated variables are involved, it becomes difficult to comprehend the data using traditional methods. Information visualization techniques provide an interactive way to present and analyze such data. This thesis has been carried out at ABB Corporate Research, Västerås, Sweden. The use of Parallel Coordinates and Multiple Coordinated Views has been suggested to realize interactive reporting and trending of multivariate data for ABB’s Network Manager SCADA system. A prototype was developed and an empirical study was conducted to evaluate the suggested design and test it for usability from an actual industry perspective. With the help of this prototype and the evaluations carried out, we are able to achieve stronger results regarding the effectiveness and efficiency of the visualization techniques used. The results confirm that such interfaces are more effective, efficient and intuitive for filtering and analyzing multivariate data.
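
    Parallel coordinates, the central technique mentioned above, can be reproduced in a few lines for a small multivariate data set. The sketch below uses pandas/matplotlib with invented column names; it is unrelated to the ABB prototype itself.

    import matplotlib.pyplot as plt
    import pandas as pd
    from pandas.plotting import parallel_coordinates

    df = pd.DataFrame({
        "feeder":  ["A", "A", "B", "B", "C", "C"],        # class used for colouring the lines
        "voltage": [0.98, 1.01, 0.95, 0.99, 1.03, 0.97],
        "load_mw": [12.0, 14.5, 9.8, 11.2, 16.0, 13.3],
        "losses":  [0.40, 0.50, 0.30, 0.35, 0.60, 0.45],
    })
    parallel_coordinates(df, class_column="feeder", colormap="viridis")
    plt.title("Parallel coordinates over a toy multivariate data set")
    plt.show()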

  • 180.
    Aziz, Hussein
    Blekinge Institute of Technology, School of Computing.
    Streaming Video over Unreliable and Bandwidth Limited Networks. 2013. Doctoral thesis, comprehensive summary (Other academic).
    Abstract [en]

    The main objective of this thesis is to provide smooth video playout on mobile devices over wireless networks. The parameters that characterise the wireless channel include bandwidth variation, frame losses and outage time. These parameters may affect the quality of the video negatively, and mobile users may notice sudden stops during video playout, i.e., the picture is momentarily frozen, followed by a jump from one scene to a different one. This thesis focuses on eliminating frozen pictures and reducing the amount of video data that needs to be transmitted. In order to eliminate frozen scenes on the mobile screen, we propose three different techniques. In the first technique, the video frames are split into sub-frames; these sub-frames are streamed over different channels. In the second technique the sub-frames are “crossed” and sent together with other sub-frames from different positions in the streaming video sequence. If some sub-frames are lost during transmission, a reconstruction mechanism is applied on the mobile device to recreate the missing sub-frames. In the third technique, we propose a Time Interleaving Robust Streaming (TIRS) technique to stream the video frames in a different order. The benefit of this is to avoid losing a sequence of neighbouring frames. A missing frame from the streaming video is reconstructed based on the surrounding frames on the mobile device. In order to reduce the amount of video data streamed over limited-bandwidth channels, we propose two further techniques. These two techniques are based on identifying and extracting a high-motion region of the video frames, which we call the Region Of Interest (ROI); the other parts of the video frames are the non-Region Of Interest (non-ROI). The ROI is transmitted with high quality, whereas the non-ROI is interpolated from a number of reference frames. In the first technique the ROI is a fixed-size region; we considered four different types of ROI and three different scenarios. The scenarios are based on the position of the reference frames in the streaming frame sequence. In the second technique the ROI is identified based on the motion in the video frames, so the size, position and shape of the ROI differ from one video to another according to the video characteristics. The videos are coded using ffmpeg to study the effect of the proposed techniques on the encoding size. Subjective and objective metrics are used to measure the quality level of the reconstructed videos obtained from the proposed techniques. Mean Opinion Score (MOS) measurements are used as a subjective metric based on human opinions, while for the objective metric the Structural Similarity (SSIM) index is used to compare the similarity between the original frames and the reconstructed frames.

  • 181.
    Aziz, Hussein Muzahim
    et al.
    Blekinge Institute of Technology, School of Computing.
    Fiedler, Markus
    Blekinge Institute of Technology, School of Computing.
    Grahn, Håkan
    Blekinge Institute of Technology, School of Computing.
    Lundberg, Lars
    Blekinge Institute of Technology, School of Computing.
    Compressing Video Based on Region of Interest2013Conference paper (Refereed)
    Abstract [en]

    Real-time video streaming suffers from bandwidth limitations that cannot handle the high amount of video data. To reduce the amount of data to be streamed, we propose an adaptive technique to crop the important part of the video frames and drop the parts that are outside it; the important part is called the Region of Interest (ROI). The Sum of Absolute Differences (SAD) is computed for consecutive video frames on the server side to identify and extract the ROI. The ROIs are extracted from the frames that lie between reference frames, based on three scenarios. The scenarios are designed to position the reference frames in the video frame sequence. Linear interpolation is performed from the reference frames to reconstruct the parts that are outside the ROI on the mobile side. We evaluate the proposed approach for the three scenarios by looking at the size of the compressed videos and measure the quality of the videos by using the Mean Opinion Score (MOS). The results show that our technique significantly reduces the amount of data to be streamed over wireless networks while acceptable video quality is provided to the mobile viewers.
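
    A minimal sketch of the SAD-based motion step described above; the grayscale frames, the fixed threshold and the bounding-box extraction are assumptions made for illustration, and the paper's actual extraction may differ.

        # Sketch: locate a high-motion region (ROI) from the Sum of Absolute Differences
        # between two consecutive grayscale frames. The threshold choice is an assumption.
        import numpy as np

        def roi_from_sad(prev_frame, curr_frame, threshold=30):
            """Return (top, bottom, left, right) of the bounding box of high-motion pixels."""
            sad = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
            moving = sad > threshold                      # pixels with significant change
            if not moving.any():
                return None                               # no motion: no ROI in this frame pair
            rows = np.where(moving.any(axis=1))[0]
            cols = np.where(moving.any(axis=0))[0]
            return int(rows[0]), int(rows[-1]), int(cols[0]), int(cols[-1])

        # Example with synthetic frames: a bright block "appears" between the frames.
        prev = np.zeros((240, 320), dtype=np.uint8)
        curr = prev.copy()
        curr[100:140, 200:260] = 200
        print(roi_from_sad(prev, curr))                   # -> (100, 139, 200, 259)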

  • 182. Aziz, Hussein Muzahim
    et al.
    Fiedler, Markus
    Blekinge Institute of Technology, School of Computing.
    Grahn, Håkan
    Blekinge Institute of Technology, School of Computing.
    Lundberg, Lars
    Blekinge Institute of Technology, School of Computing.
    Eliminating the Effects of Freezing Frames on User Perceptive by Using a Time Interleaving Technique2012In: Multimedia Systems, ISSN 0942-4962, E-ISSN 1432-1882, Vol. 18, no 3, p. 251-262Article in journal (Refereed)
    Abstract [en]

    Streaming video over a wireless network faces several challenges such as high packet error rates, bandwidth variations, and delays, which could have negative effects on the video streaming, and the viewer will perceive a frozen picture for certain durations due to loss of frames. In this study, we propose a Time Interleaving Robust Streaming (TIRS) technique to significantly reduce the frozen video problem and provide a satisfactory quality for the mobile viewer. This is done by reordering the streaming video frames as groups of even and odd frames. The objective of streaming the video in this way is to avoid the loss of a sequence of neighbouring frames in case of a long sequence interruption. We evaluate our approach by using a user panel and mean opinion score (MOS) measurements, where the users observe three levels of frame losses. The results show that our technique significantly improves the smoothness of the video on the mobile device in the presence of frame losses, while the transmitted data are only increased by almost 9% (due to reduced time locality).
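
    A toy sketch of the even/odd reordering idea described above; the exact TIRS grouping and reconstruction in the article may differ, and frame indices stand in for actual frame data.

        # Sketch: interleave frames as even-indexed then odd-indexed groups, so a burst
        # loss during transmission hits alternating frames rather than a contiguous run.
        def interleave(frames):
            return frames[0::2] + frames[1::2]

        def deinterleave(received, total):
            n_even = (total + 1) // 2
            evens, odds = received[:n_even], received[n_even:]
            ordered = [None] * total
            ordered[0::2], ordered[1::2] = evens, odds
            return ordered

        frames = list(range(10))                 # frame indices 0..9 stand in for frame data
        sent = interleave(frames)                # [0, 2, 4, 6, 8, 1, 3, 5, 7, 9]

        # Simulate a burst loss of three consecutive transmitted items.
        received = [f if i not in (3, 4, 5) else None for i, f in enumerate(sent)]
        ordered = deinterleave(received, len(frames))
        # Lost frames are now isolated in display order and can be patched from neighbours.
        print(ordered)                           # [0, None, 2, 3, 4, 5, None, 7, None, 9]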

  • 183. Aziz, Hussein Muzahim
    et al.
    Fiedler, Markus
    Grahn, Håkan
    Lundberg, Lars
    Streaming Video as Space – Divided Sub-Frames over Wireless Networks2010Conference paper (Refereed)
    Abstract [en]

    Real time video streaming suffers from lost, delayed, and corrupted frames due to the transmission over error prone channels. As an effect of that, the user may notice a frozen picture on their screen. In this work, we propose a technique to eliminate the frozen video and provide a satisfactory quality to the mobile viewer by splitting the video frames into sub-frames. Multiple description coding (MDC) is used to generate multiple bitstreams based on frame splitting, transmitted over multiple channels. We evaluate our approach by using mean opinion score (MOS) measurements. MOS is used to evaluate our scenarios, where the users observe three levels of frame losses for real time video streaming. The results show that our technique significantly improves the video smoothness on the mobile device in the presence of frame losses during the transmission.
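
    A simple sketch of the frame-splitting idea behind the MDC descriptions mentioned above; splitting into even and odd rows and repeating rows for concealment are assumed concrete choices for the example, not details taken from the paper.

        # Sketch: split a frame into two sub-frames (even rows / odd rows) that can be
        # sent over separate channels; if one sub-frame is lost, the other is upsampled.
        import numpy as np

        def split_frame(frame):
            return frame[0::2, :], frame[1::2, :]         # two "descriptions"

        def merge(sub_even, sub_odd, height):
            frame = np.empty((height, sub_even.shape[1]), dtype=sub_even.dtype)
            frame[0::2, :], frame[1::2, :] = sub_even, sub_odd
            return frame

        def reconstruct_from_one(sub, height):
            # Crude concealment: repeat each received row to fill the missing rows.
            return np.repeat(sub, 2, axis=0)[:height, :]

        frame = np.arange(240 * 320, dtype=np.uint8).reshape(240, 320)
        even, odd = split_frame(frame)
        assert np.array_equal(merge(even, odd, 240), frame)   # lossless when both arrive
        approx = reconstruct_from_one(even, 240)              # frozen picture avoided, quality reduced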

  • 184. Aziz, Hussein Muzahim
    et al.
    Grahn, Håkan
    Lundberg, Lars
    Eliminating the Freezing Frames for the Mobile User over Unreliable Wireless Networks2009Conference paper (Refereed)
    Abstract [en]

    The main challenge of real time video streaming over a wireless network is to provide good quality of service (QoS) to the mobile viewer. However, wireless networks have a limited bandwidth that may not be able to handle the continuous video frame sequence, and video frames could also be dropped or corrupted during the transmission. This could severely affect the video quality. In this study we propose a mechanism to eliminate the frozen video and provide satisfactory quality to the mobile viewer. This is done by splitting the video frames into sub-frames that are transmitted over multiple channels. We present a subjective test, the Mean Opinion Score (MOS). MOS is used to evaluate our scenarios, where the users observe three levels of frame losses for real time video streaming. The results indicate that our technique significantly improves the perceived video quality.

  • 185. Aziz, Hussein Muzahim
    et al.
    Grahn, Håkan
    Lundberg, Lars
    Sub-Frame Crossing for Streaming Video over Wireless Networks2010Conference paper (Refereed)
    Abstract [en]

    Transmitting real time streaming video over a wireless network cannot guarantee that all the frames will be received by the mobile devices. The characteristics of a wireless network in terms of the available bandwidth, frame delay, and frame losses cannot be known in advance. In this work, we propose a new mechanism for streaming video over a wireless channel. The proposed mechanism prevents freezing frames on the mobile devices. This is done by splitting the video frame into two sub-frames and combining each with a sub-frame from a different sequence position in the streaming video. In case of a lost or dropped frame, there is still a possibility that the other half (sub-frame) will be received by the mobile device. The received sub-frames are reconstructed to their original shape. A rate adaptation mechanism is also highlighted in this work. We show that the server can skip up to 50% of the sub-frames while we are still able to reconstruct the received sub-frames and eliminate the freezing picture on the mobile device.

  • 186. Aziz, Hussein Muzahim
    et al.
    Lundberg, Lars
    Graceful degradation of mobile video quality over wireless network2009Conference paper (Refereed)
    Abstract [en]

    Real-time video transmission over wireless channels has become an important topic in wireless communication because of the limited bandwidth of wireless networks, which must handle a high amount of video frames. Video frames must arrive at the client before the playout time with enough time to display the contents of the frames. Real-time video transmission is particularly sensitive to delay as it has a strictly bounded end-to-end delay constraint; video applications impose stringent requirements on communication parameters, and frames that are lost or dropped due to excessive delay are the primary factors affecting user-perceived quality. In this study we investigate ways of obtaining a graceful and controlled degradation of the quality, by introducing redundancy in the frame sequence and compensating for this by limiting colour coding and resolution. The effect of this is a double streaming mechanism; in this way we obtain less freezing at the expense of limited colours and resolution. Our experiments are applied to scenarios where users observe three types of dropping load for real time video streaming. The mean opinion score is used as the analytical measurement tool to evaluate the video quality, and we demonstrate and argue that the proposed technique improves the user-perceived video quality.
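
    As an illustrative sketch of the redundancy-with-reduced-fidelity idea above; the concrete luminance weights and the factor-two subsampling are assumptions for the example, not parameters from the paper.

        # Sketch: derive a low-fidelity redundant copy of a frame by dropping colour
        # and halving the resolution, so the redundant stream costs far less bandwidth.
        import numpy as np

        def low_fidelity_copy(frame_rgb):
            # Luminance-only version of the frame (ITU-R BT.601 weights).
            gray = (0.299 * frame_rgb[..., 0]
                    + 0.587 * frame_rgb[..., 1]
                    + 0.114 * frame_rgb[..., 2]).astype(np.uint8)
            return gray[::2, ::2]                       # subsample by two in both dimensions

        frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
        backup = low_fidelity_copy(frame)
        print(frame.nbytes, backup.nbytes)              # 230400 vs 19200 bytes (~8% of the original)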

  • 187. Aziz, Maryam
    et al.
    Masum, M. E.
    Babu, M. J.
    Rahman, Suhaimi Ab
    Nordberg, Jörgen
    Blekinge Institute of Technology, School of Computing.
    Mobility impact on the end-to-end delay performance for VoIP over LTE2012In: Procedia Engineering, Coimbatore: Elsevier , 2012, Vol. 30, p. 491-498Conference paper (Refereed)
    Abstract [en]

    Long Term Evolution (LTE) is the last step towards the 4th generation of cellular networks. This revolution is necessitated by the unceasing increase in demand for high speed connections on LTE networks. This paper focuses on the performance evaluation of end-to-end (E2E) delay under variable mobility speed for VoIP (Voice over IP) in the LTE network. In the course of the E2E performance evaluation, using a simulation approach, three scenarios have been modeled in OPNET 16.0. The first one is the baseline network; of the other two, one consists solely of VoIP traffic and the other of FTP along with VoIP. E2E delay has been measured for these scenarios in various cases under varying mobility speed of the node. Simulation results have been studied and presented in terms of a comparative performance analysis of the three network scenarios. In light of the result analysis, the performance quality of a VoIP network (with and without the presence of additional network traffic) in LTE has been determined and discussed. The simulation results for the baseline (non-congested) VoIP network, the congested VoIP network and the congested VoIP with FTP network show that as the speed of the node is gradually increased, the E2E delay slightly increases.

  • 188.
    Aziz, Md. Tariq
    et al.
    Blekinge Institute of Technology, School of Computing.
    Islam, Mohammad Saiful
    Blekinge Institute of Technology, School of Computing.
    Performance Evaluation of Real–Time Applications over DiffServ/MPLS in IPv4/IPv6 Networks2011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Over the last years, we have witnessed a rapid deployment of real-time applications on the Internet as well as many research works about Quality of Service (QoS), in particular for IPv4 (Internet Protocol version 4). The inevitable exhaustion of the remaining IPv4 address pool has become progressively evident. As the evolution of the Internet Protocol (IP) continues, the deployment of IPv6 QoS is underway. Today, there is limited experience in the deployment of QoS for IPv6 traffic in MPLS backbone networks in conjunction with DiffServ (Differentiated Services) support. DiffServ itself does not have the ability to control the traffic on an end-to-end path when a number of links of the path are congested. In contrast, MPLS Traffic Engineering (TE) is able to control the traffic and can set up an end-to-end routing path before data is forwarded. From the evolution of IPv4 QoS solutions, we know that the integration of DiffServ and MPLS TE satisfies the guaranteed QoS requirements for real-time applications. This thesis presents a QoS performance study of real-time applications such as voice and video conferencing over DiffServ with or without MPLS TE in IPv4/IPv6 networks using the Optimized Network Engineering Tool (OPNET). This thesis also studies the interaction of Expedited Forwarding (EF) and Assured Forwarding (AF) traffic aggregation and link congestion, as well as the effect on various performance metrics such as packet end-to-end delay, packet delay variation, queuing delay, throughput and packet loss. The effectiveness of the DiffServ and MPLS TE integration in IPv4/IPv6 networks is illustrated and analyzed. The thesis shows that IPv6 experiences more delay and worse loss performance than its IPv4 counterpart.

  • 189.
    AZIZ, YASSAR
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    ASLAM, MUHAMMAD NAEEM
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Traffic Engineering with Multi-Protocol Label Switching, Performance Comparison with IP networks2008Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Traffic Engineering (TE) is the discipline that deals with the geometric design planning and traffic operation of networks, network devices and relationships of routers for the transportation of data. TE is the feature of network engineering which concentrates on problems of performance optimization of operational networks. It involves techniques and the application of knowledge to attain performance objectives, which include the movement of data through the network, reliability, planning of network capacity and efficient use of network resources. This thesis addresses the problems of traffic engineering and suggests a solution by using the concept of Multi-Protocol Label Switching (MPLS). We have performed simulations in a MATLAB environment to compare the performance of MPLS against an IP network in a simulated environment. MPLS is a modern technique for forwarding network data. It broadens routing with path control and packet forwarding. In this thesis MPLS is evaluated on the basis of its performance and efficiency in sending data from source to destination. A MATLAB based simulation tool is developed to compare MPLS with an IP network in a simulated environment. The results show the performance of the MPLS network in comparison to the IP network.

  • 190.
    Babaeeghazvini, Parinaz
    Blekinge Institute of Technology, School of Engineering.
    EEG enhancement for EEG source localization in brain-machine speller2013Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    A Brain-Computer Interface (BCI) is a system for communicating with the external world through brain activity. The brain activity is measured by Electro-Encephalography (EEG) and then processed by a BCI system. EEG source reconstruction could be a way to improve the accuracy of EEG classification in an EEG-based brain–computer interface (BCI). In this thesis, BCI methods were applied to derived sources whose EEG enhancement made it possible to obtain more accurate EEG detection, and brought a new application to BCI technology: recognition of imagined letter writing from brain waves. The BCI system enables people to write and type letters by their brain activity (EEG). To this end, the first part of the thesis is dedicated to EEG source reconstruction techniques to select the most optimal EEG channels for task classification purposes. For this reason, the changes in EEG signal power from the rest state to the motor imagery task were used to find the location of an active single equivalent dipole. Implementing an inverse problem solution on the power changes with the Multiple Sparse Priors (MSP) method generated a scalp map whose fitting showed the localization of the EEG electrodes. Having the optimized locations, the secondary objective was to choose the most optimal EEG features and rhythm for an efficient classification. This became possible by feature ranking with 1-Nearest Neighbor leave-one-out. The feature vectors were computed by applying the combined multitaper and Welch (pwelch) methods. The features were classified by several methods: a normal-densities-based quadratic classifier (qdc), a k-nearest neighbor classifier (knn), Mixture of Gaussians classification, and a neural network classifier trained using back-propagation. Results show that the selected features and classifiers are able to recognize the imagination of writing letters of the alphabet with high accuracy.
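
    A schematic sketch of the feature-extraction and classification pipeline outlined above, using Welch power spectra as features and a 1-nearest-neighbour classifier with leave-one-out evaluation; the multitaper step, the MSP source localization and the actual EEG recordings are omitted, and all numbers below are invented placeholders.

        # Sketch: Welch-PSD features from EEG epochs plus a k-NN classifier,
        # a simplified stand-in for the pipeline described in the thesis.
        import numpy as np
        from scipy.signal import welch
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)
        fs = 256                                             # sampling rate in Hz (assumed)
        epochs = rng.standard_normal((40, fs * 2))           # 40 fake 2-second single-channel epochs
        labels = np.repeat([0, 1], 20)                       # rest vs. motor imagery

        def psd_features(epoch):
            freqs, pxx = welch(epoch, fs=fs, nperseg=fs)
            band = (freqs >= 8) & (freqs <= 30)              # mu/beta band often used in motor imagery
            return np.log(pxx[band])

        X = np.array([psd_features(e) for e in epochs])

        # Leave-one-out evaluation of a 1-nearest-neighbour classifier, as in the feature ranking step.
        clf = KNeighborsClassifier(n_neighbors=1)
        acc = cross_val_score(clf, X, labels, cv=LeaveOneOut()).mean()
        print(f"LOO accuracy on random data: {acc:.2f}")     # ~0.5, since this data carries no signal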

  • 191.
    Babar, Shahzad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mehmood, Aamer
    Blekinge Institute of Technology, School of Computing.
    Enhancing Accessibility of Web Based GIS Applications through User Centered Design2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Web accessibility emerged as a problem when disabled and elderly people started interacting with web contents soon after the inception of the World Wide Web. When web based GIS applications appeared on the web and the number of users of these kinds of applications increased, these applications faced similar accessibility problems. The intensity of web accessibility problems in GIS based applications has increased rapidly during recent years due to the extensive interaction of users with maps. Web accessibility problems faced by users of GIS applications are identified by content evaluation and user interaction. Users are involved in the identification of accessibility problems because guidelines and automated tools are not sufficient for that purpose. A User Centered Design (UCD) approach is used to include users in the development process, and this has also helped in identifying the accessibility problems of the users at early stages. This thesis report identifies the accessibility issues in web based GIS applications by content evaluation and user interaction evaluation. MapQuest, a web based GIS application, is taken as a case study to identify the web accessibility problems in GIS applications. The report also studies how the accessibility of web based GIS applications can be enhanced by using a UCD approach in the development process of GIS applications.

  • 192. Baca, Dejan
    Automated static code analysis: A tool for early vulnerability detection2009Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Software vulnerabilities are added into programs during their development. Architectural flaws are introduced during planning and design, while implementation faults are created during coding. Penetration testing is often used to detect these vulnerabilities. This approach is expensive because it is performed late in development and any correction would increase lead-time. An alternative would be to detect and correct vulnerabilities in the phase of development where they are the least expensive to detect and correct. Source code audits have often been suggested and used to detect implementation vulnerabilities. However, manual audits are time consuming and require extended expertise to be efficient. A static code analysis tool could achieve the same results as a manual audit but at a fraction of the time. Through a set of case studies and experiments at Ericsson AB, this thesis investigates the technical capabilities and limitations of using a static analysis tool as an early vulnerability detector. The investigation is extended to the human factor by examining how developers interact with and use the static analysis tool. The contributions of this thesis include the identification of the tool's capabilities so that further security improvements can focus on other types of vulnerabilities. By using static analysis early in development, possible cost saving measures are identified. Additionally, the thesis presents the limitations of static code analysis, the most important being the incorrect warnings that are reported by static analysis tools. In addition, a development process overhead was deemed necessary to successfully use static analysis in an industry setting.

  • 193.
    Baca, Dejan
    Blekinge Institute of Technology, School of Computing.
    Developing Secure Software: in an Agile Process2012Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background: Software developers are facing increased pressure to lower development time, release new software versions more frequently to customers and adapt to a faster market. This new environment forces developers and companies to move from a plan based waterfall development process to a flexible agile process. By minimizing the pre-development planning and instead increasing the communication between customers and developers, the agile process tries to create a new, more flexible way of working. This new way of working allows developers to focus their efforts on the features that customers want. With increased connectivity and faster feature releases, the security of the software product is stressed. To develop secure software, many companies use security engineering processes that are plan heavy and inflexible. These two approaches are each other's opposites and they directly contradict each other. Objective: The objective of the thesis is to evaluate how to develop secure software in an agile process; in particular, which existing best practices can be incorporated into an agile project and still provide the same benefit as if the project were using a waterfall process, and how the best practices can be incorporated and adapted to fit the process while still measuring the improvement. Some security engineering concepts are useful but the best practice is not agile compatible and would require extensive adaptation to integrate with an agile project. Method: The primary research method used throughout the thesis is case studies conducted in a real industry setting. As secondary methods for data collection a variety of approaches have been used, such as semi-structured interviews, workshops, study of literature, and use of historical data from the industry. Results: The security engineering best practices were investigated through a series of case studies. The basic compatibility between agile and security engineering was assessed in literature, by developers and in practical studies. The security engineering best practices were grouped based on their purpose and their compatibility with the agile process. One well known and popular best practice, automated static code analysis, was thoroughly investigated for its usefulness, deployment and risks of use as part of the process. For the risk analysis practices, a novel approach was introduced and improved. As such, a way of adapting existing practices to agile is proposed. Conclusion: With regard to agile and security engineering, we did not find that any of the investigated processes was agile compatible. Agile is reaction driven and adapts to change, while the security engineering processes are proactive and try to prevent threats before they happen. To develop secure software in an agile process the developers should adopt and adapt key concepts from security engineering. These changes will affect the flexibility of the agile process but they are a necessity if developers want the same software security state that security engineering processes can provide.

  • 194. Baca, Dejan
    Identifying Security Relevant Warnings from Static Code Analysis Tools through Code Tainting2010Conference paper (Refereed)
    Abstract [en]

    Static code analysis tools are often used by developers as early vulnerability detectors. Due to their automation they are less time-consuming and error-prone than manual reviews. However, they produce large quantities of warnings that developers have to manually examine and understand. In this paper, we look at a solution that makes static code analysis tools more useful as early vulnerability detectors. We use flow-sensitive, interprocedural and context-sensitive data flow analysis to determine the point of user input and its migration through the source code to the actual exploit. By determining a vulnerability's point of entry we lower the number of warnings a tool produces and we provide the developer with more information about why this warning could be a real security threat. We use our approach in three different ways depending on which tool we examined. First, with the commercial static code analysis tool Coverity, we reanalyze its results and create a set of warnings that are specifically relevant from a security perspective. Second, we altered the open source analysis tool FindBugs to only analyze code that has been tainted by user input. Third, we created our own analysis tool that focuses on XSS vulnerabilities in Java code.
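
    A deliberately tiny sketch of the tainting idea described above; a real implementation such as the paper's is flow-, context- and interprocedurally sensitive, whereas this toy only propagates taint through a flat list of statements, and the source/sink names are invented.

        # Toy taint propagation: variables assigned from user-input "sources" stay tainted
        # through assignments; a warning is only security-relevant if tainted data reaches a sink.
        SOURCES = {"read_request_param"}          # hypothetical user-input functions
        SINKS = {"exec_sql"}                      # hypothetical dangerous operations

        # Statements as (target, operation, arguments) triples for a single function.
        statements = [
            ("name",  "call",   ("read_request_param",)),   # name = read_request_param()
            ("greet", "concat", ("'Hello '", "name")),      # greet = "Hello " + name
            ("n",     "const",  ("42",)),                    # n = 42
            (None,    "call",   ("exec_sql", "greet")),      # exec_sql(greet)  <- tainted sink
            (None,    "call",   ("exec_sql", "n")),          # exec_sql(n)      <- clean sink
        ]

        tainted = set()
        for target, op, args in statements:
            if op == "call" and args[0] in SOURCES:
                tainted.add(target)
            elif op == "call" and args[0] in SINKS:
                hit = [a for a in args[1:] if a in tainted]
                print("security-relevant" if hit else "ignore", args)
            elif any(a in tainted for a in args):
                tainted.add(target)               # taint propagates through assignments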

  • 195. Baca, Dejan
    et al.
    Boldt, Martin
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Carlsson, Bengt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Jacobsson, Andreas
    A Novel Security-Enhanced Agile Software Development Process Applied in an Industrial Setting2015In: Proceedings 10th International Conference on Availability, Reliability and Security ARES 2015, IEEE Computer Society Digital Library, 2015Conference paper (Refereed)
    Abstract [en]

    A security-enhanced agile software development process, SEAP, is introduced in the development of a mobile money transfer system at Ericsson Corp. A specific characteristic of SEAP is that it includes a security group consisting of four different competences, i.e., security manager, security architect, security master and penetration tester. Another significant feature of SEAP is an integrated risk analysis process. In analyzing risks in the development of the mobile money transfer system, a general finding was that SEAP either solves risks that were previously postponed or solves a larger proportion of the risks in a timely manner. The previous software development process, i.e., the baseline process of the comparison outlined in this paper, required 2.7 employee hours spent for every risk identified in the analysis process compared to, on the average, 1.5 hours for the SEAP. The baseline development process left 50% of the risks unattended in the software version being developed, while SEAP reduced that figure to 22%. Furthermore, SEAP increased the proportion of risks that were corrected from 12.5% to 67.1%, i.e., more than a five times increment. This is important, since an early correction may avoid severe attacks in the future. The security competence in SEAP accounts for 5% of the personnel cost in the mobile money transfer system project. As a comparison, the corresponding figure, i.e., for security, was 1% in the previous development process.

  • 196. Baca, Dejan
    et al.
    Carlsson, Bengt
    Lundberg, Lars
    Evaluating the Cost Reduction of Static Code Analysis for Software Security2008Conference paper (Refereed)
    Abstract [en]

    Automated static code analysis is an efficient technique to increase the quality of software during early development. This paper presents a case study in which mature software with known vulnerabilities is subjected to a static analysis tool. The value of the tool is estimated based on reported failures from customers. An average of 17% cost savings would have been possible if the static analysis tool was used. The tool also had a 30% success rate in detecting known vulnerabilities and at the same time found 59 new vulnerabilities in the three examined products.

  • 197. Baca, Dejan
    et al.
    Petersen, Kai
    Carlsson, Bengt
    Lundberg, Lars
    Static Code Analysis to Detect Software Security Vulnerabilities: Does Experience Matter?2009Conference paper (Refereed)
    Abstract [en]

    Code reviews with static analysis tools are today recommended by several security development processes. Developers are expected to use the tools' output to detect the security threats they themselves have introduced in the source code. This approach assumes that all developers can correctly identify a warning from a static analysis tool (SAT) as a security threat that needs to be corrected. We have conducted an industry experiment with a state of the art static analysis tool and real vulnerabilities. We have found that average developers do not correctly identify the security warnings and only developers with specific experiences are better than chance in detecting the security vulnerabilities. Specific SAT experience more than doubled the number of correct answers and a combination of security experience and SAT experience almost tripled the number of correct security answers.

  • 198.
    Bachu, Rajesh
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    A framework to migrate and replicate VMware Virtual Machines to Amazon Elastic Compute Cloud: Performance comparison between on premise and the migrated Virtual Machine2015Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Context Cloud Computing is the new trend in the IT industry. Traditionally, obtaining servers was quite time consuming for companies. The whole process of researching what kind of hardware to buy, getting budget approval, purchasing the hardware and getting access to the servers could take weeks or months. In order to save time and reduce expenses, most companies are moving towards the cloud. One of the well-known cloud services is Amazon Elastic Compute Cloud (EC2). Amazon EC2 makes it easy for companies to obtain virtual servers (known as compute instances) in a cloud quickly and inexpensively. Another advantage of using Amazon EC2 is the flexibility it offers, so companies can even import/export the Virtual Machines (VM) that they have built which meet the company's IT security, configuration, management and compliance requirements into Amazon EC2.

    Objectives In this thesis, we investigate importing a VM running on VMware into Amazon EC2. In addition, we make a performance comparison between a VM running on VMware and the VM with same image running on Amazon EC2.

    Methods Case study research has been done to select a persistent method to migrate VMware VMs to Amazon EC2. In addition, experimental research is conducted to measure the performance of the Virtual Machine running on VMware and compare it with the same Virtual Machine running on EC2. We measure the performance in terms of CPU and memory utilization as well as disk read/write speed using well-known open-source benchmarks from the Phoronix Test Suite (PTS).

    Results Investigation of importing VM snapshots (VMDK, VHD and RAW formats) to EC2 was done using three methods provided by AWS. Comparison of performance was done by running each benchmark 25 times on each Virtual Machine.

    Conclusions Importing the VM to EC2 was successful only with the RAW format, and replication was not successful as AWS installs some software and drivers while importing the VM to EC2. The migrated EC2 VM performs better than the on-premise VMware VM in terms of CPU and memory utilization and disk read/write speed.
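
    A small sketch of how the repeated benchmark runs described above can be summarized and compared between the two environments; the benchmark values below are placeholders, not the Phoronix Test Suite results reported in the thesis.

        # Sketch: summarize repeated benchmark runs from the two environments and
        # report the relative difference (placeholder numbers, not thesis results).
        import statistics

        onprem_runs = [212.4, 215.1, 213.8, 214.0, 212.9]   # e.g. seconds per run (lower is better)
        ec2_runs    = [198.7, 199.5, 197.9, 198.2, 199.0]   # in the thesis, 25 runs each were used

        def summarize(name, runs):
            mean, stdev = statistics.mean(runs), statistics.stdev(runs)
            print(f"{name}: mean={mean:.1f} stdev={stdev:.2f}")
            return mean

        m_onprem = summarize("on-premise VMware", onprem_runs)
        m_ec2 = summarize("Amazon EC2", ec2_runs)
        print(f"EC2 is {100 * (m_onprem - m_ec2) / m_onprem:.1f}% faster on this (fabricated) benchmark")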

  • 199. Bachu, Yashwanth
    Packaging Demand Forecasting in Logistics using Deep Neural Networks2019Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Background: Logistics have a vital role in supply chain management, and those logistics operations are dependent on the availability of packaging material for packing goods and material to be shipped. Forecasting packaging material demand for a long period of time will help the organization plan to meet the demand. Using time-series data with Deep Neural Networks (DNN) for long term forecasting is proposed for this research. Objectives: This study aims to identify the DNNs used in forecasting packaging demand and in similar problems in terms of data, i.e., data similar to that available at the organization (Volvo); to identify the best-practiced approach for long-term forecasting; and to combine that approach with the identified and selected DNNs for forecasting. The end objective of the thesis is to suggest the best DNN model for packaging demand forecasting. Methods: An experiment is conducted to evaluate the DNN models selected for demand forecasting. Three models are selected by a preliminary systematic literature review. Another systematic literature review is performed in parallel to identify metrics for evaluating the models' performance. Results from the preliminary literature review were instrumental in performing the experiment. Results: The three models observed in this study perform well, with reasonable forecast values. However, based on the type and amount of historical data the models were given to learn from, the three models differ only slightly in their forecasting performance. Comparisons are made with the different measures selected by the literature review. For a better understanding of the impact of batch size on model performance, the three models were trained with two different batch sizes. Conclusions: The proposed models produce usable forecasts of packaging demand for planning the next 52 weeks (~1 year). Results show that by adopting DNNs for forecasting, reliable packaging demand can be forecasted from time series data for packaging material. The combination of CNN-LSTM performs better than the respective individual models by a small margin. Extending the forecasting to the granular level of the supply chain (individual suppliers and plants) will benefit the organization by controlling inventory and avoiding excess inventory.
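
    A compact sketch of the CNN-LSTM combination mentioned in the conclusions above; the layer sizes, lookback window and weekly 52-step horizon are assumptions made for illustration, and neither the thesis models nor the Volvo data are reproduced here.

        # Sketch: a small CNN-LSTM for multi-step time-series forecasting,
        # illustrating the model family compared in the thesis (hyperparameters assumed).
        import numpy as np
        from tensorflow.keras import Sequential
        from tensorflow.keras.layers import Conv1D, MaxPooling1D, LSTM, Dense, Input

        lookback, horizon = 104, 52          # use two years of weekly history to forecast one year

        model = Sequential([
            Input(shape=(lookback, 1)),                      # univariate weekly demand
            Conv1D(32, kernel_size=3, activation="relu"),    # local pattern extraction
            MaxPooling1D(pool_size=2),
            LSTM(64),                                        # longer-range temporal dependencies
            Dense(horizon),                                  # 52 forecast steps at once
        ])
        model.compile(optimizer="adam", loss="mse")

        # Fabricated training windows just to show the expected tensor shapes.
        X = np.random.rand(200, lookback, 1).astype("float32")
        y = np.random.rand(200, horizon).astype("float32")
        model.fit(X, y, epochs=2, batch_size=16, verbose=0)
        print(model.predict(X[:1]).shape)    # -> (1, 52)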

  • 200.
    Bahrieh, Sara
    Blekinge Institute of Technology, School of Engineering.
    Sensor Central / Automotive Systems2013Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    How can objects detected by different devices be displayed in one coordinate system? Nowadays most vehicles are equipped with front and back sensors to help the driver in the driving process. Companies who provide this technology need an application which enables easy data fusion from these sensors and recording of the process. Besides the sensors' design, programming them is an important aspect. BASELABS Connect offers a solution in a user friendly way. Creating the Sensor Central component for BASELABS Connect is the main goal of this thesis. Sensor Central in BASELABS Connect requires six position variables for each sensor in order to map the objects from all sensors into one unique coordinate system. The aim of this thesis was to create such a component, mounted between all the sensors and the charting component, which converts object locations from the different sensors' positions into one coordinate system and is usable in other vehicles too.
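
    A minimal sketch of the coordinate transformation such a component performs, assuming the six position variables are x, y, z plus roll, pitch and yaw and a yaw-pitch-roll rotation convention; BASELABS Connect's actual API and conventions are not shown here.

        # Sketch: transform an object detected in a sensor's local frame into the common
        # vehicle frame, given the sensor's mounting pose (x, y, z, roll, pitch, yaw).
        import numpy as np

        def rotation_matrix(roll, pitch, yaw):
            cr, sr = np.cos(roll), np.sin(roll)
            cp, sp = np.cos(pitch), np.sin(pitch)
            cy, sy = np.cos(yaw), np.sin(yaw)
            Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
            Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
            Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
            return Rz @ Ry @ Rx                      # yaw-pitch-roll convention (an assumption)

        def sensor_to_vehicle(point_sensor, x, y, z, roll, pitch, yaw):
            R = rotation_matrix(roll, pitch, yaw)
            return R @ np.asarray(point_sensor) + np.array([x, y, z])

        # A front sensor mounted 2 m ahead of the vehicle origin, rotated 10 degrees in yaw.
        obj_in_vehicle = sensor_to_vehicle([5.0, 0.5, 0.0],
                                           x=2.0, y=0.0, z=0.5,
                                           roll=0.0, pitch=0.0, yaw=np.deg2rad(10))
        print(obj_in_vehicle)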
