  • 1.
    Abari, Farzad Foroughi
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Optimization of Audio Processing algorithms (Reverb) on ARMv6 family of processors, 2008. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Audio processing algorithms are increasingly used in cell phones, and today's customers are placing more demands on their phones. Features that were once the vanguard of mobile phone technology, such as MP3 playback or advanced audio effects, have become an integral part of medium- as well as low-end phones. At the same time, there is an endeavor to include as much quality as possible in products in order to compete in the market and satisfy users' needs. These requirements have been partly met by advances in hardware design and manufacturing technology. However, as new hardware reaches the market, the need arises for the competence to write efficient software that exploits the new features thoroughly and effectively. Even though compilers are keeping up with the new tide, room for hand-optimized code still exists. Within this scope, an effort was made in this thesis to partly cover the competence requirement at the Multimedia Section (part of Ericsson Mobile Platforms, EMP) to develop optimized code for new processors. Forging persistently ahead with new products, EMP has always incorporated the latest technology into its products, among which the ARMv6 family of processors has the central processing role in a number of upcoming products. To fully exploit the latest features provided by ARMv6, it was required to probe its new instruction set, among which the new media processing instructions are of utmost importance. In order to execute DSP-intensive algorithms (e.g. audio processing algorithms) efficiently, the implementation should be done in low-level code applying the available instruction set. ARMv6 comes with a number of new features in comparison with its predecessors; SIMD (Single Instruction Multiple Data) and VFP (Vector Floating Point) are the most prominent media processing improvements. Aligned with the thesis goals and guidelines, the reverb algorithm, which is among the most complicated audio features on a hand-held device, was probed. Its kernel parts were identified and implemented both in fixed point and floating point using the available hardware resources. In addition, the execution time and amount of code memory for each part were measured and provided in tables and charts for comparison purposes. Conclusions were finally drawn based on the developed code's efficiency over the ARM compiler's, as well as over existing code already developed and tailored to ARMv5 processors. The main criterion for optimization was execution time. Moreover, the quantization effect due to limited-precision fixed-point arithmetic was formulated and its effect on quality was elaborated. The outcomes clearly indicate that hand optimization of the kernel parts is superior to the compiler-optimized alternative in terms of both code memory and execution time. The results also confirmed the presumption that hand-optimized code using the new instruction set can improve efficiency by an average of 25%-50%, depending on the algorithm structure and its interaction with other parts of the audio effect. Despite its many drawbacks, the fixed-point implementation remains the dominant implementation for the majority of DSP algorithms on low-power devices.

    Download full text (pdf)
    FULLTEXT01
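    The entry above contrasts fixed-point and floating-point implementations of a reverb kernel and discusses quantization effects. As a minimal, hypothetical illustration (not the thesis code, and independent of any ARMv6 instruction set), the following Python sketch runs a single Schroeder-style feedback comb filter in both floating point and Q15 fixed point and compares the outputs; the delay length and feedback gain are arbitrary example values.

```python
import numpy as np

Q = 15                      # Q15 fixed-point format: 1 sign bit, 15 fractional bits
SCALE = 1 << Q

def comb_float(x, delay=441, g=0.7):
    """Feedback comb filter y[n] = x[n] + g * y[n - delay] in floating point."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        fb = y[n - delay] if n >= delay else 0.0
        y[n] = x[n] + g * fb
    return y

def comb_q15(x, delay=441, g=0.7):
    """Same filter with Q15 arithmetic: multiply, then shift right by 15 bits."""
    g_q = int(round(g * SCALE))
    x_q = np.clip((x * SCALE).astype(np.int32), -SCALE, SCALE - 1)
    y_q = np.zeros(len(x), dtype=np.int32)
    for n in range(len(x)):
        fb = int(y_q[n - delay]) if n >= delay else 0
        acc = int(x_q[n]) + ((g_q * fb) >> Q)      # quantization happens in the shift
        y_q[n] = max(-SCALE, min(SCALE - 1, acc))  # saturate like a DSP accumulator
    return y_q.astype(np.float64) / SCALE

rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(4410)                # short noise burst as test input
err = comb_float(x) - comb_q15(x)
print("max abs quantization error:", np.max(np.abs(err)))
```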
  • 2.
    Abdeen, Waleed
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Chen, Xingru
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Unterkalmsteiner, Michael
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    An approach for performance requirements verification and test environments generation, 2023. In: Requirements Engineering, ISSN 0947-3602, E-ISSN 1432-010X, Vol. 28, no 1, p. 117-144. Article in journal (Refereed)
    Abstract [en]

    Model-based testing (MBT) is a method that supports the design and execution of test cases by models that specify the intended behaviors of a system under test. While systematic literature reviews on MBT in general exist, the state of the art on modeling and testing performance requirements has seen much less attention. Therefore, we conducted a systematic mapping study on model-based performance testing. Then, we studied natural language software requirements specifications in order to understand which and how performance requirements are typically specified. Since none of the identified MBT techniques supported a major benefit of modeling, namely identifying faults in requirements specifications, we developed the Performance Requirements verificatiOn and Test EnvironmentS generaTion approach (PRO-TEST). Finally, we evaluated PRO-TEST on 149 requirements specifications. We found and analyzed 57 primary studies from the systematic mapping study and extracted 50 performance requirements models. However, those models do not achieve the goals of MBT, which are validating requirements, ensuring their testability, and generating the minimum required test cases. We analyzed 77 Software Requirements Specification (SRS) documents, extracted 149 performance requirements from those SRS, and illustrate that with PRO-TEST we can model performance requirements, find issues in those requirements and detect missing ones. We detected three not-quantifiable requirements, 43 not-quantified requirements, and 180 underspecified parameters in the 149 modeled performance requirements. Furthermore, we generated 96 test environments from those models. By modeling performance requirements with PRO-TEST, we can identify issues in the requirements related to their ambiguity, measurability, and completeness. Additionally, it allows the generation of parameters for test environments.

    Download full text (pdf)
    fulltext
  • 3.
    Abdelsamad, Deena
    Blekinge Institute of Technology, School of Engineering.
    Video Transmission Jerkiness Measure, 2013. Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    Digital video transmission is widely used nowadays in multimedia. Frame dropping, freezes and a reduced number of frames in the transmitted video are common symptoms of bad transmission quality. In order to assess the quality of transmission, a criterion is introduced in a model for a no-reference video jerkiness measure [3]. This model is different from the former models presented, as it depends on viewing conditions and video resolutions, so it is applicable to any frame size from QCIF to HD. The model uses simple mathematical equations of jerkiness and can be used for any video sequence [3]. A model of a reduced reference method (Qtransmission), which depends on a pre-measured jerkiness, is introduced as a suggestion for future work.

    Download full text (pdf)
    FULLTEXT01
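    The jerkiness model cited as [3] in the entry above is not reproduced here, but the symptom it quantifies, frames dropped or frozen during transmission, can be illustrated with a toy measurement. The sketch below (a hypothetical helper, not the thesis model) counts consecutive identical frames in a decoded sequence and reports a crude freeze ratio, assuming frames are given as equally sized numpy arrays.

```python
import numpy as np

def freeze_ratio(frames):
    """Fraction of frame transitions where the frame is identical to its predecessor.

    frames: list of 2-D numpy arrays (decoded luma frames of equal size).
    This is only a crude freeze/drop indicator, not a perceptual jerkiness model.
    """
    frozen = sum(
        1 for prev, cur in zip(frames[:-1], frames[1:]) if np.array_equal(prev, cur)
    )
    return frozen / max(len(frames) - 1, 1)

# Example: a 10-frame sequence where frames 4 and 5 repeat frame 3 (a short freeze).
rng = np.random.default_rng(1)
frames = [rng.integers(0, 256, size=(36, 64), dtype=np.uint8) for _ in range(10)]
frames[4] = frames[3].copy()
frames[5] = frames[3].copy()
print("freeze ratio:", freeze_ratio(frames))   # 2 frozen transitions out of 9
```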
  • 4.
    Abdsharifi, Mohammad Hossein
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Dhar, Ripan Kumar
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Service Management for P2P Energy Sharing Using Blockchain – Functional Architecture, 2022. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Blockchain has become one of the most revolutionary technologies of the 21st century. In recent years, the concern of world energy is not only sustainability but also being secure and reliable. Since information and energy security are the main concerns for present and future services, this thesis focuses on the challenge of how to trade energy securely against the background of distributed marketplaces. The core technology used in this thesis is the distributed ledger, specifically blockchain. Since this technology has recently gained much attention because of functionalities such as transparency, immutability, irreversibility and security, we propose a solution for the implementation of a secure peer-to-peer (P2P) energy trading network over a suitable blockchain platform. Furthermore, blockchain enables traceability of the origin of data, which is called data provenance.

    In this work, we applied blockchain technology to a peer-to-peer energy sharing or trading system where prosumers and consumers can trade their energy through a secure channel or network. Furthermore, service management functionalities such as security, reliability, flexibility, and scalability are achieved through the implementation.

    This thesis focuses on the current proposals for P2P energy trading using blockchain and how to select a suitable blockchain technique to implement such a P2P energy trading network. In addition, we provide an implementation of such a secure network under blockchain and proper management functions. The choices of the system model, blockchain technology, and consensus algorithm are based on a literature review, and the work was carried through to an experimental implementation where the feasibility of the system model was validated through the output results.

    Download full text (pdf)
    Service Management for P2P Energy Sharing Using Blockchain – Functional Architecture
  • 5.
    Abelsson, Sara
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Propagation Measurements at 3.5 GHz for WiMAX, 2007. Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Propagation measurements at the frequency 3.5 GHz for the WiMAX technology have been conducted. The purpose of these measurements is to accomplish a coverage analysis. The mathematical software package MATLAB has been used to analyze the collected data from the measurement campaign. Path loss models have also been used, and a comparison between these models and the collected data has been performed. An analysis prediction tool from the application WRAP has also been used in the comparison with the collected data. In this thesis, diff

    Download full text (pdf)
    FULLTEXT01
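    The thesis above compares measured 3.5 GHz data against path loss models. As a hedged illustration of what such a model looks like, the sketch below evaluates the standard log-distance path loss formula PL(d) = PL(d0) + 10 n log10(d/d0); the reference loss and path loss exponent are example values, not the measured ones from the campaign.

```python
import numpy as np

def log_distance_path_loss(d_m, pl_d0_db=68.0, d0_m=1.0, n=3.0):
    """Log-distance path loss in dB: PL(d) = PL(d0) + 10*n*log10(d/d0).

    pl_d0_db and n are illustrative values; a measurement campaign would fit them.
    """
    d_m = np.asarray(d_m, dtype=float)
    return pl_d0_db + 10.0 * n * np.log10(d_m / d0_m)

distances = np.array([10, 50, 100, 500, 1000])       # metres
for d, pl in zip(distances, log_distance_path_loss(distances)):
    print(f"d = {d:5d} m  ->  path loss ~ {pl:6.1f} dB")
```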
  • 6.
    Abghari, Shahrooz
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    García Martín, Eva
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Johansson, Christian
    NODA Intelligent Systems AB, SWE.
    Lavesson, Niklas
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Grahn, Håkan
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Trend analysis to automatically identify heat program changes, 2017. In: Energy Procedia, Elsevier, 2017, Vol. 116, p. 407-415. Conference paper (Refereed)
    Abstract [en]

    The aim of this study is to improve the monitoring and controlling of heating systems located at customer buildings through the use of a decision support system. To achieve this, the proposed system applies a two-step classifier to detect manual changes of the temperature of the heating system. We apply data from the Swedish company NODA, active in energy optimization and services for energy efficiency, to train and test the suggested system. The decision support system is evaluated through an experiment and the results are validated by experts at NODA. The results show that the decision support system can detect changes within three days after their occurrence and only by considering daily average measurements.

    Download full text (pdf)
    fulltext
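    The paper above detects manual heat-program changes from daily average measurements with a two-step classifier validated by experts at NODA; that classifier is not described in the abstract. As a loose, hypothetical stand-in for the idea of flagging changes from daily averages, the sketch below raises an alarm when a day's mean deviates from a trailing baseline by more than a threshold (window and threshold are arbitrary example values).

```python
import numpy as np

def detect_shifts(daily_means, window=7, threshold=2.0):
    """Flag days whose mean deviates from the trailing-window mean by more than `threshold`.

    Illustrative heuristic only, not the two-step classifier from the paper.
    """
    alarms = []
    for i in range(window, len(daily_means)):
        baseline = np.mean(daily_means[i - window:i])
        if abs(daily_means[i] - baseline) > threshold:
            alarms.append(i)
    return alarms

# Synthetic example: a heating setpoint raised by 3 degrees on day 20.
temps = np.full(30, 21.0) + np.random.default_rng(2).normal(0, 0.3, 30)
temps[20:] += 3.0
print("alarm days:", detect_shifts(temps))
```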
  • 7.
    Abualhana, Munther
    et al.
    Blekinge Institute of Technology, School of Computing.
    Tariq, Ubaid
    Blekinge Institute of Technology, School of Computing.
    Improving QoE over IPTV using FEC and Retransmission, 2009. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    IPTV (Internet Protocol Television), a new and modern concept of emerging technologies with a focus on providing cutting-edge high-resolution television, broadcast, and other fascinating services, is now easily available with the only requirement being high-speed internet. Every time a new technology is introduced locally, it faces tremendous problems, whether from the technological point of view of enhancing performance or when it comes to satisfying the customers. This cutting-edge technology has allowed researchers to experiment with different tools to provide better quality while focusing on existing tools. Our target in this dissertation is to present a few interesting facets of IPTV and to introduce the concept of a cache that can re-collect the packets travelling from the streaming server to the end user. This cache would be fixed in the access node, and on the basis of certain pre-assumed research work we can conclude how quickly retransmission can take place when the end user responds using the RTCP protocol and asks for the retransmission of corrupted or lost packets. In the last section, we plot our scenario with the streaming server on one side and the client (end user) on the other, and make assumptions on the basis of throughput, response time and traffic.

    Download full text (pdf)
    FULLTEXT01
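    The thesis above proposes a cache in the access node that keeps recently forwarded packets so lost ones can be retransmitted quickly when the client reports them over RTCP. The sketch below is a hypothetical, simplified model of that idea: a bounded dictionary keyed by RTP sequence number, serving NACK requests from the cache instead of the origin server.

```python
from collections import OrderedDict

class RetransmissionCache:
    """Keep the last `capacity` packets by sequence number for fast retransmission.

    Illustrative model only; a real access node would also handle wrap-around of
    16-bit RTP sequence numbers and cache timeouts.
    """
    def __init__(self, capacity=512):
        self.capacity = capacity
        self.packets = OrderedDict()          # seq -> payload bytes

    def store(self, seq, payload):
        self.packets[seq] = payload
        if len(self.packets) > self.capacity:
            self.packets.popitem(last=False)  # evict the oldest packet

    def handle_nack(self, seq):
        """Return the cached packet for a NACKed sequence number, or None on a miss."""
        return self.packets.get(seq)

cache = RetransmissionCache(capacity=4)
for seq in range(10):
    cache.store(seq, f"payload-{seq}".encode())
print(cache.handle_nack(8))   # b'payload-8' (still cached)
print(cache.handle_nack(2))   # None (already evicted)
```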
  • 8.
    Adabala, Yashwanth Venkata Sai Kumar
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Devanaboina, Lakshmi Venkata Raghava Sudheer
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    A Prevention Technique for DDoS Attacks in SDN using Ryu Controller Application, 2024. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Software Defined Networking (SDN) modernizes network control, offering streamlined management. However, its centralized structure makes it more vulnerable to distributed Denial of Service (DDoS) attacks, posing serious threats to network stability. This thesis explores the development of a DDoS attack prevention technique in SDN environments using the Ryu controller application. The research aims to address the vulnerabilities in SDN, particularly focusing on flooding and Internet Protocol (IP) spoofing attacks, which are a significant threat to network security. The study employs an experimental approach, utilizing tools like Mininet-VM (VirtualMachine), Oracle VM VirtualBox, and hping3 to simulate a virtual SDN environment and conduct DDoS attack scenarios. Key methodologies include packet sniffing and rule-based detection by integrating Snort IDS (Intrusion Detection System), which is critical for identifying and mitigating such attacks. The experiments demonstrate the effectiveness of the proposed prevention technique, highlighting the importance of proper configuration and integration of network security tools in SDN. This work contributes to enhancing the resilience of SDN architectures against DDoS attacks, offering insights into future developments in network security. 

    Download full text (pdf)
    A_Prevention_Technique_for_DDoS_Attacks_in_SDN_using_Ryu_Controller_Application
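    The thesis above integrates Snort IDS with a Ryu controller application to detect flooding and IP-spoofing attacks; those tools are not reproduced here. As a hedged, tool-independent illustration of the rate-based part of such detection, the sketch below counts packets per source over a sliding window and flags sources that exceed a threshold (the window length and limit are arbitrary example values, not the thesis configuration).

```python
import time
from collections import defaultdict, deque

class RateDetector:
    """Flag source addresses that send more than `limit` packets within `window` seconds."""
    def __init__(self, window=1.0, limit=100):
        self.window = window
        self.limit = limit
        self.history = defaultdict(deque)       # src -> deque of packet timestamps

    def observe(self, src, now=None):
        now = time.monotonic() if now is None else now
        q = self.history[src]
        q.append(now)
        while q and now - q[0] > self.window:   # drop timestamps outside the window
            q.popleft()
        return len(q) > self.limit              # True -> treat as a flooding suspect

det = RateDetector(window=1.0, limit=100)
# Simulate 150 packets from one host arriving within the same second.
flagged = any(det.observe("10.0.0.5", now=0.5) for _ in range(150))
print("10.0.0.5 flagged as flooding:", flagged)
```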
  • 9.
    Adamov, Alexander
    et al.
    Harkivskij Nacionalnij Universitet Radioelectroniki, UKR.
    Carlsson, Anders
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Cloud incident response model, 2016. In: Proceedings of 2016 IEEE East-West Design and Test Symposium, EWDTS 2016, Institute of Electrical and Electronics Engineers (IEEE), 2016. Conference paper (Refereed)
    Abstract [en]

    This paper addresses the problem of incident response in clouds. A conventional incident response model is formulated to be used as a basis for the cloud incident response model. Minimization of incident handling time is considered a key criterion of the proposed cloud incident response model, which can be achieved at the expense of embedding infrastructure redundancy into the cloud infrastructure, represented by Network and Security Controllers, and introducing a Security Domain for threat analysis and cloud forensics. These architectural changes are discussed and applied within the cloud incident response model. © 2016 IEEE.

  • 10.
    Adamov, Alexander
    et al.
    Kharkiv National University of Radioelectronics, UKR.
    Carlsson, Anders
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    The state of ransomware: Trends and mitigation techniques, 2017. In: Proceedings of 2017 IEEE East-West Design and Test Symposium, EWDTS 2017, Institute of Electrical and Electronics Engineers Inc., 2017, article id 8110056. Conference paper (Refereed)
    Abstract [en]

    This paper contains an analysis of the payload of popular ransomware for the Windows, Android, Linux, and MacOSX platforms. Namely, VaultCrypt (CrypVault), TeslaCrypt, NanoLocker, Trojan-Ransom.Linux.Cryptor, Android Simplelocker, OSX/KeRanger-A, WannaCry, Petya, NotPetya, Cerber, Spora, and Serpent ransomware were put under the microscope. A set of characteristics was proposed to be used for the analysis. The purpose of the analysis is a generalization of the collected data that describes the behavior and design trends of modern ransomware. The objective is to suggest ransomware threat mitigation techniques based on the obtained information. The novelty of the paper is the analysis methodology based on the chosen set of 13 key characteristics that helps to determine similarities and differences throughout the list of ransomware put under analysis. Most of the ransomware samples presented were manually analyzed by the authors, eliminating contradictions in descriptions of ransomware behavior published by different malware research laboratories through verification of the payload of the latest versions of ransomware. © 2017 IEEE.

  • 11.
    Adapa, Nagaswaroopa
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Bollu, Sravya
    Blekinge Institute of Technology, School of Engineering.
    Performance analysis of different adaptive algorithms based on acoustic echo cancellation, 2013. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    In modern telecommunication systems like hands-free and teleconferencing systems, the problem that arises during conversation is the creation of an acoustic echo. This problem degrades the quality of the information signal. All speech processing equipment, like noise-cancelling headphones and hearing aids, should be able to filter different kinds of interfering signals and produce a clear sound to the listener. Currently, echo cancellation is one of the most interesting and challenging tasks in any communication system. Echo is the delayed and degraded version of an original signal which travels back to its source after several reflections. Eliminating this effect without affecting the original quality of the speech remains a research challenge. Echo cancellation in voice communication is a process of removing the echo to improve the clarity and quality of the voice signals. In this thesis we mainly focus on acoustic echo cancellation in a closed room using adaptive filters. Acoustic echo cancellation with an adaptive filtering technique can substantially enhance speech quality in hands-free communication systems. The main aim of using adaptive algorithms for echo cancellation is to achieve a higher ERLE at a higher rate of convergence with low complexity. The adaptive algorithms NLMS, APA and RLS are implemented using MATLAB. These algorithms are tested with a simulation of an echo-producing environment using constant room dimensions, microphone and source positions. The performance of NLMS, APA and RLS is evaluated in terms of ERLE and misalignment. The results show that the RLS algorithm achieves good performance with more computational complexity compared with the NLMS and APA algorithms. The NLMS algorithm has very low computational complexity compared to the RLS and APA algorithms. The results are taken both for a speech signal and for noise as input, and are plotted in the results section.

    Download full text (pdf)
    FULLTEXT01
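    Of the three algorithms compared above, NLMS is the simplest. The sketch below is a minimal NLMS echo canceller in numpy, not the thesis implementation: a far-end signal is convolved with a short synthetic echo path to form the microphone signal, and the adaptive filter cancels the echo; filter length, step size, and the echo path are example values.

```python
import numpy as np

def nlms_echo_canceller(far_end, mic, taps=64, mu=0.5, eps=1e-6):
    """Normalized LMS: w <- w + mu * e * x / (||x||^2 + eps). Returns the error signal."""
    w = np.zeros(taps)
    err = np.zeros(len(mic))
    for n in range(taps, len(mic)):
        x = far_end[n - taps:n][::-1]          # most recent far-end samples first
        y_hat = w @ x                          # estimated echo
        err[n] = mic[n] - y_hat                # residual after cancellation
        w += mu * err[n] * x / (x @ x + eps)
    return err

rng = np.random.default_rng(3)
far_end = rng.standard_normal(20000)
echo_path = np.array([0.0, 0.5, 0.3, -0.2, 0.1])   # toy room impulse response
mic = np.convolve(far_end, echo_path, mode="full")[:len(far_end)]
residual = nlms_echo_canceller(far_end, mic)
erle = 10 * np.log10(np.mean(mic[5000:] ** 2) / np.mean(residual[5000:] ** 2))
print(f"ERLE after convergence ~ {erle:.1f} dB")
```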
  • 12.
    Adapa, Sasank Sai Sujan
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    APPLYING LEAN PRINCIPLES FOR PERFORMANCE ORIENTED SERVICE DESIGN OF VIRTUAL NETWORK FUNCTIONS FOR NFV INFRASTRUCTURE: Concepts of Lean, 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Context. Network Function Virtualization (NFV) was recently proposed by the European Telecommunications Standards Institute (ETSI) to improve network service flexibility through the virtualization of network services and applications that run on hardware. To virtualize network functions, the software is decoupled from the underlying physical hardware. NFV aims to transform industries by reducing capital investment in hardware through the use of commercial off-the-shelf (COTS) hardware. NFV enables rapid innovative growth in telecom services through software-based service deployment.

    Objectives. This thesis work aims to investigate how business organizations function and what roles are involved in defining a service relationship model. The work also aims to define a service relationship model and to validate it via a proof of concept using network function virtualization as a service. Finally, we apply lean principles to the defined service relationship model to reduce waste and investigate how lean helps the model to be proven performance-service oriented.

    Methods. The essence of this work is to make a business organization lean by investigating its actions and applying lean principles. To elaborate, this thesis work involves a study of papers from IEEE, TMF, IETF and Ericsson. It results in the modelling of a PoC by following a requirements analysis methodology and by applying lean principles to eliminate unnecessary processes which do not add any value.

    Results. The results of the work include a full-fledged service relationship model with three service levels and roles that fit into the requirement specifications of an NFV infrastructure. The results also show the service levels' functionalities and the relationships between the roles. It has also been observed that the services that need to be standardized are defined with syntax for ways to describe network functions. It is observed that lean principles benefit the service relationship model by reducing waste factors, thereby providing a PoC which is performance-service oriented.

    Conclusions. We conclude that the roles defined fit the service relationship model designed. Moreover, we conclude that the model can contain the flow of service by standardizing the sub-services and reducing waste interpreted with lean principles, and that there is a need for further proof of the model in full-scale industry trials. We also conclude the ways to describe network functions syntax, which follows the lean principles that are essential for sub-service standardization. The PoC defined can thus serve as an assurance for the NFV infrastructure.

    Download full text (pdf)
    fulltext
  • 13.
    Addu, Raj Kiran
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    Potuvardanam, Vinod Kumar
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    Effect of Codec Performance on Video QoE for videos encoded with Xvid, H.264 and WebM/VP8, 2014. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    In recent years, there has been significant growth in multimedia services such as mobile video streaming, Video-on-Demand and video conferencing. This has led to the development of various video coding techniques, aiming to deliver high quality video while using the available bandwidth efficiently. This upsurge in the usage of video applications has also made end-users more quality-conscious. In order to meet the users’ expectations, Quality of Experience (QoE) studies have gained utmost importance among both researchers and service providers. This thesis aims to compare the performance of the H.264/AVC, Xvid and WebM/VP8 video codecs in wired and wireless networks. The codec performance is evaluated for different packet loss and delay variation values. The evaluation of codec performance is done using both subjective and objective assessment methods. In the subjective assessment method, the evaluation of video codec performance is done using the ITU-T recommended Absolute Category Rating (ACR) method. Using this method, perceptual video quality ratings are taken from the users and then averaged to obtain the Mean Opinion Score. These scores are used to analyze the performance of the encoded videos with respect to the users’ perception. In addition to the subjective assessment method, the quality of the encoded video is also measured using an objective assessment method. The objective metric SSIM (Structural Similarity) is used to evaluate the performance of the encoded videos. Based on the results, it was found that for lower packet loss and delay variation values H.264 showed better results compared to Xvid and WebM/VP8, whereas WebM/VP8 outperformed Xvid and H.264 for higher packet loss and delay variation values. On the whole, H.264 and WebM/VP8 performed better than Xvid. It was also found that all three video codecs performed better in the wired network than in the wireless network.

    Download full text (pdf)
    FULLTEXT01
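    SSIM as used in the thesis is normally computed over local windows and averaged; the sketch below is a simplified single-window (global) SSIM between two grayscale frames, useful only to show the structure of the formula, with the standard constants for 8-bit images. It is not the thesis code.

```python
import numpy as np

def global_ssim(x, y, data_range=255.0):
    """Single-window SSIM; real implementations average SSIM over local windows."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(4)
ref = rng.integers(0, 256, size=(144, 176)).astype(np.uint8)      # QCIF-sized frame
degraded = np.clip(ref + rng.normal(0, 15, ref.shape), 0, 255).astype(np.uint8)
print("SSIM(ref, ref)      =", round(global_ssim(ref, ref), 4))    # 1.0 by construction
print("SSIM(ref, degraded) =", round(global_ssim(ref, degraded), 4))
```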
  • 14.
    Adebayo, Emmanuel
    Blekinge Institute of Technology.
    Performance Assessment of Networked Immersive Media in Mobile Health Applications with Emphasis on Latency, 2021. Independent thesis Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Cloud VR/AR/MR (Virtual Reality, Augmented Reality, and Mixed Reality) services represent a high-level architecture that combines large-scale computing resources in a data-center style setup to render VR/AR/MR services, using a combination of very high bandwidth, ultra-low latency, high throughput and the latest 5G (5th Generation) mobile networks to reach the end users.

    VR refers to a three-dimensional computer-generated virtual environment, made up of computers, which can be explored by people for real-time interaction. AR amplifies human perception of the real world through the overlapping of computer-generated graphics or interactive data on a real-world image for an enhanced experience.

    According to the Virtual Reality Society's account of the history of VR, it started with the 360-degree murals of the nineteenth century [18]. Historically, a live application of AR was displayed when Myron Kruger used a combination of video cameras and a projector in an interactive environment in 1974. In 1998, AR was put on live display with the casting of a virtual yellow line marker during an NFL game. However, personal and commercial use of VR/AR was made possible starting with the release of a DIY (Do It Yourself) headset called Google Cardboard in 2014 by Google, which made use of a smartphone for the VR experience. In 2014, Samsung also introduced Gear VR, which officially started the competition for VR devices. Subsequently, in 2014, Facebook acquired Oculus VR with the major aim of dominating the high-end spectrum of VR headsets [18]. Furthermore, wider adoption of AR was enhanced by the introduction of Apple's ARKit (Augmented Reality Kit), which serves as a development framework for AR applications for iPhones and iPads [18].

    The first application of VR devices in the health industry was made possible by health workers' need to visualize complex medical data during surgery and surgery planning in 1994. Since then, commercial production of VR devices and the availability of advanced networks and faster broadband have increased the adoption of VR services in the healthcare industry, especially in the planning of surgery and during surgery itself [16]. Overall, the wide availability of VR/AR terminals, displays, controllers, development kits, advanced networks and robust bandwidth has contributed to making VR and AR services valuable and important technologies in the areas of digital entertainment, information, games, health, the military and so on. However, the solutions or services needed for the technology require an advanced processing platform which in most cases is not cost-efficient in single-use scenarios.

    The devices, hardware and software required for the processing and presentation of immersive experiences are often expensive and dedicated to the current application itself. Technological improvement in realism and immersion means an increase in the cost of ownership, which often affects the cost-benefit consideration, leading to slower adoption of VR services [14] [15]. This is what has led to the development of cloud VR services, a form of data-center based system, which serves as a means of providing VR services to end users from the cloud anywhere in the world, using fast and stable transport networks. The content of the VR is stored in the cloud, after which the output in the form of audio-visuals is coded and compressed using suitable encoding technology, and thereafter transmitted to the terminals. The industry-wide acceptance of cloud VR services and technology has made access available on a pay-per-use basis, and hence access to the high processing capability offered, which is used in presenting a more immersive, imaginative and interactive experience to end users [11] [12]. However, cloud VR services have a major challenge in the form of network latency, introduced from cloud rendering down to the display terminal itself and most often governed by other performance indicators such as network bandwidth, coding technology, RTT (Round Trip Time) and so on [19]. This is the major problem which this thesis sets out to investigate.

    The research methodology used was a combination of empirical and experimental methods, using a quantitative approach, as it entails the generation of data in quantitative form available for quantitative analysis. The research questions are:

    Research Question 1 (RQ1): What are the latency related performance indicators of networked immersive media in mobile health applications?

    Research Question 2 (RQ2): What are the suitable network structures to achieve an efficient low latency VR health application?

    The answers obtained from the result analysis at the end of the simulation show that bandwidth, frame rate and resolution are crucial performance indicators for achieving the optimal latency required for a hitch-free cloud VR user experience, while the importance of other indicators such as resolution and coding standard cannot be overemphasized. A combination of edge and cloud architecture also proved to be more efficient and effective for achieving low-latency cloud VR application functionality.

    Conclusively, the answer to research question one is that the latency related performance indicators of networked immersive media in mobile health applications are bandwidth, frame rate, resolution and coding technology. For research question two, suitable network structures include edge networks, cloud networks and a combination of cloud and edge networks, but in order to achieve an optimally low-latency network for a cloud VR mobile health application in education, a combination of edge and cloud network architecture is recommended.

    Download full text (pdf)
    fulltext
  • 15.
    Adebomi, OYEKANLU Emmanuel
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mwela, JOHN Samson
    Blekinge Institute of Technology, School of Computing.
    Impact of Packet Losses on the Quality of Video Streaming, 2010. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    In this thesis, the impact of packet losses on the quality of received videos sent across a network that exhibits normal network perturbations such as jitter, delays and packet drops has been examined. The dynamic behavior of a normal network has been simulated using Linux and the Network Emulator (NetEm). People's perceptions of the quality of the received video were used in rating the quality of several videos with differing speeds. In accordance with the ITU guideline of using Mean Opinion Scores (MOS), the effects of packet drops were analyzed. Excel and Matlab were used as tools in analyzing people's opinions, which indicate the impact that different loss rates have on the transmitted videos. The statistical methods used for evaluation of the data are the mean and variance. We conclude that people's opinions converge when losses become extremely high on videos with highly variable scene changes.

    Download full text (pdf)
    FULLTEXT01
  • 16.
    Adeleke, Adesina
    Blekinge Institute of Technology, School of Management.
    How External Forces are influencing the Ebusiness strategy of MTN-Nigeria, 2009. Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    The Internet and e-business have had an enormous impact on many companies in Nigeria, and there has been much research on how e-business influences the environment, but little can be found on how the environment of a developing country like Nigeria influences e-business. In e-business, technology tells the business what can be done in smarter ways. Technology can not only make business more efficient but can also make business more effective in targeting and reaching markets; however, technology cannot enhance business in isolation, as there are other vital factors that equally impact business. This thesis presents an adapted version of the PESTEL (Political, Economic, Socio-cultural, Technological, Environmental, and Legal) framework, the so-called e-business PESTEL framework, as a method for structural analysis of macro-environment forces in the future. In addition to the PESTEL framework, Porter's five forces model was employed to analyse the industry forces that also influence MTNN's e-business strategy. The main goal of this research is to give an overview of the industry and macro-environment forces influencing the e-business strategy of MTN-Nigeria and the impact of future developments. The research methodology was explorative and descriptive. A further method for future analysis of the macro-environment influences and a suggestion on how to incorporate it in this research work is given. The e-business strategy of MTNN consists of four areas: e-procurement, e-collaboration (CRM), supply chain management and e-commerce. The influences found on the macro-environment level are political and socio-cultural forces, and on the industry level the bargaining power of customers and of suppliers of its products and services. The main recommendations are that MTN-Nigeria should add the e-business PESTEL framework described in this thesis to its e-business strategy check. Furthermore, MTNN should include environment analysis more extensively in its e-business strategy approach, as the factors in this research work shape the environment in which it carries out its business.

    Download full text (pdf)
    FULLTEXT01
  • 17.
    ADHIKARLA, VAMSI KIRAN
    Blekinge Institute of Technology, School of Engineering.
    3D VIDEO FORMAT CONVERSION, 2011. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    The main aim of this thesis work is to find and implement methods that convert Conventional Stereoscopic 3D Video (CSV) to Multiview Video (MVV). The work investigates different methods that can produce multiple views given a stereoscopic pair from a frame of a particular video sequence, and continues with the process of selecting the method that offers the best trade-off between quality and speed. In contrast to existing algorithms, this work disregards physical depth and instead focuses on pixel value correspondence. The intermediate view generation in this work is not considered a geometrical problem but a morphing problem. Different morphing algorithms (mesh, field and thin plate spline morphing techniques) are considered for conversion. The performance of each morphing algorithm is in turn compared using different correspondence matching techniques. The investigated methods aim to produce an arbitrary number of novel synthesized camera views from a sparse view set. The mesh morphing algorithm is found to be a better candidate in terms of signal-to-noise ratio, but it requires accurate correspondences at the edges of objects in a particular scene and also needs more execution time to generate a larger number of views. A new approach to field morphing has been introduced in this thesis work, which performs better in terms of execution time and is also found to produce intermediate views with a reasonable signal-to-noise ratio. This approach is observed to offer a good trade-off between speed and accuracy. The conversion has the advantage that it can be used as a decompression mechanism that produces the multiple views required for an autostereoscopic 3D display from a stereoscopic left and right pair. This approach also brings the benefit of backward compatibility, as present standards for CSV may be used to provide multiview 3D video to high-fidelity autostereoscopic 3D displays of the future. This work has applications in free-viewpoint television, video conferencing systems, etc.

    Download full text (pdf)
    FULLTEXT01
  • 18.
    Adidamu, Naga Shruti
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Bheemisetty, Shanmukha Sai
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Assessment of Ixia BreakingPoint Virtual Edition: Evolved Packet Gateway, 2018. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Download full text (pdf)
    fulltext
  • 19.
    Adolfsson, Henrik
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Svensson, Peter
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Design and implementation of the MMS portal, 2006. Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    MMS-enabled terminals on the market today are very complicated to use. It takes several steps to create a multi-slide MMS message with images and text. This discourages users from using it. To increase the usage of MMS, several companies provide web-based or stand-alone programs that allow users to create and send MMS messages from a regular computer. However, these editors have many limitations and are not user-friendly. This thesis describes the design and implementation of a user-friendly web-based MMS portal where users can create, edit and send MMS messages. The portal is integrated into Densitet's system for the development of mobile services. Conclusions that can be drawn from this work are that problems with MMS interoperability are mostly due to poor standardization. Different terminals support different types of image and sound formats, and to make the MMS portal user-friendly, format conversions of uploaded content had to be implemented. Also, the MMS portal only supports basic MMS functionality. If the MMS specification includes more audio and image formats and if MMS terminals are upgraded to handle these formats, sending MMS messages will be easier and mobile messaging will continue to grow.

    Download full text (pdf)
    FULLTEXT01
  • 20. Adolfsson, Stefan
    On Automatic Detection of Burn-through Using a Parametric Model, 1995. Report (Other academic)
  • 21. Adolfsson, Stefan
    Quality Monitoring in Pulsed GMA Welding Using Modern Signal Processing Methods, 1995. Licentiate thesis, comprehensive summary (Other academic)
  • 22. Adolfsson, Stefan
    Quality Monitoring in Robotised Short Circuiting GMA Welding, 1997. Report (Other academic)
    Abstract [en]

    This paper addresses the problem of automatically monitoring the weld quality produced by robotised short arc welding. A simple statistical change detection algorithm for the weld quality, the recursive Sequential Probability Ratio Test (SPRT), is used. The algorithm may equivalently be viewed as a cumulative sum (CUSUM) type test. The test statistic is based upon the variance of the amplitude of the weld voltage. The performance of the algorithm is evaluated using experimental data. The results obtained from the algorithm indicate that it is possible to detect changes in the weld quality automatically and on-line.

    Download full text (pdf)
    FULLTEXT01
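    Several of the welding papers listed here (entries 22 and 24-26) use a repeated SPRT, equivalent to a CUSUM-type test, on the variance of the weld-voltage amplitude, where a drop in variance signals non-optimal welding. The exact test statistic is not given in the abstracts; the sketch below is a generic one-sided CUSUM on windowed variance that raises an alarm when the variance falls below a fraction of a reference value, with all thresholds as example numbers.

```python
import numpy as np

def cusum_variance_drop(voltage, win=200, ref_var=1.0, drop=0.6, h=5.0):
    """One-sided CUSUM alarm when windowed variance falls below drop * ref_var.

    Generic illustration of a CUSUM-type monitor, not the repeated SPRT of the papers.
    Returns the sample index of the first alarm, or None.
    """
    s = 0.0
    for start in range(0, len(voltage) - win, win):
        v = np.var(voltage[start:start + win])
        s = max(0.0, s + (drop * ref_var - v))   # accumulates only while variance is low
        if s > h:
            return start + win
    return None

rng = np.random.default_rng(5)
good = rng.normal(0, 1.0, 20000)       # nominal welding: unit variance
bad = rng.normal(0, 0.5, 10000)        # degraded condition: reduced variance
signal = np.concatenate([good, bad])
print("alarm at sample:", cusum_variance_drop(signal))
```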
  • 23. Adolfsson, Stefan
    et al.
    Bahrami, Ali
    Bolmsjö, Gunnar
    Claesson, Ingvar
    Automatic quality monitoring in robotised GMA welding using a repeated sequential probability ratio test method, 1997. In: International Journal for the Joining of Materials, ISSN 0905-6866, Vol. 9, no 1, p. 2-8. Article in journal (Refereed)
  • 24. Adolfsson, Stefan
    et al.
    Bahrami, Ali
    Claesson, Ingvar
    A Sequential Probability Ratio Test Method for Quality Monitoring in Robotised GMA Welding, 1997. Conference paper (Refereed)
    Abstract [en]

    This paper deals with the problem of automatically monitoring the weld quality when welding with Gas Metal Arc (GMA) in short circuiting mode. Experiments with two different types of T-joints are performed in order to provoke optimal and non-optimal welding conditions. During the experiments, voltage and current are measured from the welding process. A simple statistical change detection algorithm for the weld quality, the repeated Sequential Probability Ratio Test (SPRT), is used. The algorithm can equivalently be viewed as a cumulative sum (CUSUM) type test. The test statistic is based upon the fluctuations of the amplitude of the weld voltage. It is shown that the fluctuations of the weld voltage amplitude decrease when the welding process is not operating under optimal conditions. The results obtained from the experiments indicate that it is possible to detect changes in the weld quality automatically and on-line.

  • 25. Adolfsson, Stefan
    et al.
    Bahrami, Ali
    Claesson, Ingvar
    Quality Monitoring in Robotised Welding using Sequential Probability Ratio test, 1996. Conference paper (Refereed)
    Abstract [en]

    This paper addresses the problem of automatically monitoring the weld quality produced by robotised short arc welding. A simple statistical change detection algorithm for the weld quality, the recursive sequential probability ratio test (SPRT), is used. The algorithm may equivalently be viewed as a cumulative sum (CUSUM) type test. The test statistic is based upon the variance of the amplitude of the weld voltage. It is shown that the variance of the weld voltage amplitude decreases when the welding process is not operating under optimal conditions. The performance of the algorithm is evaluated using experimental data. The results obtained from the algorithm indicate that it is possible to detect changes in the weld quality automatically and on-line.

  • 26. Adolfsson, Stefan
    et al.
    Bahrami, Ali
    Claesson, Ingvar
    Bolmsjö, Gunnar
    On-line quality monitoring in short circuit gas metal arc welding, 1999. In: Welding Journal, ISSN 0043-2296, Vol. 78, no 2, p. 59S-73S. Article in journal (Refereed)
    Abstract [en]

    This paper addresses the problems involved in the automatic monitoring of the weld quality produced by robotized short-arc welding. A simple statistical change detection algorithm for the weld quality, the repeated Sequential Probability Ratio Test (SPRT), was used. The algorithm may similarly be viewed as a cumulative sum (CUSUM) type test, and is well-suited to detecting sudden minor changes in the monitored test statistic. The test statistic is based on the variance of the weld voltage, wherein it will be shown that the variance decreases when the welding process is not operating under optimal conditions. The performance of the algorithm is assessed through the use of experimental data. The results obtained from the algorithm show that it is possible to detect changes in weld quality automatically and on-line.

  • 27. Adolfsson, Stefan
    et al.
    Ericson, Klas
    Grennberg, Anders
    Automatic Detection of Burn-through in GMA Welding Using a Parametric Model, 1996. In: Mechanical Systems & Signal Processing, ISSN 0888-3270, Vol. 10, no 5, p. 633-651. Article in journal (Refereed)
    Abstract [en]

    This paper addresses the problem of automatic detection of burn-through in weld joints. Gas metal arc (GMA) welding with pulsed current is used, and welding voltage and current are recorded. As short-circuits between the welding electrode and the work piece are common during burn-through, a short-circuit detector is developed to detect these events. To detect another specific characteristic of burn-through, a broadband long-lasting voltage component, this detector is combined with a square-law detector. This second detector is based on a non-linear modification of an autoregressive model with extra input (ARX model) of the welding process. The results obtained from this compound detector indicate that it is possible to detect burn-through in the welds automatically. The work also indicates that it is possible to design an on-line monitoring system for robotic GMA welding.
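    The paper above builds its square-law detector on a nonlinear modification of an ARX model of the welding process; that modification is not specified in the abstract. The sketch below shows only the generic building block: a linear ARX(2,2) model fitted by least squares on normal-condition data, with the windowed prediction-error energy then used as a simple anomaly score. Orders, thresholds and the toy data are arbitrary assumptions.

```python
import numpy as np

NA, NB = 2, 2   # ARX orders used throughout this toy example

def regressor(y, u, n):
    """Regressor [y[n-1], y[n-2], u[n-1], u[n-2]] for the ARX(2,2) model."""
    return np.concatenate([y[n - NA:n][::-1], u[n - NB:n][::-1]])

def fit_arx(y, u):
    """Least-squares fit of y[n] = a1*y[n-1] + a2*y[n-2] + b1*u[n-1] + b2*u[n-2]."""
    start = max(NA, NB)
    phi = np.array([regressor(y, u, n) for n in range(start, len(y))])
    theta, *_ = np.linalg.lstsq(phi, y[start:], rcond=None)
    return theta

def residual_energy(y, u, theta, win=500):
    """Mean squared one-step prediction error per window; a jump hints at abnormal welding."""
    start = max(NA, NB)
    pred = np.array([regressor(y, u, n) @ theta for n in range(start, len(y))])
    err = y[start:] - pred
    return [float(np.mean(err[i:i + win] ** 2)) for i in range(0, len(err) - win + 1, win)]

rng = np.random.default_rng(6)
u = rng.standard_normal(6000)                       # toy weld-current input
y = np.zeros(6000)
for n in range(2, 6000):                            # toy "normal" voltage dynamics
    y[n] = 0.6 * y[n - 1] - 0.2 * y[n - 2] + 0.5 * u[n - 1] + 0.05 * rng.standard_normal()
theta = fit_arx(y[:3000], u[:3000])                 # fit on the first half only
print("residual energy per window:", np.round(residual_energy(y, u, theta), 4)[:4])
```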

  • 28. Adolfsson, Stefan
    et al.
    Ericson, Klas
    Gustavsson, Jan-Olof
    Ågren, Björn
    Quality Monitoring for Pulsed Arc Welding, 1994. Conference paper (Refereed)
  • 29. Adolfsson, Stefan
    et al.
    Ericson, Klas
    Ågren, Björn
    On Automatic Detection of Burn-through in GMA Welding: Weld Voltage Analysis, 1995. Report (Other academic)
  • 30.
    Advaita, Advaita
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Gali, Mani Meghala
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Performance Analysis of a MIMO Cognitive Cooperative Radio Network with Multiple AF Relays, 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    With the rapid growth of wireless communications, the demand for the various multimedia services is increasing day by day leading to a deficit in the frequency spectrum resources. To overcome this problem, the concept of cognitive radio technology has been proposed which allows the unlicensed secondary user (SU) to access the licensed spectrum of the primary user (PU), thus improving the spectrum utilization. Cooperative communications is another emerging technology which is capable of overcoming many limitations in wireless systems by increasing reliability and coverage. The transmit and receive diversity techniques such as orthogonal space–time block codes (OSTBCs) and selection combining (SC) in multiple-input multiple-output (MIMO) cognitive amplify and forward relay networks help to reduce the effects of fading, increase reliability and extend radio coverage.

     

    In this thesis, we consider a MIMO cognitive cooperative radio network (CCRN) with multiple relays. The protocol used at the relays is an amplify and forward protocol. At the receiver, the SC technique is applied to combine the signals. Analytical expressions for the probability density function (PDF) and cumulative distribution function (CDF) of the signal-to-noise ratio (SNR) are derived. On this basis, the performance in terms of outage probability is obtained. Mathematica has been used to generate numerical results from the analytical expressions. The system model is simulated in MATLAB to verify the numerical results. The performance analysis of the system model is hence done in terms of outage probability.

    Download full text (pdf)
    BTH2016Advaita
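    The analytical PDF/CDF derivations in the thesis above are not reproduced here. As a hedged stand-in for the kind of simulation cross-check usually done in MATLAB, the sketch below estimates the outage probability of a toy cooperative link by Monte Carlo: Rayleigh-faded SNRs for the direct path and several amplify-and-forward relay paths, best-relay selection, and selection combining at the receiver. The relay SNR uses the common min(hop1, hop2) approximation, and all average SNRs are example values.

```python
import numpy as np

def outage_probability(avg_snr_db=10.0, n_relays=3, gamma_th_db=3.0, trials=200_000, seed=7):
    """Monte Carlo outage probability for a toy AF cooperative link with selection combining."""
    rng = np.random.default_rng(seed)
    avg = 10 ** (avg_snr_db / 10)
    gamma_th = 10 ** (gamma_th_db / 10)

    direct = rng.exponential(avg, trials)            # Rayleigh fading -> exponential SNR
    hop1 = rng.exponential(avg, (trials, n_relays))
    hop2 = rng.exponential(avg, (trials, n_relays))
    relay = np.minimum(hop1, hop2)                   # common approximation of AF end-to-end SNR
    best_relay = relay.max(axis=1)                   # opportunistic relay selection
    combined = np.maximum(direct, best_relay)        # selection combining at the receiver
    return np.mean(combined < gamma_th)

for snr in (0, 5, 10, 15):
    print(f"avg SNR {snr:2d} dB -> outage ~ {outage_probability(avg_snr_db=snr):.4f}")
```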
  • 31.
    Advaita, Advaita
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Gali, Mani Meghala
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Chu, Thi My Chinh
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Zepernick, Hans-Juergen
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies. Blekinge Inst Technol, SE-37179 Karlskrona, Sweden..
    Outage Probability of MIMO Cognitive Cooperative Radio Networks with Multiple AF Relays Using Orthogonal Space-Time Block Codes, 2017. In: 2017 IEEE 13TH INTERNATIONAL CONFERENCE ON WIRELESS AND MOBILE COMPUTING, NETWORKING AND COMMUNICATIONS (WIMOB), IEEE, 2017, p. 84-89. Conference paper (Refereed)
    Abstract [en]

    In this paper, we analyze the outage probability of multiple-input multiple-output cognitive cooperative radio networks (CCRNs) with multiple opportunistic amplify-and-forward relays. The CCRN applies underlay spectrum access, accounting for the interference power constraint of a primary network, and utilizes orthogonal space-time block coding to transmit multiple data streams across a number of antennas over several time slots. As such, the system exploits both time and space diversity to improve the transmission reliability over Nakagami-m fading. The CCRN applies opportunistic relaying in which the relay offering the highest signal-to-noise ratio at the receiver is selected to forward the transmit signal. Furthermore, selection combining is adopted at the secondary receiver to process the signals from the direct and relaying transmissions. To evaluate system performance, we derive an expression for the outage probability which is valid for an arbitrary number of antennas at the source, relays, and receiver of the CCRN. Selected numerical results are provided, using Mathematica for analysis and Matlab for simulations, to reveal the effect of network parameters on the outage probability of the system.

  • 32.
    Aeddula, Omsri
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Flyborg, Johan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Larsson, Tobias
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Anderberg, Peter
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Sanmartin Berglund, Johan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Renvert, Stefan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health. Kristianstad University, SWE.
    A Solution with Bluetooth Low Energy Technology to Support Oral Healthcare Decisions for improving Oral Hygiene, 2021. In: ACM International Conference Proceeding Series, Association for Computing Machinery (ACM), 2021, Vol. 1, p. 134-139. Conference paper (Refereed)
    Abstract [en]

    The advent of powered toothbrushes and associated mobile health applications provides an opportunity to collect and monitor data; however, collecting reliable and standardized data from large populations has been associated with effort from the participants and researchers. Finding a way to collect data autonomously and without the need for cooperation imparts the potential to build large knowledge banks. A solution with Bluetooth low energy technology is designed to pair a powered toothbrush with a single-core processor to collect raw data in a real-time scenario, eliminating the manual transfer of powered toothbrush data with mobile health applications. Associating a powered toothbrush with a single-core processor is believed to provide reliable and comprehensible data on toothbrush use, and such habits can guide improved individual advice and general plans on oral hygiene measures that can lead to improved oral health. The method makes a case for an expanded opportunity to design assistive functions that preserve or improve factors that influence oral health in individuals with mild cognitive impairment. The proposed framework assists with determining various parameters, which makes it adaptable and feasible to implement in various oral care contexts.

    Download full text (pdf)
    ICMHI-OKA
  • 33.
    Aeddula, Omsri
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Gertsovich, Irina
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mathematics and Natural Sciences.
    Image-Based Localization System, 2020. In: Proceedings of the 8th ICIECE 2019, Springer, 2020, Vol. 107, p. 535-541. Conference paper (Refereed)
    Abstract [en]

    The position of a vehicle is essential for navigation of the vehicle along the desired path without human interference. A good positioning system should have both good positioning accuracy and reliability. A Global Positioning System (GPS) employed for navigation in a vehicle may lose significant power due to signal attenuation caused by buildings or other obstacles. In this paper, a novel real-time indoor positioning system using a static camera is presented. The proposed positioning system exploits gradient information evaluated on the camera video stream to recognize the contours of the vehicle. Subsequently, the mass center of the vehicle contour is used for simultaneous localization of the vehicle. This solution minimizes the design and computational complexity of the positioning system. The experimental evaluation of the proposed approach has demonstrated a positioning accuracy of 92.26%. © Springer Nature Singapore Pte Ltd. 2020.
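    The entry above localizes a vehicle from a static camera by finding its contour and taking the contour's mass center. The sketch below is a generic OpenCV illustration of that step on a single frame; simple thresholding stands in for whatever gradient-based segmentation the paper uses, so this is not the authors' pipeline.

```python
import cv2
import numpy as np

def contour_centroid(frame_gray):
    """Return the (x, y) mass center of the largest contour in a grayscale frame, or None."""
    # Simple binarization, used here only as a stand-in for the paper's gradient step.
    _, mask = cv2.threshold(frame_gray, 127, 255, cv2.THRESH_BINARY)
    # [-2] keeps compatibility with both the OpenCV 3 and OpenCV 4 return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # centroid from image moments

# Synthetic frame: a bright 40x60 "vehicle" on a dark floor.
frame = np.zeros((240, 320), dtype=np.uint8)
frame[100:140, 150:210] = 255
print("estimated position:", contour_centroid(frame))    # roughly (179.5, 119.5)
```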

  • 34.
    Afaq, Muhammad
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Faheem, Sahibzada Muhammad
    Blekinge Institute of Technology, School of Engineering.
    Performance Analysis of Selected Cooperative Relaying Techniques, 2010. Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Recently, cooperative communication has gained significant interest due to the fact that it exploits spatial diversity and provides capacity/performance gains over conventional single-input single-output (SISO) systems. A mobile node with a single antenna can cooperate with a nearby mobile node having a single antenna in a multi-user environment to create the effect of a virtual multiple antenna system, hence reducing the complexity associated with actual multiple antenna systems. Despite its small size and power constraints, a mobile node can still benefit from spatial diversity by employing cooperation, thus saving transmission power and increasing the coverage range of the network. In this thesis, selected relaying protocols, namely amplify-and-forward, decode-and-forward, detect-and-forward, and selective detect-and-forward, are studied and implemented for two different relaying geometries, i.e. equidistant and collinear. Results are studied and compared with each other to show the performance of each protocol in terms of average symbol error probabilities. The considered system model has three nodes, i.e. source, relay and destination. Communicating nodes are considered to be half-duplex with a single antenna for transmission and reception. The source, when communicating with the destination, broadcasts the information, which is heard by the nearby relay. The relay then uses one of the cooperation protocols. Finally, the relayed signal reaches the destination, where it is detected by a maximal ratio combiner (MRC) and combined with the direct transmission for possible diversity gains. The transmission path, or channel, is modeled as frequency non-selective Rayleigh fading in the presence of additive white Gaussian noise (AWGN). The effect of path loss has been observed on cooperation for the collinear arrangement with exponential decay up to four. Considering the equidistant arrangement, decode-and-forward shows good performance at high signal-to-noise ratio (SNR), while amplify-and-forward is very promising at very low SNR. A selective relaying scheme called selective detect-and-forward is also presented, which outperforms its fixed counterparts over a wide range of SNR.

    Download full text (pdf)
    FULLTEXT01
  • 35.
    Aftab, Adnan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mufti, Muhammad Nabeel
    Blekinge Institute of Technology, School of Computing.
    Spectrum sensing through implementation of USRP22011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Scarcity of the wireless spectrum has led to the development of new techniques for its better utilization. The demand for high data rates and better voice quality is driving the development of new wireless standards, making the wireless spectrum more limited than ever. In this era of wireless communication, service providers and telecom operators face a dilemma: they need a large share of the wireless spectrum to meet the ever-increasing quality-of-service requirements of consumers. This has led to the development of spectrum sensing techniques to find the unused spectrum in the available frequency band. The results presented in this thesis help develop a clear understanding of spectrum sensing techniques and provide a comparison of different spectrum sensing approaches. The experiments carried out using the USRP2 and GNU Radio help the reader understand the concept of underutilized frequency bands and their importance for cognitive radios.
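
    The simplest spectrum-sensing approach compared in such studies is energy detection; the sketch below illustrates it in plain NumPy under assumed threshold and signal values, without reproducing the USRP2/GNU Radio capture chain used in the thesis.

        # Sketch of energy-detection spectrum sensing in plain NumPy; the actual
        # capture is done with USRP2/GNU Radio in the thesis and is not reproduced
        # here. Threshold, sample count and signal amplitude are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        N, noise_var = 1024, 1.0

        def energy_detect(samples, threshold):
            """Declare the band occupied if average sample energy exceeds threshold."""
            return np.mean(np.abs(samples) ** 2) > threshold

        noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        tone = 0.7 * np.exp(2j * np.pi * 0.1 * np.arange(N))   # a primary-user signal

        threshold = 1.3 * noise_var          # set above the expected noise floor
        print("idle band occupied?", energy_detect(noise, threshold))          # expect False
        print("busy band occupied?", energy_detect(noise + tone, threshold))   # expect True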

    Download full text (pdf)
    FULLTEXT01
  • 36.
    Ahmad, Nadeem
    et al.
    Blekinge Institute of Technology, School of Computing.
    Habib, M. Kashif
    Blekinge Institute of Technology, School of Computing.
    Analysis of Network Security Threats and Vulnerabilities by Development & Implementation of a Security Network Monitoring Solution2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Communication of confidential data over the internet is becoming more frequent every day, as individuals and organizations send their confidential data electronically. It is also common for hackers to target these networks. In current times, protecting data, software and hardware from viruses is, now more than ever, a need and not just a concern. What do you need to know about networks these days? How is security implemented to protect a network? How is security managed? In this paper we address these questions and give an idea of where we currently stand with network security.

    Download full text (pdf)
    FULLTEXT01
  • 37.
    Ahmad, Naseer
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Security Issues in Wireless Systems2009Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Wireless communication is one of the fields of telecommunications that is growing at a tremendous speed. With the passage of time, wireless communication devices are becoming more and more common. It is no longer only a technology for business; people now use it to perform their daily tasks, be it calling, shopping, checking their e-mail or transferring money. Wireless communication devices include cellular phones, cordless phones, satellite phones, smart phones and Personal Digital Assistants (PDAs), and two-way pagers, and many more devices are on their way to improve this wireless world. In order to establish two-way communication, a wireless link may use radio waves or infrared light. Wireless communication technologies have become increasingly popular in our everyday life. Hand-held devices like PDAs allow users to access calendars, mail, address and phone number lists, and the internet. PDAs and smart phones can store large amounts of data and connect to a broad spectrum of networks, making them as important and sensitive computing platforms as laptop PCs when it comes to an organization's security plan. Today's mobile devices offer many benefits to enterprises. Mobile phones, hand-held computers and other wireless systems are becoming a tempting target for virus writers. Mobile devices are the new frontier for viruses, spam and other potential security threats, and viruses, Trojans and worms that exploit their vulnerabilities have already been created. With an increasing amount of information being sent through wireless channels, new threats are opening up. Viruses have been growing fast as handsets increasingly resemble small computers that connect with each other and the internet. Hackers have also discovered that many corporate wireless local area networks (WLANs) in major cities were not properly secured. Mobile phone operators say that it is only a matter of time before the wireless world is hit by the same sorts of viruses and worms that attack computer software.

    Download full text (pdf)
    FULLTEXT01
  • 38.
    Ahmad, Waqas
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Aslam, Muhammad Kashif
    Blekinge Institute of Technology, School of Engineering.
    An investigation of Routing Protocols in Wireless Mesh Networks (WMNs) under certain Parameters2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Wireless Mesh Networks (WMNs) are bringing revolutionary change to the field of wireless networking. They are a trustworthy technology in applications like broadband home networking, network management and the latest transportation systems. WMNs consist of mesh routers, mesh clients and gateways, and are a special kind of wireless ad-hoc network. One of the issues in WMNs is resource management, which includes routing; for routing, particular protocols give better performance when evaluated against certain parameters such as delay, throughput and network load. There are two types of routing protocols, i.e. reactive and proactive protocols. Three routing protocols, AODV, DSR and OLSR, have been tested in WMNs against the parameters delay, throughput and network load. The testing of these protocols is performed in OPNET Modeler 14.5, and the obtained results are presented in this thesis in the form of graphs. This thesis helps validate which routing protocol gives the best performance under the assumed conditions. Moreover, this report supports further research in this area and helps generate new ideas that will enhance and bring new features to WMNs.

    Download full text (pdf)
    FULLTEXT01
  • 39.
    Ahmad, Zunnurain
    Blekinge Institute of Technology, School of Engineering.
    Design and Implementation of Quasi Planar K-Band Array Antenna Based on Travelling Wave Structures2013Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Designing antenna arrays based on travelling-wave structures has been studied extensively over the past couple of decades, and literature on several array topologies is available. It remains an active area of research, as constant improvement of antenna arrays is desired for the different communication systems being developed. Slotted waveguide antennas are one form of travelling-wave structure and are adopted in this study due to the several advantages they offer over other planar array structures. Waveguide slots have been used for decades as radiating elements, and several design studies have been carried out on the use of slots with different orientations and geometries, cascading them together to form array antennas. Waveguide antennas are preferred because they provide low losses in the feed structure and also offer good radiation characteristics. This study provides a design procedure for implementing a circularly polarized planar antenna array based on slotted waveguide structures. The antenna is designed to work in the 19.7 - 20.2 GHz range, which is an operating frequency range for the satellite downlink.
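
    For orientation, the guide wavelength that governs slot spacing in a resonant slotted-waveguide array can be estimated as below; the WR-42 broad-wall width and the half-guide-wavelength spacing rule are textbook assumptions for illustration, not the dimensions chosen in this thesis.

        # Back-of-the-envelope helper (assumptions, not the thesis design): TE10
        # guide wavelength and the usual lambda_g/2 slot spacing for a resonant
        # slotted-waveguide array near the 19.7-20.2 GHz band.
        # WR-42 broad-wall width (10.668 mm) is assumed as the housing waveguide.
        import math

        c = 299_792_458.0            # speed of light, m/s
        a = 10.668e-3                # WR-42 broad-wall width, m (assumption)
        f = 19.95e9                  # band-centre frequency, Hz

        f_c = c / (2 * a)            # TE10 cutoff frequency
        lam0 = c / f                 # free-space wavelength
        lam_g = lam0 / math.sqrt(1 - (f_c / f) ** 2)   # guide wavelength

        print(f"TE10 cutoff       : {f_c / 1e9:.2f} GHz")
        print(f"guide wavelength  : {lam_g * 1e3:.2f} mm")
        print(f"slot spacing lg/2 : {lam_g / 2 * 1e3:.2f} mm")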

    Download full text (pdf)
    FULLTEXT01
  • 40.
    Ahmadi Mehri, Vida
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    An Investigation of CPU utilization relationship between host and guests in a Cloud infrastructure2015Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Cloud computing has stood out as a revolution in the IT world in recent years. This technology facilitates resource sharing by reducing hardware costs for business users, and promises energy efficiency and better resource utilization to service providers. CPU utilization is a key metric considered in resource management across clouds.

    The main goal of this thesis is to investigate CPU utilization behavior with regard to host and guest, which would help us understand the relationship between them. It is expected that understanding these relationships will be helpful in resource management.

    Working towards this goal, the methodology we adopted is experimental research. This involves experimental modeling, measurements and observations of the results. The experimental setup covers several complex scenarios, including a cloud and a standalone virtualization system. The results are further analyzed for a visual correlation.

    The results show that CPU utilization in the cloud and virtualization scenarios coincides. More experimental scenarios were designed based on the first observations; the results obtained show irregular behavior between the PM and the VM under variable workload.

    CPU utilization retrieved from the cloud and from a standalone system is similar. The 100% workload situations showed that CPU utilization is constant, with no correlation coefficient obtainable. Lower workloads showed more or less correlation in most of the cases in our correlation analysis. It is expected that a larger number of iterations could change the results. Further analysis of these relationships for proper resource management techniques will be considered.
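
    A minimal sketch of the kind of correlation check described above, using made-up host (PM) and guest (VM) utilisation samples; the actual measurement tooling and workloads of the thesis are not reproduced.

        # Minimal sketch (not the thesis tooling): compare host (PM) and guest (VM)
        # CPU-utilisation samples collected over the same interval. The sample
        # values are invented for illustration; in practice they would come from
        # monitoring tools on each side.
        import numpy as np

        pm = np.array([12.0, 25.4, 40.2, 61.8, 80.5, 95.1, 97.0, 55.3])  # host %
        vm = np.array([10.5, 24.0, 38.9, 60.1, 79.7, 99.0, 98.5, 52.8])  # guest %

        r = np.corrcoef(pm, vm)[0, 1]            # Pearson correlation coefficient
        print(f"PM/VM CPU-utilisation correlation: r = {r:.3f}")

        # At a constant 100 % load both series are (nearly) constant, the variance
        # goes to zero and the coefficient is undefined -- consistent with the
        # observation that no coefficient is obtained for the 100 % workload case.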

    Download full text (pdf)
    fulltext
  • 41.
    Ahmadi Mehri, Vida
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Towards Secure Collaborative AI Service Chains2019Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    At present, Artificial Intelligence (AI) systems have been adopted in many different domains, such as healthcare, robotics, automotive, telecommunication systems, security, and finance, to integrate intelligence into their services and applications. Intelligent personal assistants such as Siri and Alexa are examples of AI systems making an impact on our daily lives. Since many AI systems are data-driven, they require large volumes of data for training and validation, advanced algorithms, and computing power and storage in their development process. Collaboration in the AI development process (the AI engineering process) reduces the cost and time of bringing AI applications to market. However, collaboration introduces concerns about privacy and the piracy of intellectual property, which can be caused by the actors who collaborate in the engineering process. This work investigates the non-functional requirements, such as privacy and security, for enabling collaboration in AI service chains. It proposes an architectural design approach for collaborative AI engineering and explores the concept of the pipeline (service chain) for chaining AI functions. In order to enable controlled collaboration between AI artefacts in a pipeline, this work makes use of virtualisation technology to define and implement Virtual Premises (VPs), which act as protection wrappers for AI pipelines. A VP is a virtual policy enforcement point for a pipeline and requires access permission and authenticity for each element in a pipeline before the pipeline can be used. Furthermore, the proposed architecture is evaluated with a use-case approach that enables quick detection of design flaws during the initial stage of implementation. To evaluate the security level and compliance with the security requirements, threat modeling was used to identify potential threats and vulnerabilities of the system and to analyse their possible effects. The output of the threat modeling was used to define countermeasures to threats related to unauthorised access and execution of AI artefacts.

    Download full text (pdf)
    fulltext
  • 42.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Automated Context-Aware Vulnerability Risk Management for Patch Prioritization2022In: Electronics, E-ISSN 2079-9292, Vol. 11, no 21, article id 3580Article in journal (Refereed)
    Abstract [en]

    The information-security landscape continuously evolves through the daily discovery of new vulnerabilities and sophisticated exploit tools. Vulnerability risk management (VRM) is the most crucial cyber defense for eliminating attack surfaces in IT environments. VRM is a cyclical practice of identifying, classifying, evaluating, and remediating vulnerabilities. The evaluation stage of VRM is neither automated nor cost-effective, as it demands great manual administrative effort to prioritize patches. Therefore, there is an urgent need to improve the VRM procedure by automating the entire VRM cycle in the context of a given organization. The authors propose automated context-aware VRM (ACVRM) to address the above challenges. This study defines the criteria to consider in the evaluation stage of ACVRM to prioritize patching. Moreover, patch prioritization is customized to an organization's context by allowing the organization to select the vulnerability management mode and weight the selected criteria. Specifically, this study considers four vulnerability evaluation cases: (i) evaluation criteria are weighted homogeneously; (ii) attack complexity and availability are not considered important criteria; (iii) the security score is the only important criterion considered; and (iv) criteria are weighted based on the organization's risk appetite. The results verify the proposed solution's efficiency compared with the Rudder vulnerability management tool (CVE-plugin). While Rudder produces a ranking independent of the scenario, ACVRM can sort vulnerabilities according to the organization's criteria and context. Moreover, while Rudder randomly sorts vulnerabilities with the same patch score, ACVRM sorts them according to their age, giving a higher security score to older publicly known vulnerabilities. © 2022 by the authors.
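
    A toy sketch of the prioritisation idea only: a weighted score over organisation-selected criteria with ties broken by vulnerability age. The criteria names, weights and CVE entries are invented for illustration and are not taken from the paper.

        # Illustration of weighted, context-aware patch prioritisation with an
        # age-based tie-break; all names, weights and entries are assumptions.
        from dataclasses import dataclass

        @dataclass
        class Vuln:
            cve_id: str
            criteria: dict      # criterion name -> normalised value in [0, 1]
            age_days: int       # days since public disclosure

        def priority(v: Vuln, weights: dict) -> float:
            """Weighted sum of the organisation-selected evaluation criteria."""
            return sum(weights[k] * v.criteria.get(k, 0.0) for k in weights)

        weights = {"severity": 0.5, "attack_complexity": 0.2, "availability": 0.3}
        vulns = [
            Vuln("CVE-A", {"severity": 0.9, "attack_complexity": 0.4, "availability": 0.8}, 400),
            Vuln("CVE-B", {"severity": 0.9, "attack_complexity": 0.4, "availability": 0.8}, 30),
            Vuln("CVE-C", {"severity": 0.5, "attack_complexity": 0.9, "availability": 0.2}, 900),
        ]

        # Sort by score (descending); equal scores fall back to age (older first).
        ranked = sorted(vulns, key=lambda v: (-priority(v, weights), -v.age_days))
        for v in ranked:
            print(v.cve_id, round(priority(v, weights), 2), f"{v.age_days} days old")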

    Download full text (pdf)
    fulltext
  • 43.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Normalization Framework for Vulnerability Risk Management in Cloud2021In: Proceedings - 2021 International Conference on Future Internet of Things and Cloud, FiCloud 2021, IEEE, 2021, p. 99-106Conference paper (Refereed)
    Abstract [en]

    Vulnerability Risk Management (VRM) is a critical element in cloud security that directly impacts cloud providers’ security assurance levels. Today, VRM is a challenging process because of the dramatic increase of known vulnerabilities (+26% in the last five years), and because it is even more dependent on the organization’s context. Moreover, the vulnerability’s severity score depends on the Vulnerability Database (VD) selected as a reference in VRM. All these factors introduce a new challenge for security specialists in evaluating and patching the vulnerabilities. This study provides a framework to improve the classification and evaluation phases in vulnerability risk management while using multiple vulnerability databases as a reference. Our solution normalizes the severity score of each vulnerability based on the selected security assurance level. The results of our study highlighted the role of the vulnerability databases in patch prioritization, showing the advantage of using multiple VDs.
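
    A generic sketch of cross-database score normalisation; the databases, scales, assurance-level weights and fusion-by-averaging step below are assumptions for illustration and do not reproduce the framework defined in the paper.

        # Generic illustration: rescale severity scores reported on different
        # database-specific scales onto a common 0-10 scale, fuse them, and apply
        # an assumed assurance-level weighting. All values are made up.
        def rescale(score, src_min, src_max, dst_min=0.0, dst_max=10.0):
            """Min-max rescale a severity score onto a common 0-10 scale."""
            return dst_min + (score - src_min) * (dst_max - dst_min) / (src_max - src_min)

        # Each database reports the same vulnerability on its own scale.
        reports = {
            "VD-1 (0-10 scale)":  (7.8, 0.0, 10.0),
            "VD-2 (0-100 scale)": (81.0, 0.0, 100.0),
            "VD-3 (1-5 scale)":   (4.0, 1.0, 5.0),
        }
        common = [rescale(s, lo, hi) for s, lo, hi in reports.values()]
        base = sum(common) / len(common)                 # fuse by simple averaging

        assurance_weight = {"low": 0.8, "medium": 1.0, "high": 1.2}  # assumed mapping
        level = "high"
        print(f"normalised severity ({level} assurance): {min(10.0, base * assurance_weight[level]):.2f}")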

    Download full text (pdf)
    fulltext
  • 44.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. City Network International AB, Sweden.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. Sapienza University of Rome, ITA.
    Normalization of Severity Rating for Automated Context-aware Vulnerability Risk Management2020In: Proceedings - 2020 IEEE International Conference on Autonomic Computing and Self-Organizing Systems Companion, ACSOS-C 2020, Institute of Electrical and Electronics Engineers (IEEE), 2020, p. 200-205, article id 9196350Conference paper (Refereed)
    Abstract [en]

    In the last three years, the unprecedented increase in discovered vulnerabilities ranked with critical and high severity raises new challenges in Vulnerability Risk Management (VRM). Indeed, identifying, analyzing and remediating this high rate of vulnerabilities is labour-intensive, especially for enterprises dealing with complex computing infrastructures such as Infrastructure-as-a-Service providers. Hence, there is a demand for new criteria to prioritize vulnerability remediation and for new automated/autonomic approaches to VRM.

    In this paper, we address the above challenge by proposing an Automated Context-aware Vulnerability Risk Management (ACVRM) methodology that aims to reduce the labour-intensive tasks of security experts and to prioritize vulnerability remediation on the basis of the organization's context rather than risk severity only. The proposed solution considers multiple vulnerability databases in order to achieve broad coverage of known vulnerabilities and to determine the vulnerability rank. After the description of the new VRM methodology, we focus on the problem of obtaining a single vulnerability score by normalization and fusion of the ranks obtained from multiple vulnerability databases. Our solution is a parametric normalization that accounts for organization needs/specifications.

    Download full text (pdf)
    fulltext
  • 45.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ilie, Dragos
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Designing a Secure IoT System Architecture from a Virtual Premise for a Collaborative AI Lab2019Conference paper (Refereed)
    Abstract [en]

    IoT systems are increasingly composed of flexible, programmable, virtualised, and arbitrarily chained IoT elements and services using portable code. Moreover, they might be sliced, i.e. allowing multiple logical IoT systems (network + application) to run on top of a shared physical network and compute infrastructure. However, designing and implementing security mechanisms for such IoT systems is particularly challenging since a) promising technologies are still maturing, and b) the relationships among the many requirements, technologies and components are difficult to model a priori.

    The aim of the paper is to define design cues for the security architecture and mechanisms of future, virtualised, arbitrarily chained, and eventually sliced IoT systems. Our focus is on the authorisation and authentication of users and hosts, and on code integrity, in these virtualised systems. The design cues are derived from the design and implementation of a secure virtual environment for distributed and collaborative AI system engineering using so-called AI pipelines. The pipelines apply chained virtual elements and services and facilitate the slicing of the system. The virtual environment is denoted for short as the virtual premise (VP). The use case of the VP for AI design provides insight into the complex interactions in the architecture, leading us to believe that the VP concept can be generalised to the IoT systems mentioned above. In addition, the use case permits us to derive, implement, and test solutions. This paper describes the flexible architecture of the VP and the design and implementation of access and execution control in virtual and containerised environments.

    Download full text (pdf)
    fulltext
  • 46.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ilie, Dragos
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Privacy and DRM Requirements for Collaborative Development of AI Application2019In: ACM International Conference Proceeding Series, Association for Computing Machinery (ACM), 2019, article id 3233268Conference paper (Refereed)
    Abstract [en]

    The use of data is essential for the capabilities of data-driven Artificial Intelligence (AI), Deep Learning and Big Data analysis techniques. This data usage, however, intrinsically raises concerns about data privacy. In addition, supporting collaborative development of AI applications across organisations has become a major need in AI system design. Digital Rights Management (DRM) is required to protect intellectual property in such collaboration. As a consequence of DRM, privacy threats and privacy-enforcing mechanisms interact with each other.

    This paper describes the privacy and DRM requirements in collaborative AI system design using AI pipelines. It describes the relationships between DRM and privacy and outlines the threats against these non-functional features. Finally, the paper provides a first security architecture to protect against the threats to DRM and privacy in collaborative AI design using AI pipelines.

    Download full text (pdf)
    fulltext
  • 47.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ilie, Dragos
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Towards Privacy Requirements for Collaborative Development of AI Applications2018In: 14th Swedish National Computer Networking Workshop (SNCNW), 2018Conference paper (Refereed)
    Abstract [en]

    The use of data is essential for the capabilities of data-driven Artificial Intelligence (AI), Deep Learning and Big Data analysis techniques. The use of data, however, intrinsically raises the concern of data privacy, in particular for the individuals who provide the data. Hence, data privacy is considered one of the main non-functional features of the Next Generation Internet. This paper describes the privacy challenges and requirements for collaborative AI application development. We investigate the constraints of using digital rights management for supporting collaboration to address the privacy requirements in the regulation.

    Download full text (pdf)
    fulltext
  • 48.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Flexible Privacy and High Trust in the Next Generation Internet: The Use Case of a Cloud-based Marketplace for AI2017Conference paper (Refereed)
    Abstract [en]

    Cloudified architectures facilitate resource access and sharing independently of physical locations. They permit high availability of resources at low operational costs. These advantages, however, do not come for free. End users might fear that they lose control over the location of their data and, thus, over their autonomy in deciding to whom the data is communicated. Thus, strong privacy and trust concerns arise for end users. In this work we review and investigate privacy and trust requirements for Cloud systems in general and for a cloud-based marketplace (CMP) for AI in particular. We investigate whether and how the current privacy and trust dimensions can be applied to Clouds and to the design of a CMP. We also propose the concept of a "virtual premise" for enabling "Privacy-by-Design" [1] in Clouds. The idea of a "virtual premise" might not be a universal solution for every privacy requirement. However, we expect that it provides flexibility in designing privacy in Clouds, thus leading to higher trust.

    Download full text (pdf)
    fulltext
  • 49.
    ahmed, amar
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Performance and Modeling of SIP Session Setup2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    During recent years, the transport of multimedia sessions, such as audio streams and video conferences, over IP has attracted a lot of attention, since most communication technologies are migrating to work over IP. However, sending media streams over IP networks has encountered some problems related to signaling. Ongoing research in this area has produced some solutions on this subject. The Internet Engineering Task Force (IETF) has introduced the Session Initiation Protocol (SIP), which has proved to be an efficient protocol for controlling sessions over IP. While a great deal of research has been performed on evaluating the performance of SIP and comparing it with competing protocols such as H.323, studying the delay caused by initiating the session has received less attention. In this document, we address the SIP session setup delay problem. In the lab, we built a test bed for running several SIP session scenarios. Using different models for those scenarios, we measured session setup delays for all of the models used. The analysis performed for each model showed that we could propose models to be applied to the SIP session setup delay components.
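
    A toy illustration of how a session setup delay can be computed from signalling timestamps; the timestamps and the INVITE-to-200-OK delay definition below are assumptions and may differ from the measurement points used in the thesis.

        # Toy example (not the thesis measurement code): compute per-call and
        # average SIP session setup delays from assumed signalling timestamps.
        # Each tuple holds the time an INVITE was sent and the time the final
        # 200 OK for that dialog was received (seconds); values are made up.
        from statistics import mean

        samples = [
            ("call-1", 0.000, 0.182),
            ("call-2", 5.000, 5.240),
            ("call-3", 9.500, 9.710),
        ]

        delays = [ok - invite for _, invite, ok in samples]   # per-call setup delay (s)
        print(f"mean SIP session setup delay: {mean(delays) * 1000:.1f} ms")
        print(f"max  SIP session setup delay: {max(delays) * 1000:.1f} ms")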

    Download full text (pdf)
    FULLTEXT01
  • 50.
    Ahmed, Ashraf AwadElkarim Widaa
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Makki, Ahmed Hamza Ibrahim
    Blekinge Institute of Technology, School of Engineering.
    Performance Evaluation of Uplink Multiple Access Techniques in LTE Mobile Communication System2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    User equipment (UE) nowadays is able to provide various internet applications and services that raise the demand for high-speed data transfer and Quality of Service (QoS). Accordingly, next-generation mobile communication systems driven by these demands are expected to provide higher data rates and better link quality than existing systems. Orthogonal Frequency Division Multiple Access (OFDMA) and Single Carrier Frequency Division Multiple Access (SC-FDMA) are strong multiple-access candidates for the uplink of International Mobile Telecommunications-Advanced (IMT-Advanced). These multiple-access techniques, in combination with other promising technologies such as multi-hop transmission and Multiple-Input-Multiple-Output (MIMO), will be utilized to reach the targeted IMT-Advanced system performance. In this thesis, OFDMA and SC-FDMA are adopted and studied for the uplink of Long Term Evolution (LTE). Two transmission scenarios are considered, namely single-hop transmission and relay-assisted transmission (two hops). In addition, a hybrid multiple-access technique that combines the advantages of OFDMA and SC-FDMA in terms of low Peak-to-Average Power Ratio (PAPR) and better link performance (in terms of Symbol Error Rate (SER)) is proposed for the relay-assisted transmission scenario. Simulation results show that the proposed hybrid technique achieves better end-to-end link performance than the pure SC-FDMA technique while maintaining the same PAPR in the access link. In addition, a lower PAPR is achieved compared to the OFDMA case, which is an important merit in uplink transmission due to the UE's power constraints (limited battery power).
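
    A small sketch comparing the PAPR of plain OFDMA with DFT-spread OFDM (SC-FDMA) for a single uplink symbol; the modulation, subcarrier counts and localized mapping are illustrative assumptions, not the thesis simulation parameters.

        # Sketch (not the thesis simulator) of the PAPR difference between OFDMA
        # and DFT-spread OFDM (SC-FDMA). QPSK, 64 occupied subcarriers in a
        # 256-point IFFT; all parameters are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(2)
        M, N = 64, 256                                   # occupied subcarriers, IFFT size

        def papr_db(x):
            p = np.abs(x) ** 2
            return 10 * np.log10(p.max() / p.mean())

        def one_symbol(spread: bool) -> float:
            bits = rng.integers(0, 2, size=(M, 2))
            qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
            data = np.fft.fft(qpsk) if spread else qpsk  # DFT-spreading = SC-FDMA
            grid = np.zeros(N, dtype=complex)
            grid[:M] = data                              # localized subcarrier mapping
            return papr_db(np.fft.ifft(grid))

        trials = 2000
        ofdma  = np.mean([one_symbol(spread=False) for _ in range(trials)])
        scfdma = np.mean([one_symbol(spread=True)  for _ in range(trials)])
        print(f"mean PAPR  OFDMA  : {ofdma:.2f} dB")
        print(f"mean PAPR  SC-FDMA: {scfdma:.2f} dB   (lower, as expected)")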

    Download full text (pdf)
    FULLTEXT01