Search results 251 - 300 of 17072
  • 251.
    Ahmad, Khalid
    et al.
    Blekinge Institute of Technology, School of Management.
    Azumah, Kenneth
    Blekinge Institute of Technology, School of Management.
    Employee Retention Strategies: the case of a patent firm in Australia (2012). Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Retaining employees is an important goal of every organization. This thesis explores the factors that can significantly impact employee retention in an organisation and attempts to relate the factors discovered to major theories such as the Employee Equity Model, Herzberg’s (Two-Factor) Theory and the Job Embeddedness Theory. The literature surveyed by this study mentions employee motivation, job satisfaction and job embeddedness as the main factors that influence employee retention rates. The study proposes that job embeddedness is a superior model that significantly explains employee retention. The sample for the study was 53 respondents out of 75, taken from a patent firm in Australia, a representative of the rapidly growing knowledge industry. The participants were contacted through private email and selected for the study by simple random sampling, performed by listing the employee names in a spreadsheet program. The survey questions were categorized under six major theories of employee retention, with each category having an average of five questions. The most significant theories emerging were compared and the theory best explaining employee retention was chosen. These were the Employee Equity Model, Herzberg’s (Two-Factor) Theory and the Job Embeddedness Theory, and the one that best explains employee retention was Herzberg’s (Two-Factor) Theory. This implies that, notwithstanding the age of the Two-Factor Theory, it is still significant for managing employee retention in today’s rapidly expanding service- and knowledge-based organizations.

    Download full text (pdf)
    FULLTEXT01
  • 252.
    Ahmad, Khelil
    et al.
    Blekinge Institute of Technology.
    Algalali, Majid
    Blekinge Institute of Technology.
    Närståendes upplevelser av att vårda en familjemedlem med Alzheimers sjukdom [Relatives' experiences of caring for a family member with Alzheimer's disease]: A general literature review (2022). Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits, Student thesis
    Abstract [sv]

    Background: The most common dementia disorder is Alzheimer's disease. Alzheimer's affects approximately 25,000 people annually in Sweden. The disease has various causes, but the main risk factors are high blood pressure, advanced age and heredity. The disease causes a marked cognitive deterioration that affects behaviour, judgement, interpretation, the individual's memory and motor functions. The disease affects not only the person who falls ill but also their relatives. As the ill person deteriorates with the progression of the disease, the demands on relatives increase. The nurse therefore has an essential role in supporting both the ill person and their relatives in order to preserve their health and well-being.

    Aim: To describe relatives' experiences of caring for a family member with Alzheimer's disease.

    Method: The study is a general literature review with a qualitative approach. The articles included in the study use a qualitative method. The selected articles were analysed using Friberg's (2017b) five-step analysis model to create a clear and comprehensible presentation of the content and results.

    Results: The analysis yielded three main themes and three sub-themes, derived from 11 scientific articles with a qualitative method. The articles clarify relatives' experiences and the complications that arise when caring for a family member with Alzheimer's disease. The results rest on three main themes: Experiencing a change of roles, Emotional experiences, and Experiencing a need for knowledge and support. Under the main theme Emotional experiences, three sub-themes emerged: Experiences of fear and worry about the future, Experience of grief, and Experience of isolation.

    Conclusion: The results make clear the need for the adapted support that relatives require when facing the life change that accompanies the change of roles. In connection with this life change, relatives describe experiences of grief, fear and worry about the future, as well as experiences of isolation when taking on the new role. Relatives emphasize the importance of knowledge and support in order to offer the ill family member good care. The nurse has a fundamental role in supporting relatives so that they can care for the ill family member. To increase the well-being of relatives and the ill family member, the nurse should offer adapted support, inform, and develop relatives' knowledge and opportunities.

    Download full text (pdf)
    fulltext
  • 253.
    Ahmad, Muhammad Zeeshan
    Blekinge Institute of Technology, School of Engineering.
    Comparative Analysis of Iptables and Shorewall (2012). Student thesis
    Abstract [en]

    The use of the internet has increased over the past years. Not all users have good intentions: some use the internet to gain access to unauthorized information. Although absolute security of information is not possible for any network connected to the Internet, firewalls make an important contribution to network security. A firewall is a barrier placed between the network and the outside world to prevent unwanted and potentially damaging intrusions into the network. This thesis compares the performance of two Linux packet-filtering firewalls, iptables and Shorewall. Firewall performance testing helps in selecting the right firewall for a given need, and it highlights the strengths and weaknesses of each firewall. Both firewalls were tested using identical parameters. During the experiments, the recommended benchmarking methodology for firewall performance testing described in RFC 3511 was taken into account. The comparison comprises experiments performed with different tools. To validate the effectiveness of the firewalls, several performance metrics are used: throughput, latency, connection establishment and teardown rate, HTTP transfer rate, and system resource consumption. The experimental results indicate that iptables performs worse than Shorewall in all the aspects taken into account. All the selected metrics show that large numbers of filtering rules have a negative impact on the performance of both firewalls; however, UDP throughput is not affected by the number of filtering rules. The experimental results also indicate that traffic sent with different packet sizes does not affect the performance of the firewalls.

    Download full text (pdf)
    FULLTEXT01
  • 254.
    Ahmad, Nadeem
    et al.
    Blekinge Institute of Technology, School of Computing.
    Habib, M. Kashif
    Blekinge Institute of Technology, School of Computing.
    Analysis of Network Security Threats and Vulnerabilities by Development & Implementation of a Security Network Monitoring Solution (2010). Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Communication of confidential data over the internet is becoming more frequent every day, as individuals and organizations send their confidential data electronically. It is also common for hackers to target these networks. In current times, protecting data, software and hardware from viruses is, now more than ever, a need and not just a concern. What do you need to know about networks these days? How is security implemented to protect a network? How is security managed? In this paper we address these questions and give an idea of where we now stand with network security.

    Download full text (pdf)
    FULLTEXT01
  • 255.
    Ahmad, Naseer
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Security Issues in Wireless Systems (2009). Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Wireless communication is one of the fields of telecommunications that is growing at tremendous speed. With the passage of time, wireless communication devices are becoming more and more common. It is no longer only a technology for business: people now use it to perform their daily tasks, be it calling, shopping, checking their email or transferring money. Wireless communication devices include cellular phones, cordless phones, satellite phones, smartphones such as Personal Digital Assistants (PDAs) and two-way pagers, and many more devices are on their way to improve this wireless world. To establish two-way communication, a wireless link may use radio waves or infrared light. Wireless communication technologies have become increasingly popular in our everyday life. Handheld devices such as PDAs allow users to access calendars, email, address and phone-number lists, and the internet. PDAs and smartphones can store large amounts of data and connect to a broad spectrum of networks, making them as important and sensitive computing platforms as laptop PCs when it comes to an organization’s security plan. Today’s mobile devices offer many benefits to enterprises, but mobile phones, handheld computers and other wireless systems are also becoming a tempting target for virus writers. Mobile devices are the new frontier for viruses, spam and other potential security threats; viruses, Trojans and worms have already been created that exploit their vulnerabilities. With an increasing amount of information being sent through wireless channels, new threats are opening up. Viruses have been spreading fast as handsets increasingly resemble small computers that connect with each other and the internet. Hackers have also discovered that many corporate wireless local area networks (WLANs) in major cities were not properly secured. Mobile phone operators say that it is only a matter of time before the wireless world is hit by the same sorts of viruses and worms that attack computer software.

    Download full text (pdf)
    FULLTEXT01
  • 256.
    Ahmad, Raheel
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    On the Scalability of Four Multi-Agent Architectures for Load Control Management in Intelligent Networks (2003). Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Paralleling the rapid advancement of network evolution is the need for advanced network traffic management and surveillance. The increasing number and variety of services offered by communication networks has fuelled the demand for optimized load management strategies. The problem of load control management in Intelligent Networks has been studied previously, and four multi-agent architectures have been proposed. The objective of this thesis is to investigate one quality attribute, namely the scalability, of the four multi-agent architectures. The focus of this research is to resize the network and study the performance of the different architectures in terms of load control management through different scalability attributes. The analysis is based on experimentation through simulations. The results reveal that different architectures exhibit different performance behaviours for various scalability attributes at different network sizes, and that there are trade-offs between scalability attributes as the network grows. The factors affecting network performance at different network settings have been observed. Based on the results of this study, it becomes easier to design similar networks for optimal performance by controlling the influencing factors and considering the trade-offs involved.

    Download full text (pdf)
    FULLTEXT01
  • 257.
    Ahmad, Saleem Zubair
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Analyzing Suitability of SysML for System Engineering Applications (2007). Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    During the last decade, UML has had to face several tricky challenges. For instance, as a single unified, general-purpose modeling language, it should offer simple and explicit semantics applicable to a wide range of domains. With the significant shift of focus from software to systems, the “software-centric” attitude of UML has been exposed, so there is a need for a domain-specific language that can address the problems of systems rather than software only; this is the motivation for SysML. In this thesis, SysML is evaluated to analyze its suitability for system engineering applications. Evaluation criteria are established, through which the appropriateness of SysML is observed over the system development life cycle. The study is conducted using a real-life case example, an automobile product. The results of the research not only provide insight into the SysML architecture but also offer an idea of SysML’s appropriateness for multidisciplinary product development.

    Download full text (pdf)
    FULLTEXT01
  • 258.
    Ahmad, Waqar
    et al.
    Blekinge Institute of Technology, School of Computing.
    Riaz, Asim
    Blekinge Institute of Technology, School of Computing.
    Predicting Friendship Levels in Online Social Networks (2010). Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Context: Online social networks such as Facebook, Twitter, and MySpace have become the preferred interaction, entertainment and socializing facilities on the Internet. However, these social network services also bring privacy issues into more limelight than ever. Several privacy leakage problems are highlighted in the literature, with a variety of suggested countermeasures; most of these measures add further complexity and management overhead for the user. One ignored aspect of the architecture of online social networks is that they do not offer any mechanism to calculate the strength of the relationship between individuals. This information is quite useful for identifying possible privacy threats. Objectives: In this study, we identify users’ privacy concerns and their satisfaction regarding the privacy control measures provided by online social networks. Furthermore, this study explores data mining techniques to predict the levels/intensity of friendship in online social networks, and proposes a technique to utilize predicted friendship levels for privacy preservation in a semi-automatic privacy framework. Methods: An online survey was conducted to analyze Facebook users’ concerns as well as their interaction behavior with their good friends. On the basis of the survey results, an experiment was performed to demonstrate the data mining phases in practice. Results: We found that users are concerned about protecting their private data. As a precautionary measure, they refrain from showing their private information on Facebook due to fear of privacy leakage; at the same time, individuals also perform some actions that they themselves feel are privacy vulnerabilities. This study further identifies that the importance of interaction type varies during communication, and it identifies “mutual friends” and “profile visits” as two non-interaction-based estimation metrics. Finally, this study found an excellent performance of the J48 and Naïve Bayes algorithms in classifying friendship levels. Conclusions: Users are not satisfied with the privacy measures provided by online social networks. We establish that online social networks should offer a privacy mechanism that does not require a lot of privacy control effort from the users. This study also concludes that factors such as current status and interaction type need to be considered alongside the interaction count method in order to improve its performance. Furthermore, data mining classification algorithms are well suited to the prediction of friendship levels.
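    The thesis classifies friendship levels with trained J48 and Naïve Bayes models; as a self-contained illustration of the underlying idea (interaction counts mapped to discrete friendship levels), a hand-weighted score can stand in for a trained classifier. All feature names, weights and thresholds below are hypothetical, not taken from the thesis.

```python
# Illustrative sketch only: a weighted interaction score bucketed into
# coarse friendship levels. Weights and cut-offs are made up.

def friendship_score(features, weights):
    """Weighted sum of interaction counts (missing features count as 0)."""
    return sum(weights[k] * features.get(k, 0) for k in weights)

def friendship_level(score):
    """Bucket a raw score into a discrete friendship level."""
    if score >= 50:
        return "close friend"
    if score >= 15:
        return "friend"
    return "acquaintance"

# "mutual_friends" and "profile_visits" echo the two non-interaction
# metrics the study highlights; the weights are hypothetical.
WEIGHTS = {"wall_posts": 2.0, "comments": 1.5,
           "mutual_friends": 0.5, "profile_visits": 1.0}

alice = {"wall_posts": 12, "comments": 10,
         "mutual_friends": 40, "profile_visits": 6}
print(friendship_level(friendship_score(alice, WEIGHTS)))  # -> close friend
```

    In the thesis itself, the mapping from features to levels is learned from survey data rather than fixed by hand.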

    Download full text (pdf)
    FULLTEXT01
  • 259.
    Ahmad, Waqas
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Aslam, Muhammad Kashif
    Blekinge Institute of Technology, School of Engineering.
    An investigation of Routing Protocols in Wireless Mesh Networks (WMNs) under certain Parameters (2009). Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Wireless Mesh Networks (WMNs) are bringing revolutionary change to the field of wireless networking. It is a trustworthy technology in applications like broadband home networking, network management and the latest transportation systems. WMNs consist of mesh routers, mesh clients and gateways, and are a special kind of wireless ad-hoc network. One of the issues in WMNs is resource management, which includes routing; for routing, particular protocols give better performance when checked against certain parameters, such as delay, throughput and network load. There are two types of routing protocols, i.e. reactive and proactive protocols. Three routing protocols, AODV, DSR and OLSR, have been tested in WMNs under the parameters delay, throughput and network load. The testing of these protocols was performed in the Optimized Network Evaluation Tool (OPNET) Modeler 14.5, and the obtained results are displayed in this thesis in the form of graphs. This thesis helps validate which routing protocol gives the best performance under the assumed conditions. Moreover, this thesis report will support further research in this area and help generate new ideas that will enhance and bring new features to WMNs.

    Download full text (pdf)
    FULLTEXT01
  • 260.
    Ahmad, Zunnurain
    Blekinge Institute of Technology, School of Engineering.
    Design and Implementation of Quasi Planar K-Band Array Antenna Based on Travelling Wave Structures (2013). Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Designing antenna arrays based on travelling-wave structures has been studied extensively during the past couple of decades, and literature on several array topologies exists. It has been an active area of research, as constant improvement of antenna arrays is desired for the different communication systems being developed. Slotted waveguide antennas are one form of travelling-wave structure and are adopted in this study due to the several advantages they offer over other planar array structures. Waveguide slots have been used as radiating elements for a couple of decades, and several design studies have been carried out on the use of slots of different orientation and geometry, cascading them together for use as array antennas. Waveguide antennas are preferred as they provide low losses in the feed structure and also offer good radiation characteristics. This study provides a design procedure for implementing a circularly polarized planar antenna array based on slotted waveguide structures. The antenna is designed to work in the 19.7 - 20.2 GHz range, which is the operating band for the satellite downlink.

    Download full text (pdf)
    FULLTEXT01
  • 261.
    Ahmadi Mehri, Vida
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    An Investigation of CPU utilization relationship between host and guests in a Cloud infrastructure (2015). Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Cloud computing stands as a revolution in IT world in recent years. This technology facilitates resource sharing by reducing hardware costs for business users and promises energy efficiency and better resource utilization to the service providers. CPU utilization is a key metric considered in resource management across clouds.

    The main goal of this thesis study is directed towards investigating CPU utilization behavior with regard to host and guest, which would help us in understanding the relationship between them. It is expected that perception of these relationships would be helpful in resource management.

    Working towards our goal, the methodology we adopted is experimental research. This involves experimental modeling, measurements and observations of the results. The experimental setup covers several complex scenarios, including a cloud and a standalone virtualization system. The results are further analyzed for a visual correlation.

    Results show that CPU utilization in the cloud and virtualization scenarios coincides. More experimental scenarios were designed based on the first observations. The obtained results show irregular behavior between PM and VM under variable workload.

    CPU utilization retrieved from the cloud and from a standalone system is similar. 100% workload situations showed that CPU utilization is constant, with no correlation coefficient obtainable. Lower workloads showed varying degrees of correlation in most of the cases in our correlation analysis. It is expected that a larger number of iterations could change the output. Further analysis of these relationships for proper resource management techniques will be considered.
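    The correlation analysis described above can be sketched as a plain Pearson coefficient over paired host (PM) and guest (VM) CPU-utilization samples. The sample values below are invented for illustration; the thesis's actual measurements are not reproduced here.

```python
# Pearson correlation between host and guest CPU samples; a constant
# series (e.g. a pegged 100% workload) has no defined coefficient,
# matching the observation above for full-load situations.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return None  # undefined for a constant series
    return cov / (sx * sy)

host  = [20.1, 35.4, 50.2, 64.9, 80.3]  # hypothetical PM samples (%)
guest = [18.9, 33.0, 48.7, 63.1, 79.0]  # hypothetical VM samples (%)
print(round(pearson(host, guest), 3))   # close to 1 for these samples
```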

    Download full text (pdf)
    fulltext
  • 262.
    Ahmadi Mehri, Vida
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Towards Automated Context-aware Vulnerability Risk Management (2023). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    The information security landscape continually evolves with increasing publicly known vulnerabilities (e.g., 25064 new vulnerabilities in 2022). Vulnerabilities play a prominent role in all types of security related attacks, including ransomware and data breaches. Vulnerability Risk Management (VRM) is an essential cyber defense mechanism to eliminate or reduce attack surfaces in information technology. VRM is a continuous procedure of identification, classification, evaluation, and remediation of vulnerabilities. The traditional VRM procedure is time-consuming as classification, evaluation, and remediation require skills and knowledge of specific computer systems, software, network, and security policies. Activities requiring human input slow down the VRM process, increasing the risk of exploiting a vulnerability.

    The thesis introduces the Automated Context-aware Vulnerability Risk Management (ACVRM) methodology to improve VRM procedures by automating the entire VRM cycle and reducing the procedure time and experts' intervention. ACVRM focuses on the challenging stages (i.e., classification, evaluation, and remediation) of VRM to support security experts in promptly prioritizing and patching the vulnerabilities. 

    The ACVRM concept is designed and implemented in a test environment as a proof of concept. The efficiency of patch prioritization by ACVRM is compared against a commercial vulnerability management tool (i.e., Rudder). ACVRM prioritizes vulnerabilities based on the patch score (i.e., a numeric representation of the vulnerability's characteristics and risk), historical data, and dependencies. The experiments indicate that ACVRM can rank the vulnerabilities in the organization's context by weighting the criteria used in the patch score calculation. Automated patch deployment is implemented with three use cases to investigate the impact of learning from historical events and dependencies on the patch success rate and on human intervention. Our findings show that ACVRM reduced the need for human action, increased the ratio of successfully patched vulnerabilities, and decreased the cycle time of the VRM process.
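    The patch-score idea can be sketched as a weighted combination of per-vulnerability criteria, with patches deployed in descending score order. This is a hypothetical illustration: the criteria names, weights and values below are invented and do not reproduce the thesis's actual formula.

```python
# Hypothetical patch-score sketch: organization-chosen weights combine
# criteria into one score; vulnerabilities are patched highest-first.

def patch_score(vuln, weights):
    return sum(weights[c] * vuln[c] for c in weights)

# Illustrative weights expressing one organization's risk appetite.
WEIGHTS = {"severity": 0.5, "exploitability": 0.3, "asset_criticality": 0.2}

vulns = [
    {"id": "CVE-A", "severity": 9.8, "exploitability": 0.9, "asset_criticality": 0.7},
    {"id": "CVE-B", "severity": 5.3, "exploitability": 0.2, "asset_criticality": 0.9},
    {"id": "CVE-C", "severity": 7.5, "exploitability": 0.8, "asset_criticality": 0.4},
]

ranked = sorted(vulns, key=lambda v: patch_score(v, WEIGHTS), reverse=True)
print([v["id"] for v in ranked])  # -> ['CVE-A', 'CVE-C', 'CVE-B']
```

    Changing the weights changes the ranking, which is the sense in which the prioritization is "context-aware".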

    Download full text (pdf)
    fulltext
  • 263.
    Ahmadi Mehri, Vida
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Towards Secure Collaborative AI Service Chains (2019). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    At present, Artificial Intelligence (AI) systems have been adopted in many different domains, such as healthcare, robotics, automotive, telecommunication systems, security, and finance, to integrate intelligence into their services and applications. Intelligent personal assistants such as Siri and Alexa are examples of AI systems making an impact on our daily lives. Since many AI systems are data-driven, they require large volumes of data for training and validation, advanced algorithms, and computing power and storage in their development process. Collaboration in the AI development process (the AI engineering process) reduces the cost and time-to-market of AI applications. However, collaboration introduces concerns about the privacy and piracy of intellectual property, which can be caused by the actors who collaborate in the engineering process. This work investigates the non-functional requirements, such as privacy and security, for enabling collaboration in AI service chains. It proposes an architectural design approach for collaborative AI engineering and explores the concept of the pipeline (service chain) for chaining AI functions. In order to enable controlled collaboration between AI artefacts in a pipeline, this work makes use of virtualisation technology to define and implement Virtual Premises (VPs), which act as protection wrappers for AI pipelines. A VP is a virtual policy enforcement point for a pipeline and requires access permission and authenticity for each element in a pipeline before the pipeline can be used. Furthermore, the proposed architecture is evaluated with a use-case approach that enables quick detection of design flaws during the initial stage of implementation. To evaluate the security level and compliance with security requirements, threat modeling was used to identify potential threats and vulnerabilities of the system and to analyse their possible effects. The output of the threat modeling was used to define countermeasures to threats related to unauthorised access to and execution of AI artefacts.

    Download full text (pdf)
    fulltext
  • 264.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Automated Context-Aware Vulnerability Risk Management for Patch Prioritization (2022). In: Electronics, E-ISSN 2079-9292, Vol. 11, no 21, article id 3580. Article in journal (Refereed)
    Abstract [en]

    The information-security landscape continuously evolves, with new vulnerabilities discovered daily and increasingly sophisticated exploit tools. Vulnerability risk management (VRM) is the most crucial cyber defense for eliminating attack surfaces in IT environments. VRM is a cyclical practice of identifying, classifying, evaluating, and remediating vulnerabilities. The evaluation stage of VRM is neither automated nor cost-effective, as it demands great manual administrative effort to prioritize patches. Therefore, there is an urgent need to improve the VRM procedure by automating the entire VRM cycle in the context of a given organization. The authors propose automated context-aware VRM (ACVRM) to address the above challenges. This study defines the criteria to consider in the evaluation stage of ACVRM to prioritize patching. Moreover, patch prioritization is customized to an organization’s context by allowing the organization to select the vulnerability management mode and to weight the selected criteria. Specifically, this study considers four vulnerability evaluation cases: (i) evaluation criteria are weighted homogeneously; (ii) attack complexity and availability are not considered important criteria; (iii) the security score is the only important criterion considered; and (iv) criteria are weighted based on the organization’s risk appetite. The results verify the proposed solution’s efficiency compared with the Rudder vulnerability management tool (CVE-plugin). While Rudder produces a ranking independent of the scenario, ACVRM can sort vulnerabilities according to the organization’s criteria and context. Moreover, while Rudder randomly sorts vulnerabilities with the same patch score, ACVRM sorts them according to their age, giving a higher security score to older publicly known vulnerabilities. © 2022 by the authors.
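    The tie-breaking behaviour described above can be sketched as a two-key sort: higher patch score first, and among equal scores, the older publicly known vulnerability first. The data and field names are illustrative, not taken from the article.

```python
# Sketch of age-based tie-breaking: where a tool might order equal-score
# vulnerabilities arbitrarily, this prefers the older disclosure.

def prioritize(vulns):
    # Sort by descending score, then descending age (days since disclosure).
    return sorted(vulns, key=lambda v: (-v["score"], -v["age_days"]))

vulns = [
    {"id": "CVE-X", "score": 7.0, "age_days": 30},
    {"id": "CVE-Y", "score": 7.0, "age_days": 400},
    {"id": "CVE-Z", "score": 9.1, "age_days": 10},
]
print([v["id"] for v in prioritize(vulns)])  # -> ['CVE-Z', 'CVE-Y', 'CVE-X']
```

    Because Python's sort is stable and the key is a tuple, the age only decides order between entries whose scores are exactly equal.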

    Download full text (pdf)
    fulltext
  • 265.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Sapienza University of Rome, Italy.
    Automated Patch Management: An Empirical Evaluation Study (2023). In: Proceedings of the 2023 IEEE International Conference on Cyber Security and Resilience, CSR 2023, IEEE, 2023, p. 321-328. Conference paper (Refereed)
    Abstract [en]

    Vulnerability patch management is one of IT organizations' most complex issues, due to the increasing number of publicly known vulnerabilities and explicit patch deadlines for compliance. Patch management requires human involvement in testing, deploying, and verifying the patch and its potential side effects. Hence, there is a need to automate the patch management procedure to keep patch deadlines with a limited number of available experts. This study proposed and implemented an automated patch management procedure to address the aforementioned challenges. The method also includes logic to automatically handle errors that might occur during patch deployment and verification. Moreover, the authors added an automated review step before patch deployment that adjusts the patch prioritization list if multiple cumulative patches or dependencies are detected. The results indicate that our method reduced the need for human intervention, increased the ratio of successfully patched vulnerabilities, and decreased the execution time of vulnerability risk management.
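    An automated deploy-and-verify loop with error handling, in the spirit of the procedure described above, can be sketched as follows. The `deploy` and `verify` callables are stand-ins; the paper's actual mechanism is not reproduced here.

```python
# Illustrative sketch: deploy a patch, verify it, retry on failure.
# Deployment errors are modelled as RuntimeError from the deploy step.

def patch_with_retry(deploy, verify, retries=2):
    """Return True once a deployment succeeds and verifies, else False."""
    for _ in range(retries + 1):
        try:
            deploy()
        except RuntimeError:
            continue  # deployment error: retry automatically
        if verify():
            return True
    return False

# Toy example: deployment fails once with a transient error, then succeeds.
state = {"tries": 0}
def flaky_deploy():
    state["tries"] += 1
    if state["tries"] == 1:
        raise RuntimeError("transient failure")
def verify_ok():
    return state["tries"] >= 2

print(patch_with_retry(flaky_deploy, verify_ok))  # -> True
```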

    Download full text (pdf)
    fulltext
  • 266.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Normalization Framework for Vulnerability Risk Management in Cloud2021In: Proceedings - 2021 International Conference on Future Internet of Things and Cloud, FiCloud 2021, IEEE, 2021, p. 99-106Conference paper (Refereed)
    Abstract [en]

    Vulnerability Risk Management (VRM) is a critical element in cloud security that directly impacts cloud providers’ security assurance levels. Today, VRM is a challenging process because of the dramatic increase in known vulnerabilities (+26% in the last five years), and because it is even more dependent on the organization’s context. Moreover, a vulnerability’s severity score depends on the Vulnerability Database (VD) selected as a reference in VRM. All these factors introduce a new challenge for security specialists in evaluating and patching vulnerabilities. This study provides a framework to improve the classification and evaluation phases of vulnerability risk management while using multiple vulnerability databases as a reference. Our solution normalizes the severity score of each vulnerability based on the selected security assurance level. The results of our study highlight the role of the vulnerability databases in patch prioritization, showing the advantage of using multiple VDs.

    Download full text (pdf)
    fulltext
  • 267.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. City Network International AB, Sweden.
    Arlos, Patrik
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Casalicchio, Emiliano
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. Sapienza University of Rome, ITA.
    Normalization of Severity Rating for Automated Context-aware Vulnerability Risk Management2020In: Proceedings - 2020 IEEE International Conference on Autonomic Computing and Self-Organizing Systems Companion, ACSOS-C 2020, Institute of Electrical and Electronics Engineers (IEEE), 2020, p. 200-205, article id 9196350Conference paper (Refereed)
    Abstract [en]

    In the last three years, the unprecedented increase in discovered vulnerabilities ranked with critical and high severity has raised new challenges in Vulnerability Risk Management (VRM). Indeed, identifying, analyzing and remediating this high rate of vulnerabilities is labour intensive, especially for enterprises dealing with complex computing infrastructures such as Infrastructure-as-a-Service providers. Hence there is a demand for new criteria to prioritize vulnerability remediation and for new automated/autonomic approaches to VRM.

    In this paper, we address the above challenge by proposing an Automated Context-aware Vulnerability Risk Management (ACVRM) methodology that aims to reduce the labour-intensive tasks of security experts and to prioritize vulnerability remediation on the basis of the organization's context rather than risk severity only. The proposed solution considers multiple vulnerability databases to achieve broad coverage of known vulnerabilities and to determine the vulnerability rank. After describing the new VRM methodology, we focus on the problem of obtaining a single vulnerability score by normalization and fusion of the ranks obtained from multiple vulnerability databases. Our solution is a parametric normalization that accounts for organization needs/specifications.

    Download full text (pdf)
    fulltext
  • 268.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ilie, Dragos
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Designing a Secure IoT System Architecture from a Virtual Premise for a Collaborative AI Lab2019Conference paper (Refereed)
    Abstract [en]

    IoT systems are increasingly composed of flexible, programmable, virtualised, and arbitrarily chained IoT elements and services using portable code. Moreover, they might be sliced, i.e. allowing multiple logical IoT systems (network + application) to run on top of a shared physical network and compute infrastructure. However, designing and implementing security mechanisms in particular for such IoT systems is challenging since a) promising technologies are still maturing, and b) the relationships among the many requirements, technologies and components are difficult to model a priori.

    The aim of the paper is to define design cues for the security architecture and mechanisms of future, virtualised, arbitrarily chained, and eventually sliced IoT systems. Our focus is on the authorisation and authentication of users, hosts, and code integrity in these virtualised systems. The design cues are derived from the design and implementation of a secure virtual environment for distributed and collaborative AI system engineering using so-called AI pipelines. The pipelines apply chained virtual elements and services and facilitate the slicing of the system. The virtual environment is denoted for short as the virtual premise (VP). The use case of the VP for AI design provides insight into the complex interactions in the architecture, leading us to believe that the VP concept can be generalised to the IoT systems mentioned above. In addition, the use case permits us to derive, implement, and test solutions. This paper describes the flexible architecture of the VP and the design and implementation of access and execution control in virtual and containerised environments.

    Download full text (pdf)
    fulltext
  • 269.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ilie, Dragos
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Privacy and DRM Requirements for Collaborative Development of AI Application2019In: ACM International Conference Proceeding Series, Association for Computing Machinery (ACM), 2019, article id 3233268Conference paper (Refereed)
    Abstract [en]

    The use of data is essential for the capabilities of data-driven Artificial Intelligence (AI), Deep Learning and Big Data analysis techniques. This data usage, however, intrinsically raises concerns about data privacy. In addition, supporting collaborative development of AI applications across organisations has become a major need in AI system design. Digital Rights Management (DRM) is required to protect intellectual property in such collaboration. As a consequence of DRM, privacy threats and privacy-enforcing mechanisms will interact with each other.

    This paper describes the privacy and DRM requirements in collaborative AI system design using AI pipelines. It describes the relationships between DRM and privacy and outlines the threats against these non-functional features. Finally, the paper provides a first security architecture to protect against the threats on DRM and privacy in collaborative AI design using AI pipelines.

    Download full text (pdf)
    fulltext
  • 270.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Ilie, Dragos
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Towards Privacy Requirements for Collaborative Development of AI Applications2018In: 14th Swedish National Computer Networking Workshop (SNCNW), 2018Conference paper (Refereed)
    Abstract [en]

    The use of data is essential for the capabilities of data-driven Artificial Intelligence (AI), Deep Learning and Big Data analysis techniques. The use of data, however, intrinsically raises concerns about data privacy, in particular for the individuals who provide the data. Hence, data privacy is considered one of the main non-functional features of the Next Generation Internet. This paper describes the privacy challenges and requirements for collaborative AI application development. We investigate the constraints of using digital rights management to support collaboration in addressing the privacy requirements in the regulation.

    Download full text (pdf)
    fulltext
  • 271.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Flexible Privacy and High Trust in the Next Generation Internet: The Use Case of a Cloud-based Marketplace for AI2017Conference paper (Refereed)
    Abstract [en]

    Cloudified architectures facilitate resource access and sharing independent of physical locations. They permit high availability of resources at low operational costs. These advantages, however, do not come for free. End users might fear that they lose control over the location of their data and, thus, over their autonomy in deciding to whom the data is communicated. Thus, strong privacy and trust concerns arise for end users. In this work we review and investigate privacy and trust requirements for Cloud systems in general and for a cloud-based marketplace (CMP) for AI in particular. We investigate whether and how the current privacy and trust dimensions can be applied to Clouds and to the design of a CMP. We also propose the concept of a "virtual premise" for enabling "Privacy-by-Design" [1] in Clouds. The idea of a "virtual premise" might not be a universal solution for every privacy requirement. However, we expect that it provides flexibility in designing privacy in Clouds, thus leading to higher trust.

    Download full text (pdf)
    fulltext
  • 272.
    Ahmadi Mehri, Vida
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Tutschku, Kurt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Privacy and trust in cloud-based marketplaces for AI and data resources2017In: IFIP Advances in Information and Communication Technology, Springer New York LLC , 2017, Vol. 505, p. 223-225Conference paper (Refereed)
    Abstract [en]

    The processing of the huge amounts of information from the Internet of Things (IoT) has become challenging. Artificial Intelligence (AI) techniques have been developed to handle this task efficiently. However, they require annotated data sets for training, while manual preprocessing of the data sets is costly. The H2020 project “Bonseyes” has suggested a “Market Place for AI” (MP), where stakeholders can engage trustfully in business around AI resources and data sets. For the sake of generality, the MP permits trading of resources that have high privacy requirements (e.g. data sets containing patient medical information) as well as ones with low requirements (e.g. fuel consumption of cars). In this abstract we review trust and privacy definitions and provide a first requirement analysis for them with regard to Cloud-based Market Places (CMPs). The comparison of definitions and requirements allows for the identification of the research gap that will be addressed by the main author’s PhD project. © IFIP International Federation for Information Processing 2017.

  • 273.
    Ahmed, Abdifatah
    et al.
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Lindhe, Magnus
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Efficient And Maintainable Test Automation2002Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    More and more companies experience problems with maintainability and the time-consuming development of automated testing tools. The MPC department at Ericsson Software Technology AB uses methods and tools often developed under time pressure, which results in time-consuming testing and requires more effort and resources than planned. The tools are also of such a nature that they are hard to expand and maintain, and in some cases they have been thrown out between releases. For this reason, we could identify two major objectives that MPC wants to achieve: efficient and maintainable test automation. Efficient test automation relates mainly to how to perform tests with less effort, or in a shorter time. Maintainable test automation aims to keep tests up to date with the software. In order to decide how to achieve these objectives, we decided to investigate which tests to automate, what should be improved in the testing process, what techniques to use, and finally whether or not the use of automated testing can reduce the cost of testing. These issues are discussed in this paper.

    Download full text (pdf)
    FULLTEXT01
  • 274.
    Ahmed, Adnan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Hussain, Syed Shahram
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Meta-Model of Resilient information System2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The role of information systems has become very important in today’s world. It is not only business organizations that use information systems; governments also possess very critical information systems. The need is to make information systems available at all times under any situation. Information systems must have the capability to resist dangers to their services, performance, and existence, and to recover to their normal working state with the available resources in catastrophic situations. Information systems with such a capability can be called resilient information systems. This thesis is written to define resilient information systems, suggest a meta-model for them, and explain how existing technologies can be utilized for the development of resilient information systems.

    Download full text (pdf)
    FULLTEXT01
  • 275.
    ahmed, amar
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Performance and Modeling of SIP Session Setup2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    During recent years, the transport of multimedia sessions, such as audio streams and video conferences, over IP has received a lot of attention, since most communication technologies are migrating to work over IP. However, sending media streams over IP networks has encountered some problems related to signaling issues. Ongoing research in this area has produced some solutions to this subject. The Internet Engineering Task Force (IETF) has introduced the Session Initiation Protocol (SIP), which has proved to be an efficient protocol for controlling sessions over IP. While a great deal of research has been performed on evaluating the performance of SIP and comparing it with competing protocols such as H.323, studying the delay caused by initiating the session has received less attention. In this document, we have addressed the SIP session setup delay problem. In the lab, we have built a test bed for running several SIP session scenarios. Using different models for those scenarios, we have measured session setup delays for all of the models used. The analysis performed for each model showed that we could propose some models to be applied to SIP session setup delay components.

    Download full text (pdf)
    FULLTEXT01
  • 276.
    Ahmed, Ashraf AwadElkarim Widaa
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Makki, Ahmed Hamza Ibrahim
    Blekinge Institute of Technology, School of Engineering.
    Performance Evaluation of Uplink Multiple Access Techniques in LTE Mobile Communication System2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    User Equipments (UEs) nowadays are able to provide various internet applications and services that raise the demand for high-speed data transfer and Quality of Service (QoS). Accordingly, next-generation mobile communication systems driven by these demands are expected to provide higher data rates and better link quality compared to existing systems. Orthogonal Frequency Division Multiple Access (OFDMA) and Single Carrier Frequency Division Multiple Access (SC-FDMA) are strong multiple access candidates for the uplink of International Mobile Telecommunications-Advanced (IMT-Advanced). These multiple access techniques, in combination with other promising technologies such as multi-hop transmission and Multiple-Input-Multiple-Output (MIMO), will be utilized to reach the targeted IMT-Advanced system performance. In this thesis, OFDMA and SC-FDMA are adopted and studied in the uplink of Long Term Evolution (LTE). Two transmission scenarios are considered, namely single-hop transmission and relay-assisted transmission (two hops). In addition, a hybrid multiple access technique that combines the advantages of OFDMA and SC-FDMA in terms of low Peak-to-Average Power Ratio (PAPR) and better link performance (in terms of Symbol Error Rate (SER)) is proposed in the relay-assisted transmission scenario. Simulation results show that the proposed hybrid technique achieves better end-to-end link performance than the pure SC-FDMA technique and maintains the same PAPR value in the access link. In addition, a lower PAPR is achieved compared to the OFDMA case, which is an important merit in uplink transmission due to the UE’s power resource constraints (limited battery power).

    Download full text (pdf)
    FULLTEXT01
  • 277.
    Ahmed, Ishtiaque
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Study of the Local Backprojection Algorithm for Image Formation in Ultra Wideband Synthetic Aperture Radar2008Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The purpose of this thesis project is to study and evaluate a UWB Synthetic Aperture Radar (SAR) data image formation algorithm that was previously less familiar and that has recently received much attention in this field. Certain properties have given it a prominent status in the radar signal processing branch. It is a fast time-domain algorithm named Local Backprojection (LBP). The LBP algorithm has been implemented for SAR image formation and simulated in MATLAB using standard values of the pertinent parameters. Later, an evaluation of the LBP algorithm was performed, and all comments, estimations and judgments were made on the basis of the resulting images. LBP has also been compared with the basic time-domain algorithm, Global Backprojection (GBP), with respect to the SAR images. The specialty of the LBP algorithm is its reduced computational load compared with GBP. LBP is a two-stage algorithm: it first forms the beam for a particular subimage and, in a later stage, forms the image of that subimage area. The signal data collected from the target are processed and backprojected locally for every subimage individually; this is the reason for naming it Local Backprojection. After the formation of all subimages, these are arranged and combined coherently to form the full SAR image.

    Download full text (pdf)
    FULLTEXT01
  • 278.
    Ahmed, Israr
    et al.
    Blekinge Institute of Technology, School of Computing.
    Nadeem, Shahid
    Blekinge Institute of Technology, School of Computing.
    Minimizing Defects Originating from Elicitation, Analysis and Negotiation (E and A&N) Phase in Bespoke Requirements Engineering2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Defect prevention (DP) in the early stages of the software development life cycle (SDLC) is much more cost-effective than in later stages. The requirements elicitation and analysis & negotiation (E and A&N) phases in the requirements engineering (RE) process are very critical and are a major source of requirements defects. A poor E and A&N process may lead to a software requirements specification (SRS) full of defects such as missing, ambiguous, inconsistent, misunderstood, and incomplete requirements. If these defects are identified and fixed only in later stages of the SDLC, they can cause major rework, incurring extra cost and effort. Organizations spend about half of their total project budget on avoidable rework, and the majority of defects originate from RE activities. This study is an attempt to prevent requirements-level defects from penetrating into later stages of the SDLC. For this purpose, empirical and literature studies are presented in this thesis. The empirical study was carried out with the help of six companies from Pakistan and Sweden by conducting interviews, and the literature study was done by means of literature reviews. This study explores the most common requirements defect types, their causes, the severity level of defects (i.e. major or minor), DP techniques (DPTs) & methods, defect identification techniques that have been used in the software development industry, and problems with these DPTs. This study also describes possible major differences between Swedish and Pakistani software companies in terms of defect types and the rate of defects originating from the E and A&N phases. On the basis of the study results, some solutions have been proposed to prevent requirements defects during the RE process. In this way we can minimize defects originating from the E and A&N phases of RE in bespoke requirements engineering (BESRE).

    Download full text (pdf)
    FULLTEXT01
  • 279.
    Ahmed, Juber
    Blekinge Institute of Technology, School of Management.
    Client Information Needs of MFIs: A Case Study of ASA Bangladesh2010Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Abstract Title: Client Information Needs of MFIs: A Case Study of ASA Bangladesh Author: Juber Ahmed Academic Advisor: Dr. Klaus Solberg Søilen Department: School of Management, Blekinge Institute of Technology Course: Master Thesis in Business Administration Purpose: To enrich the knowledge base of clients’ needs for financial services, assessing the tools MFIs use to collect client information and how they utilize that information to develop new products and services, or to modify existing products and services or their terms and conditions, to meet their clientele’s needs for financial services; also how MFIs organize and manage the information and how they categorize their clients using that information. Method: The investigation was conducted from both a theoretical and an empirical point of view. The deductive approach was used for the study and the case study method was deployed. I studied ASA, an MFI renowned in Bangladesh and beyond. At first, I conducted secondary research, collecting a number of successful methods and standard types of information used by successful MFIs from the existing literature. In the primary research, I interviewed 10 managers (Assistant Directors) of ASA to determine which of the methods found in the literature were more effective for collecting client information for them, and also asked them to add their own ideas to the list. Finally, I asked the interviewees to rate the methods, and the results are presented in this paper. Theory: This study was an exploratory one in which I discussed the aspects related to the study: microfinance, client assessment, clients of microfinance, information needs, and management information systems. Findings: The study showed that ASA utilized client information to develop its credit products and services and, based on the number of loans taken by the clients, categorized its clients and modified or developed new products and services for each category of clients.
Although ASA employed several tools for collecting client information, the managers think that their staff's collection of information from regular meetings with clients was more effective than other methods for modifying products' terms and conditions and for modifying or developing new products and services for their women and small-enterprise clients. The study also revealed that in ASA an impact study was necessary to know clients' overall level of satisfaction, but management needed specific information on what aspects of ASA and its credit products and services clients preferred and did not prefer, and the reasons for those preferences. They also needed an action plan to address clients' specific concerns, so they needed the information on a continual basis, and they were successful in achieving this continuous flow of information. For ASA, the best way to get this type of information would be through client-satisfaction Focus Group Discussions (FGDs), although they utilized several tools, but not often, as discussed in part 3 of chapter 5. ASA owned an MIS (AMMS) for monitoring and managing client information, and they utilized this to categorize their clients based on the collected information about their number of loans. Conclusion: This study revealed that ASA served only women and small-enterprise clientele, which included the vulnerable non-poor and could contribute to the profitability of ASA. There was no attempt to diversify the products to include all poor, which should be the goal of microfinance in order to alleviate poverty. Moreover, clients were treated as individual clients, but the loans were used to fulfill the clients' household or family needs. There were tools for collecting household information about the impact of credit-program participation, but they seldom made an effort to collect information on household money management, or in other words how clients utilized the loans for a variety of household needs.
There is a lack of access to a variety of financial services for poor clients, as MFIs mostly serve the vulnerable non-poor instead of taking all categories of poor into consideration. The study revealed that MFIs could gain long-term success by serving a specific market segment, but this should not be their only focus; their initiative should be to include all poor in their client profile, with priority given to a specific market segment. This could help them become sustainable and minimize risk by spreading it across different market segments. The study found that ASA considered FGDs an effective tool for collecting client information, as their staff and managers were familiar with this tool; moreover, it was cost-effective for them. It was observed that they seldom followed a tool selection process and that it was top management that decided on the tools, a decision that might be influenced by internal and external interest groups and by the competition. MFIs should organize client information in a way that enables them to use specific client information to serve clients better and to make effective decisions, although it is imperative to note that they may prefer to serve wealthier clients. This research paper also presents some important findings from the existing microfinance literature and a number of recommendations, based on the study experience and scholars' opinions from existing microfinance studies, that may help MFIs prepare to adopt a client-oriented approach by utilizing client assessment tools to fulfill their clients' needs for financial services, hopefully including all poor irrespective of their categories.

    Download full text (pdf)
    FULLTEXT01
  • 280.
    Ahmed, Kwaku
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Hatira, Lamia
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Valva, Paul
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    How can the construction industry in Ghana become sustainable?2014Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The Sub-Saharan African country of Ghana is growing at a rapid pace. The construction industry is striving to keep up with the increasing demand for housing and commercial and industrial space while simultaneously protecting the physical environment and social well-being of the country – a challenge becoming known in the industry as ‘sustainable construction.’ This paper proposes a strategic approach to managing these twin challenges, consisting of two parts: a building rating system and a participatory method called multi-stakeholder dialogue (MSD). The combined rating system and MSD process was presented to the industry to determine its potential effectiveness in assisting the industry to move towards sustainability. The industry’s response indicates that the proposal could be of value to the industry, with certain noted limitations. This paper describes the rating system-MSD proposal, the industry’s response, and the implications for the construction industry in Ghana moving forward.

    Download full text (pdf)
    FULLTEXT01
  • 281.
    Ahmed, Mamun
    Blekinge Institute of Technology, School of Engineering.
    Adaptive Sub band GSC Beam forming using Linear Microphone-Array for Noise Reduction/Speech Enhancement.2012Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    This project presents the description, design and implementation of a 4-channel microphone array with an adaptive sub-band generalized sidelobe canceller (GSC) beamformer, used for video conferencing, hands-free telephony, etc., in a noisy environment, for speech enhancement as well as noise suppression. The sidelobe canceller is evaluated with both Least Mean Square (LMS) and Normalized Least Mean Square (NLMS) adaptation. A testing structure is presented, which involves a linear 4-microphone array connected to collect the data. Tests were done using one target signal source and one noise source. At each microphone, data were collected via fractional time-delay filtering, then divided into sub-bands, and the GSC was applied to each of the subsequent sub-bands. The overall Signal-to-Noise Ratio (SNR) improvement is determined from the main signal and noise input and output powers, with signal-only and noise-only as the input to the GSC. The NLMS algorithm significantly improves the speech quality, with noise suppression levels of up to 13 dB, while the LMS algorithm gives up to 10 dB. All of the processing for this thesis is implemented on a computer using MATLAB and validated by considering different SNR measures under various types of blocking matrix, different step sizes, different noise locations and variable SNR with noise.

    Download full text (pdf)
    FULLTEXT01
  • 282.
    Ahmed, Mohammad Abdur Razzak and Rajib
    Blekinge Institute of Technology, School of Computing.
    Knowledge Management in Distributed Agile Projects2013Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Knowledge management (KM) is essential for success in Global Software Development (GSD), Distributed Software Development (DSD) and Global Software Engineering (GSE). Software organizations are managing knowledge in innovative ways to increase productivity. One of the major objectives of KM is to improve productivity through effective knowledge sharing and transfer. Therefore, to maintain effective knowledge sharing in distributed agile projects, practitioners need to adopt different types of knowledge sharing techniques and strategies. Distributed projects introduce new challenges to KM, so practices that are used in agile teams become difficult to put into action in distributed development. Although informal communication is the key enabler for knowledge sharing, when an agile project is distributed, informal communication and knowledge sharing are challenged by the low communication bandwidth between distributed team members, as well as by social and cultural distance. In the work presented in this thesis, we have made an overview of empirical studies of knowledge management in distributed agile projects. Based on the main theme of this study, we have categorized and reported our findings on major concepts that need empirical investigation. We have classified the main research theme in this thesis within two sub-themes:
    • RT1: Knowledge sharing activities in distributed agile projects.
    • RT2: Spatial knowledge sharing in a distributed agile project.
    The main contributions are:
    • C1: Empirical observations regarding knowledge sharing activities in distributed agile projects.
    • C2: Empirical observations regarding spatial knowledge sharing in a distributed agile project.
    • C3: Process improvement scope and guidelines for the studied project.

    Download full text (pdf)
    FULLTEXT01
  • 283.
    Ahmed, Murtaza Sheraz
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Ahmed, Khawaja Waqar
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Role of ICT in Combating Corruption and Improving Public Service Delivery, A case study of Punjab Information Technology Board2014Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Governments in both developing and developed countries provide services to their citizens; however, some provide better services than others. Public services generally include health, education, electricity and water supply, social welfare, transportation, communication and other services. Countries face many challenges while providing these services, and many developing countries have failed to deliver public services to their citizens efficiently. One of the underlying issues is corruption. Developing countries need to adopt e-governance and Information and Communication Technology (ICT) to overcome these issues. E-governance is a process of improving, through Information Technology (IT), the way government works, shares information, interacts with clients (citizens, business, government) and provides services to them. Our thesis investigates whether ICT can combat corruption and improve Public Service Delivery (PSD) in developing countries. The findings of our study can help developing countries in combating corruption and improving public services. We have reviewed the literature related to public service delivery, corruption and its adverse effects, and the use of ICT and e-governance to combat corruption. We have adopted a case study approach to see what sorts of ICT initiatives are being taken by the government of Punjab to reduce corruption and improve public services. In this regard we have explored the e-governance projects of PITB (the Punjab Information Technology Board), including one of its recent projects, the Citizen Feedback Model (CFM), which has received international attention for combating corruption and improving public service delivery. We have performed quantitative analysis on a six-month data set of the CFM, comprising over one hundred and seventy thousand responses categorized in predefined categories of feedback. Through this quantitative analysis of the CFM data we have found that this particular e-governance project is reducing corruption among public officials and improving public service delivery. Further explanations are provided through information gathered via interviews. Finally, we have tied our results to findings in the literature and suggested implications.

    Download full text (pdf)
    FULLTEXT01
  • 284.
    Ahmed, Nisar
    et al.
    Blekinge Institute of Technology, School of Computing.
    Yousaf, Shahid
    Blekinge Institute of Technology, School of Computing.
    For Improved Energy Economy – How Can Extended Smart Metering Be Displayed?2011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Context: A District Heating System (DHS) uses a central heating plant to produce and distribute hot water in a community. Such a plant is connected to consumers’ premises to provide them with hot water and space heating. Variations in the consumption of heat energy depend upon different factors, like differences in energy prices, living standards, environmental effects and economic conditions. These factors can be managed intelligently by advanced Information and Communication Technology (ICT) tools such as smart metering. Smart metering is a new and emerging technology, normally used for metering of District Heating (DH), district cooling, electricity and gas. Traditional meters measure overall consumption of energy; in contrast, smart meters have the ability to frequently record and transmit energy consumption statistics to both energy providers and consumers using their communication networks and network management systems.

    Objectives: The first objective of the conducted study was to provide energy consumption/saving suggestions on the smart metering display for the accepted consumer behavior proposed by the energy providers. Our second objective was an analysis of the financial benefits for the energy providers that could be expected through better consumer behavior. The third objective was an analysis of the energy consumption behavior of residential consumers and how we can support it. Moreover, the fourth objective of the study was to use the extracted suggestions on consumer behavior to propose an Extended Smart Metering Display for improving energy economy.

    Methods: In this study a background study was conducted to develop a basic understanding of District Heat Energy (DHE), smart meters and their existing displays, consumer behavior and its effects on energy consumption. Moreover, interviews were conducted with representatives of a smart heat meter manufacturer, energy providers and residential consumers. The interview findings enabled us to propose an Extended Smart Metering Display that satisfies the recommendations received from all the interviewees and the background study. Further in this study, a workshop was conducted for the evaluation of the proposed Extended Smart Metering Display, which involved representatives of the smart heat meter manufacturer and residential energy consumers. DHE providers also contributed to this workshop through their comments in online conversation, for which an evaluation request was sent to member companies of the Swedish District Heating Association.

    Results: The informants in this research have different levels of experience. Through a systematic procedure we obtained and analyzed findings from all the informants. To fulfill the energy demands during peak hours, the informants emphasized displaying efficient energy consumption behavior on smart heat meters. According to the informants, efficient energy consumption behavior can be presented through energy consumption/saving suggestions on the display of smart meters. These suggestions are related to daily life activities like taking baths and showers, cleaning, washing and heating usage. We found that the efficient energy consumption behavior recommended by the energy providers can provide financial improvements both for the energy providers and for the residential consumers. On the basis of these findings, we proposed the Extended Smart Metering Display to present information in a simple and interactive way. Furthermore, the proposed Extended Smart Metering Display can also be helpful in measuring consumers’ energy consumption behavior effectively.

    Conclusions: After obtaining answers to the research questions, we concluded that extending the existing smart heat meter display can effectively help the energy providers and the residential consumers to utilize resources efficiently. That is, it will not only reduce energy bills for the residential consumers, but will also help the energy providers to save scarce energy and enable them to serve the consumers better in peak hours. After deployment of the proposed Extended Smart Metering Display, the energy providers will be able to support the consumers’ behavior in a reliable way and the consumers will find and follow the energy consumption/saving guidelines easily.

    Download full text (pdf)
    FULLTEXT01
  • 285.
    Ahmed, Qutub Uddin
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Mujib, Saifullah Bin
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Context Aware Reminder System: Activity Recognition Using Smartphone Accelerometer and Gyroscope Sensors Supporting Context-Based Reminder Systems2014Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Context. A reminder system offers flexibility in daily life activities and helps people stay independent. A reminder system not only helps in reminding about daily life activities, but also serves to a great extent people who deal with health care issues, for example a health supervisor who monitors people with different health-related problems, like people with disabilities or mild dementia. Traditional reminders, which are based on a set of defined activities, are not enough to address the necessity in a wider context. To make the reminder more flexible, the user’s current activities or contexts need to be considered. To recognize a user’s current activity, different types of sensors can be used. These sensors are available in Smartphones, which can assist in building a more contextual reminder system.

    Objectives. To make a reminder context based, it is important to identify the context, and the user’s activities need to be recognized at a particular moment. Keeping this notion in mind, this research aims to understand the relevant contexts and activities, identify an effective way to recognize a user’s three different activities (drinking, walking and jogging) using Smartphone sensors (accelerometer and gyroscope), and propose a model that uses the properties of the activity recognition.

    Methods. This research combined a survey and interviews with an exploratory Smartphone sensor experiment to recognize the user’s activity. An online survey was conducted with 29 participants and interviews were held in cooperation with the Karlskrona Municipality. Four elderly people participated in the interviews. For the experiment, data for three different user activities were collected using Smartphone sensors and analyzed to identify the patterns of the different activities. Moreover, a model is proposed to exploit the properties of the activity patterns. The performance of the proposed model was evaluated using the machine learning tool WEKA.

    Results. The survey and interviews helped us understand the important activities of daily living which can be considered when designing the reminder system, and how and when it should be used. For instance, most of the participants in the survey are used to using some sort of reminder system, most of them use a Smartphone, and one of the most important tasks they forget is to take their medicine. These findings informed the experiment. From the experiment, different patterns have been observed for the three different activities. For walking and jogging, the pattern is discrete. On the other hand, for the drinking activity, the pattern is complex and can sometimes overlap with other activities or get noisy.

    Conclusions. The survey, interviews and the background study provided a set of evidences that a reminder system based on users’ activities is essential in daily life. The large number of Smartphone users motivated this research to select Smartphone sensors to identify users’ activities, with the aim of developing an activity-based reminder system. The study identified the data patterns by applying some simple mathematical calculations to the recorded Smartphone sensor (accelerometer and gyroscope) data. The approach was evaluated with 99% accuracy on the experimental data. The study concluded by proposing a model that uses the properties of the identified activities and by developing a prototype of a reminder system. This study performed preliminary tests on the model, but there is a need for further empirical validation and verification of the model.
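    As a rough illustration of how windowed accelerometer data can separate such activities, the sketch below computes the per-window mean and standard deviation of the acceleration magnitude and applies a simple variance rule. The threshold and the synthetic signals are hypothetical, not the features or data from the thesis.

```python
import math
import random

def window_features(ax, ay, az, win=50):
    """Split tri-axial accelerometer samples into fixed windows and compute
    the mean and standard deviation of the acceleration magnitude per window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    feats = []
    for i in range(0, len(mags) - win + 1, win):
        w = mags[i:i + win]
        mean = sum(w) / win
        std = math.sqrt(sum((v - mean) ** 2 for v in w) / win)
        feats.append((mean, std))
    return feats

def classify(feats, thresh=1.0):
    # hypothetical rule: jogging shows larger magnitude variance than walking
    return ["jogging" if std > thresh else "walking" for _, std in feats]

# synthetic signals: walking = small oscillation, jogging = large oscillation
random.seed(1)
walk = [9.8 + 0.5 * math.sin(0.3 * n) + random.gauss(0, 0.1) for n in range(500)]
jog = [9.8 + 3.0 * math.sin(0.8 * n) + random.gauss(0, 0.3) for n in range(500)]
zeros = [0.0] * 500
labels_w = classify(window_features(walk, zeros, zeros))
labels_j = classify(window_features(jog, zeros, zeros))
print(set(labels_w), set(labels_j))  # {'walking'} {'jogging'}
```

    Real classifiers (as trained in WEKA) would learn such decision boundaries from labeled windows instead of using a hand-picked threshold.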

    Download full text (pdf)
    FULLTEXT01
  • 286.
    Ahmed, Sabbir
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Performance of Multi-Channel Medium Access Control Protocol incorporating Opportunistic Cooperative Diversity over Rayleigh Fading Channel2006Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This thesis proposes a Medium Access Control (MAC) protocol for wireless networks, termed CD-MMAC, that utilizes multiple channels and incorporates opportunistic cooperative diversity dynamically to improve its performance. The IEEE 802.11b standard allows the use of the multiple channels available at the physical layer, but its MAC protocol is designed only for a single channel. The proposed protocol utilizes multiple channels using a single interface and incorporates opportunistic cooperative diversity through a cross-layer MAC. The new protocol leverages the multi-rate capability of IEEE 802.11b and allows wireless nodes far away from the destination node to transmit at a higher rate by using intermediate nodes as relays. The protocol improves network throughput and packet delivery ratio significantly and reduces packet delay. The performance improvement is further evaluated by simulation and analysis.
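    The relaying idea can be illustrated with a simplified rate model, an assumption of this sketch rather than the paper's analysis: ignoring MAC overhead, a two-hop path that serializes two transmissions has an effective rate of 1/(1/r1 + 1/r2), and relaying is worthwhile when this exceeds the direct rate.

```python
def effective_rate(direct_rate, relay_paths=None):
    """Pick between direct transmission and two-hop relaying.
    A relay path (r1, r2) serializes two transmissions, so its effective
    rate is the harmonic combination 1 / (1/r1 + 1/r2)."""
    best = ("direct", direct_rate)
    for r1, r2 in (relay_paths or []):
        rate = 1.0 / (1.0 / r1 + 1.0 / r2)
        if rate > best[1]:
            best = ("relay", rate)
    return best

# IEEE 802.11b rates: a distant node reaching the destination at only
# 1 Mbit/s may do better via a relay it reaches at 11 Mbit/s, which in
# turn reaches the destination at 11 Mbit/s (effective 5.5 Mbit/s)
print(effective_rate(1.0, [(11.0, 11.0)]))
```

    Real protocol gains also depend on contention and relay availability, which this toy comparison ignores.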

    Download full text (pdf)
    FULLTEXT01
  • 287. Ahmed, Sabbir
    et al.
    Casas, Christian Ibar
    Coso, Aitor del
    Mohammed, Abbas
    Performance of Multi-Channel MAC incorporating Opportunistic Cooperative Diversity2007Conference paper (Refereed)
  • 288.
    Ahmed, Shehzad
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Mechanical Engineering.
    Conradt, Marcos H. K.
    Blekinge Institute of Technology, School of Engineering, Department of Mechanical Engineering.
    Pereira, Valeria De Fusco
    Blekinge Institute of Technology, School of Engineering, Department of Mechanical Engineering.
    Alternative Fuels for Transportation: A Sustainability Assessment of Technologies within an International Energy Agency Scenario2009Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The transport sector is an essential driver of economic development and growth and, at the same time, one of the biggest contributors to climate change, responsible for almost a quarter of global carbon dioxide emissions. The sector is 95 percent dependent on fossil fuels. International Energy Agency (IEA) scenarios present different mixes of fuels to decrease both the dependence on fossil fuels and emissions, leading to a more sustainable future. The main alternative fuels proposed in the Blue Map scenario, presented in Energy Technology Perspectives 2008, were hydrogen and second-generation ethanol. An assessment of these fuels was made using the tools SLCA (Sustainability Life Cycle Assessment) and SWOT analysis. The Framework for Strategic Sustainable Development (FSSD) is used as the background to guide the assessment and to help structure the results and conclusions. The results aim to alert transport sector stakeholders to the sustainability gaps of the scenario, so that decisions can be made to lead society towards a sustainable future.

    Download full text (pdf)
    FULLTEXT01
  • 289.
    Ahmed Sheik, Kareem
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    A Comparative Study on Optimization Algorithms and its efficiency2022Independent thesis Advanced level (degree of Master (Two Years)), 20 HE creditsStudent thesis
    Abstract [en]

    Background: In computer science, optimization can be defined as finding the most cost-effective or highest achievable performance under given circumstances, maximizing desired factors and minimizing undesirable ones. Many problems in the real world are continuous, and it is not easy to find global solutions. However, technological development keeps increasing the speed of computations [1]. The optimization method, an efficient numerical simulator, and a realistic depiction of the physical process that we intend to describe and optimize are all interconnected components of the optimization process for any optimization problem [2].

    Objectives: A literature review on existing optimization algorithms is performed. Ten different benchmark functions are considered and implemented with the chosen existing algorithms, such as GA (Genetic Algorithm), ACO (Ant Colony Optimization) and the Plant Intelligence Behaviour Optimization (PIBO) algorithm, to measure the efficiency of these approaches based on metrics like CPU Time, Optimality, Accuracy, and Mean Best Standard Deviation.

    Methods: In this research work, a mixed-method approach is used. A literature review is performed on the existing optimization algorithms. In addition, an experiment is conducted using ten different benchmark functions with the existing optimization algorithms (the PSO algorithm, the ACO algorithm, GA and PIBO) to measure their efficiency based on four different factors: CPU Time, Optimality, Accuracy and Mean Best Standard Deviation. This tells us which optimization algorithms perform better.

    Results: The experiment findings are presented in this section. Using the benchmark functions with the suggested method and the other methods, the various metrics (CPU Time, Optimality, Accuracy and Mean Best Standard Deviation) are computed and the results are tabulated. Graphs are made using the data obtained.

    Analysis and Discussion: The research questions are addressed based on the experiment's results that have been conducted.

    Conclusion: We finally conclude the research by analyzing the existing optimization methods and the algorithms' performance. PIBO performs much better, as can be seen from the results for the optimality metrics, best mean, standard deviation and accuracy, but has a significant drawback in CPU Time: it takes much longer than the PSO algorithm, is close to GA, and performs much better than the ACO algorithm.
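    To illustrate the kind of experiment described, the sketch below runs a minimal real-coded genetic algorithm on the sphere function, one of the standard benchmark functions for this kind of comparison. The population size, generation count and mutation scale are illustrative choices, not the thesis's settings.

```python
import random

def sphere(x):
    """Classic benchmark: f(x) = sum of squares, global minimum 0 at the origin."""
    return sum(v * v for v in x)

def genetic_algorithm(f, dim=5, pop_size=40, gens=200, bound=5.0, seed=0):
    """Minimal real-coded GA: truncation selection, averaging crossover,
    Gaussian mutation, with the two best individuals carried over unchanged."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=f)
        children = list(scored[:2])           # elitism: keep the two best
        while len(children) < pop_size:
            a, b = rng.sample(scored[:10], 2)  # parents from the ten best
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # crossover
            child = [v + rng.gauss(0, 0.1) for v in child]    # mutation
            children.append(child)
        pop = children
    best = min(pop, key=f)
    return best, f(best)

best, fit = genetic_algorithm(sphere)
print("best fitness:", round(fit, 4))
```

    A benchmark study like the thesis's would wrap such a run in timing code and repeat it over many seeds to obtain the CPU Time and Mean Best Standard Deviation metrics.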

    Download full text (pdf)
    A Comparative Study on Optimization Algorithms and its efficiency
  • 290.
    Ahmed, Soban
    et al.
    Natl Univ Comp & Emerging Sci, PAK.
    Bhatti, Muhammad Tahir
    Natl Univ Comp & Emerging Sci, PAK.
    Khan, Muhammad Gufran
    Natl Univ Comp & Emerging Sci, PAK.
    Lövström, Benny
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mathematics and Natural Sciences.
    Shahid, Muhammad
    Natl Univ Comp & Emerging Sci, PAK.
    Development and Optimization of Deep Learning Models for Weapon Detection in Surveillance Videos2022In: Applied Sciences, E-ISSN 2076-3417, Vol. 12, no 12, article id 5772Article in journal (Refereed)
    Abstract [en]

    Featured Application: This work has applied computer vision and deep learning technology to develop a real-time weapon detection system and tested it on different computing devices for large-scale deployment. Weapon detection in CCTV camera surveillance videos is a challenging task, and its importance is increasing because of the availability and easy access of weapons in the market. This becomes a big problem when weapons go into the wrong hands and are misused. Advances in computer vision and object detection are enabling us to detect weapons in live videos without human intervention and, in turn, intelligent decisions can be made to protect people from dangerous situations. In this article, we have developed and presented an improved real-time weapon detection system that shows a higher mean average precision (mAP) score and better inference-time performance compared to the previously proposed approaches in the literature. Using a custom weapons dataset, we implemented a state-of-the-art Scaled-YOLOv4 model that resulted in a 92.1 mAP score and 85.7 frames per second (FPS) on a high-performance GPU (RTX 2080 Ti). Furthermore, to achieve the benefits of lower latency, higher throughput, and improved privacy, we optimized our model for implementation on a popular edge-computing device (Jetson Nano GPU) with the TensorRT network optimizer. We have also performed a comparative analysis of the previous weapon detector and our presented model using different CPU and GPU machines, which fulfills the purpose of this work by making the selection of model and computing device easier for users deploying in a real-time scenario. The analysis shows that our presented models result in improved mAP scores on high-performance GPUs (such as the RTX 2080 Ti) as well as on low-cost edge-computing GPUs (such as the Jetson Nano) for weapon detection in live CCTV camera surveillance videos.
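    The mAP metric cited above is built on intersection-over-union (IoU) matching between predicted and ground-truth boxes; a detection typically counts as correct when IoU meets a threshold such as 0.5. The sketch below computes IoU for two hypothetical boxes, purely as an illustration of that building block.

```python
def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)   # overlap area (0 if disjoint)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred = (10, 10, 50, 50)   # hypothetical predicted weapon box
gt = (15, 12, 55, 52)     # hypothetical ground-truth box
print(round(iou(pred, gt), 3))  # 0.711
```

    Computing full mAP additionally requires ranking detections by confidence and averaging precision over recall levels and classes, which detection toolkits handle for benchmarks like the one reported here.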

    Download full text (pdf)
    fulltext
  • 291.
    Ahmed, Syed Rizwan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Secure Software Development: Identification of Security Activities and Their Integration in Software Development Lifecycle2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Today’s software is more vulnerable to attacks due to increases in complexity, connectivity and extensibility. Securing software is usually considered a post-development activity, and not much importance is given to it during the development of the software. However, the losses that organizations have incurred over the years due to security flaws in software have led researchers to find better ways of securing software. In the light of research done by many researchers, this thesis presents how software can be secured by considering security in the different phases of the software development life cycle. A number of security activities have been identified that are needed to build secure software, and it is shown how these security activities are related to the development activities of the software development lifecycle.

    Download full text (pdf)
    FULLTEXT01
  • 292.
    Ahmed, Syed Saif
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Arepalli, Harshini Devi
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Auto-scaling Prediction using Machine Learning Algorithms: Analysing Performance and Feature Correlation2023Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Despite Covid-19’s drawbacks, it has recently contributed to highlighting the significance of cloud computing. The great majority of enterprises and organisations have shifted to a hybrid mode that enables users or workers to access their work environment from any location. This made it possible for businesses to save on-premises costs by moving their operations to the cloud. It has become essential to allocate resources effectively, especially through predictive auto-scaling. Although many algorithms have been studied regarding predictive auto-scaling, further analysis and validation need to be done. The objectives of this thesis are to implement machine-learning algorithms for predicting auto-scaling and to compare their performance on common grounds. The secondary objective is to find data connections amongst features within the dataset and evaluate their correlation coefficients. The methodology adopted for this thesis is experimentation. Experimentation was selected so that the auto-scaling algorithms can be tested in practical situations and the results compared to identify the best algorithm using the selected metrics. This experiment can assist in determining whether the algorithms operate as predicted. Metrics such as Accuracy, F1-Score, Precision, Recall, Training Time and Root Mean Square Error (RMSE) are calculated for the chosen algorithms Random Forest (RF), Logistic Regression, Support Vector Machine and Naive Bayes Classifier. The correlation coefficients of the features in the data are also measured, which helped in increasing the accuracy of the machine learning model. In conclusion, the features related to our target variable (CPU usage, p95_scaling) often had high correlation coefficients compared to other features. The relationships between these variables could potentially be influenced by other variables that are unrelated to the target variable. Also, from the experimentation, it can be seen that the optimal algorithm for determining how cloud resources should be scaled is the Random Forest Classifier.
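    Two of the statistics named above can be computed directly. The sketch below shows a Pearson correlation coefficient and RMSE on a hypothetical request-rate versus CPU-usage series; the numbers are illustrative, not taken from the thesis dataset.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rmse(pred, actual):
    """Root Mean Square Error between predictions and observations."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred))

# hypothetical feature series: request rate vs. CPU usage of a service
requests = [100, 150, 200, 260, 300, 380]
cpu = [20, 31, 39, 52, 61, 75]
print(round(pearson(requests, cpu), 3))   # strong positive correlation

predicted = [22, 30, 40, 50, 62, 74]      # a model's hypothetical CPU predictions
print(round(rmse(predicted, cpu), 3))
```

    In a feature-selection step like the thesis describes, features with high correlation to the target (here, CPU usage) would be retained for training the classifiers.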

    Download full text (pdf)
    Auto-scaling Prediction using Machine Learning Algorithms: Analysing Performance and Feature Correlation
  • 293.
    Ahmed, Tanveer
    et al.
    Blekinge Institute of Technology, School of Computing.
    Raju, Madhu Sudhana
    Blekinge Institute of Technology, School of Computing.
    Integrating Exploratory Testing In Software Testing Life Cycle, A Controlled Experiment2012Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Context. Software testing is one of the crucial phases in the software development life cycle (SDLC). Among the different manual testing methods in software testing, Exploratory Testing (ET) uses no predefined test cases to detect defects. Objectives. The main objective of this study is to test the effectiveness of ET in detecting defects at different software test levels. The objective is achieved by formulating hypotheses, which are later tested for acceptance or rejection. Methods. The methods used in this thesis are a literature review and an experiment. The literature review was conducted to get in-depth knowledge on the topic of ET and to collect data relevant to ET. The experiment was performed to test hypotheses specific to the three different testing levels: unit, integration and system. Results. The experimental results showed that using ET did not find all the seeded defects at the three levels of unit, integration and system testing. The results were analyzed using statistical tests and interpreted with the help of bar graphs. Conclusions. We conclude that more research is required to generalize the benefits of ET at different test levels. In particular, a qualitative study to highlight the factors responsible for the success and failure of ET is desirable. We also encourage a replication of this experiment with subjects having sound technical and domain knowledge.

    Download full text (pdf)
    FULLTEXT01
  • 294.
    Ahmed, Usman
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Cybercrime: A case study of the Menace and Consequences Internet Manipulators2020Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
  • 295.
    Ahmed, Uzair
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Saqib, Muhammad
    Blekinge Institute of Technology, School of Engineering.
    Optimal Solutions Of Fuzzy Relation Equations2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Fuzzy relation equations are becoming extremely important in order to investigate the optimal solution of the inverse problem, even though there is a restrictive condition for the availability of a solution of such inverse problems. We discuss methods for finding the optimal (maximum and minimum) solutions of the inverse problem for fuzzy relation equations of the form $R \circ Q = T$, where R and Q are in turn treated as the unknown, using different operators (e.g. alpha, sigma, etc.). The aim of this study is to make an in-depth determination of the best project among a host of projects, depending upon different factors (e.g. capital cost, risk management, etc.) in the field of civil engineering. To accomplish this aim, two linguistic variables are introduced to deal with the uncertainty factor which appears in civil engineering problems. Alpha-composition is used to compute the solution of the fuzzy relation equation. The evaluation of the projects is then orchestrated by defuzzifying the obtained results. The value of adhering to such a synopsis in the field of civil engineering is demonstrated by an example.
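    The max-min composition and the alpha operator mentioned above can be sketched directly. The example below forms $T = R \circ Q$ and then recovers the greatest solution Q from R and T via alpha-composition; the matrices are illustrative, not taken from the thesis.

```python
def alpha(a, b):
    """Goedel implication used in alpha-composition: a alpha b = 1 if a <= b, else b."""
    return 1.0 if a <= b else b

def maxmin(R, Q):
    """Max-min composition: (R o Q)[i][j] = max_k min(R[i][k], Q[k][j])."""
    rows, inner, cols = len(R), len(Q), len(Q[0])
    return [[max(min(R[i][k], Q[k][j]) for k in range(inner))
             for j in range(cols)] for i in range(rows)]

def greatest_solution(R, T):
    """Greatest Q with R o Q = T (when solvable): Q[k][j] = min_i (R[i][k] alpha T[i][j])."""
    inner, cols = len(R[0]), len(T[0])
    return [[min(alpha(R[i][k], T[i][j]) for i in range(len(R)))
             for j in range(cols)] for k in range(inner)]

R = [[0.9, 0.4],
     [0.2, 0.8]]
Q = [[0.7, 0.3],
     [0.6, 0.8]]
T = maxmin(R, Q)                  # [[0.7, 0.4], [0.6, 0.8]]
Qmax = greatest_solution(R, T)    # greatest Q reproducing T
print(maxmin(R, Qmax) == T)       # True
```

    Note that Qmax need not equal the original Q: here Qmax[1][1] is 1.0 rather than 0.8, reflecting that the inverse problem has a whole interval of solutions bounded above by Qmax.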

    Download full text (pdf)
    FULLTEXT01
  • 296.
    Ahmed, Zaheer
    et al.
    Blekinge Institute of Technology, School of Computing.
    Shahzad, Aamir
    Blekinge Institute of Technology, School of Computing.
    Mobile Robot Navigation using Gaze Contingent Dynamic Interface2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Using the eyes as an input modality for different control environments is a great area of interest for enhancing the bandwidth of human-machine interaction and providing interaction functions when the use of the hands is not possible. Interface design requirements in such implementations are quite different from conventional application areas. Both command-execution and feedback-observation tasks may be performed by the eyes simultaneously. In order to control the motion of a mobile robot by operator gaze interaction, gaze-contingent regions in the operator interface are used to execute robot movement commands, with different screen areas controlling specific directions. Dwell time is one of the most established techniques for performing an eye-click analogous to a mouse click. But repeated dwell time while switching between gaze-contingent regions and feedback regions decreases the performance of the application. We have developed a dynamic gaze-contingent interface in which we merge gaze-contingent regions with feedback regions dynamically. This technique has two advantages: firstly, it improves the overall performance of the system by eliminating repeated dwell time; secondly, it reduces operator fatigue by providing a bigger area to fixate in. The operator can monitor feedback with more ease while sending commands at the same time.
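    A dwell-time eye-click of the kind discussed can be sketched as a simple counter over gaze samples: a region fires once the gaze has stayed in it continuously for the dwell duration. The region names, sampling period and dwell threshold below are hypothetical.

```python
def dwell_clicks(samples, dwell_ms=500, period_ms=50):
    """Turn a stream of gaze-region samples into 'clicks': a region fires
    once the gaze has remained in it continuously for dwell_ms."""
    need = dwell_ms // period_ms   # consecutive samples required to fire
    clicks, current, count = [], None, 0
    for region in samples:
        if region == current:
            count += 1
        else:
            current, count = region, 1
        if count == need:          # fire exactly once per continuous dwell
            clicks.append(region)
    return clicks

# gaze hops briefly over 'left', then dwells long enough on 'forward'
# and 'right' to trigger movement commands
gaze = ["left"] * 4 + ["forward"] * 12 + ["none"] * 3 + ["right"] * 10
print(dwell_clicks(gaze))  # ['forward', 'right']
```

    The dynamic interface described above aims to avoid re-running this counter every time the gaze moves between a command region and a feedback region.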

    Download full text (pdf)
    FULLTEXT01
    Download full text (pdf)
    FULLTEXT02
  • 297.
    Ahmed, Zaki
    Blekinge Institute of Technology, School of Engineering.
    Modeling and Simulation of Urea Dosing System2013Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    To protect our health and environment from pollution, regulatory agencies in the European Union (EU) and legislation from the U.S. Environmental Protection Agency (EPA) require that pollutants produced by diesel engines - such as nitrogen oxides (NOx), hydrocarbons (HC) and particulate matter (PM) - be reduced. The key emission reduction and control technologies available for NOx control on diesel engines are a combination of Exhaust Gas Recirculation (EGR) and Selective Catalytic Reduction (SCR). SCR reduces emissions through the use of Diesel Exhaust Fluid (DEF), sold under the trade name AdBlue, a solution of 32.5% high-purity urea and 67.5% deionized water. In the hot exhaust gas, AdBlue decomposes into ammonia (NH3), which then reacts with NOx on the surface of the catalyst to produce harmless nitrogen (N2) and water (H2O). The highest NOx conversion ratios, while avoiding ammonia slip, are achieved by an efficient SCR and an accurate Urea Dosing System (UDS); it is therefore critical to model and simulate the UDS in order to analyze and gain a holistic understanding of its dynamic behavior. Modeling and simulating the UDS is the result of a compromise between two opposing trends: firstly, the mathematical models must correctly describe the fundamental principles of fluid dynamics - (1) mass is conserved, (2) Newton's second law, and (3) energy is conserved; secondly, the model needs to be as simple as possible, in order to express a simple and useful picture of the real system.
    A numerical model for the simulation of the Urea Dosing System is implemented in the GT-Suite® environment. It is a complete UDS model (hydraulic circuit and dosing unit) and stands out for its ease of use and simulation speed. The UDS model has been developed and validated using the Hilite airless dosing system at the ATC Lab as reference. The results provided by the model make it possible to analyze the operation of the UDS pump, as well as the complete system, showing the trend of some important parameters that are difficult to measure, such as viscosity, density and Reynolds number, and giving plenty of useful information to understand the influence of the main design parameters of the pump, such as volumetric efficiency, speed and flow relations.
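    One of the "difficult to measure" parameters the abstract mentions, the Reynolds number of the AdBlue flow in the dosing line, follows directly from density, viscosity, line diameter and flow rate. A minimal sketch follows; every numeric value (fluid properties, line diameter, dosing rate) is an illustrative assumption, not data from the thesis.

```python
# Sketch: Reynolds number of AdBlue flow in a dosing line.
# All numeric values below are illustrative assumptions.
import math

def reynolds_number(density, velocity, diameter, viscosity):
    """Re = rho * v * D / mu for pipe flow (dimensionless)."""
    return density * velocity * diameter / viscosity

# Rough room-temperature AdBlue properties (assumed values):
rho = 1090.0          # density, kg/m^3
mu = 1.4e-3           # dynamic viscosity, Pa*s
d = 2.0e-3            # line inner diameter, m (assumed)
q = 2.0e-3 / 3600.0   # dosing flow rate: assumed 2 L/h, in m^3/s

area = math.pi * (d / 2.0) ** 2   # line cross-section, m^2
v = q / area                      # mean flow velocity, m/s

re = reynolds_number(rho, v, d, mu)
# Re well below ~2300 indicates laminar flow in the dosing line.
```

    At these assumed values the flow comes out deep in the laminar regime, which is the kind of qualitative insight a lumped-parameter UDS model can provide without direct measurement.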

    Download full text (pdf)
    FULLTEXT01
  • 298.
    Ahmet, Zeynep
    Blekinge Institute of Technology, School of Computing.
    What Are You Doing And Feeling Right Now?2012Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Understanding and capturing the game play experiences of players has been of great interest for some time, both in academia and industry. Methods used for eliciting game play experiences have involved the use of observations, biometric data and post-game techniques such as surveys and interviews. This is true for games that are played in fixed settings, such as computer or video games. Pervasive games, however, provide a greater challenge for evaluation, as they are games that typically engage players in outdoor environments, which might mean constant movement and a great deal of the players' motor skills engaged for several hours or days. In this project I explored a new method for eliciting different aspects of the game play experience of pervasive game players, specifically focusing on emotional states and different qualities of immersion. I have centered this work on self-reporting as a means for reporting these aspects of the game play experience. However, this required an approach to self-reporting that is non-obtrusive, not taking too much of the players' attention from the game activities, as well as providing ease of use. To understand the challenges in introducing a new method into a gaming experience, I focused my research on understanding experience, which is a subjective concept. Even though there are methods aiming at capturing the physiological changes during game play, they don't capture players' interpretations of the gaming situation. By combining this with objective measurements, I was able to gain a comprehensive understanding of the context of use. The resulting designs were two tools, iteratively developed and pre-tested in a tabletop role-playing session before a test run in the pervasive game Interference. From my findings I was able to conclude that using self-reporting tools for players to use while playing was successful, especially as the data derived from the tools supported post-game interviews.
    There were, however, challenges regarding the design and functionality, in particular in outdoor environments, that suggest improvements, as well as considerations on the use of self-reporting as an additional method for data collection.

    Download full text (pdf)
    FULLTEXT01
  • 299.
    Ahnell, Fredrik
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Noring, Sebastan
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Inlärningsverktyg för JavaScript - En jämförelse avseende inlärning av grundläggande kunskaper på egen hand2021Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
  • 300.
    Ahnstedt, Linda
    et al.
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Johansson, Susanna
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Basmobilen2003Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    Today's mobile phones are gaining more and more functions; it is no longer unusual to be able to connect to the Internet or take pictures with the phone. But the question posed in this thesis is whether mobile phone users are actually interested in these new functions. Are there users who are interested in what this thesis calls the basic mobile ("basmobilen"), whose functions are the following: phone calls, SMS, phone book, call list, alarm and the ability to change the ringtone? After a historical overview of the development of the mobile phone, a description is given of what the basic mobile corresponds to. With the help of a questionnaire survey, interviews and collected diaries, the question of what mobile phone users are interested in is answered. The results show that the basic mobile is not uninteresting to the survey respondents, but there are also some who miss certain functions. This is then taken up in the discussion section, which also contains a proposal for how mobile phones should be built in order to suit as broad a user group as possible.

    Download full text (pdf)
    FULLTEXT01