951 - 1000 of 13202
  • 951.
    Azam, Muhammad
    Blekinge Institute of Technology, School of Engineering.
    Methods for Recovery of Missing Speech Packets (2011). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    In packetized voice communication, speech packets are sometimes lost due to data transmission problems, e.g., signal fading, interfering users, or noise. Different methods have been proposed for the recovery of missing speech packets. This thesis analyzes some of these recovery methods, including four variants of a waveform substitution method that are used in the objective analysis. This method is based on estimates of slowly varying speech parameters: the short-time energy (STE) and the zero-crossing (ZC) measure. The technique is implemented in two different ways based on these parameters, which are stored in the previous packet; if a speech packet is lost, it is recovered from the information stored in the previous packets. The two implementations differ only in their use of the zero-crossing information; the short-time energy estimation is the same in both. A slight modification is made to these two implementations, in which the estimated speech parameters are stored in both the previous and the future packets so that two consecutive packets can be recovered. This modification is applicable only if the speech signal is already saved at the transmitter, because it requires the future packets to store the information of previous packets, i.e., a non-causal solution. However, a causal solution is obtained if the signal is allowed to be delayed by one packet. The speech quality of the reconstructed signal is analyzed and compared across the four implementations. The implementations have been validated by subjectively observing the recovered speech packets and by considering the improvement in the objective measures mean opinion score (MOS), mean square error (MSE) and signal-to-noise ratio (SNR). The recovery of samples within a packet is also discussed; it is done with the Fast Fourier Transform (FFT) block code method, implemented as an iterative algorithm and validated by subjective observations and improvements in the objective measures MSE and SNR. Voice activity detection (VAD) is also used for the waveform substitution method and in the introduction of channel noise. After subjective observations and objective measurements, it is concluded that the modified method A provides better performance for the recovery of speech packets, and the FFT block code method has been validated for recovering the samples within a packet.
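The waveform-substitution idea can be illustrated compactly. The Python fragment below is a minimal sketch, not the thesis's implementation: it assumes a sampling rate FS and substitutes a lost packet with a sinusoid whose zero-crossing rate and short-time energy match the parameters stored with the previous packet.

```python
import numpy as np

FS = 8000  # assumed sampling rate (Hz)

def packet_parameters(packet):
    """Slowly varying parameters stored with each packet:
    short-time energy (STE) and zero-crossing (ZC) count."""
    x = np.asarray(packet, dtype=float)
    ste = np.mean(x ** 2)                       # short-time energy
    zc = int(np.sum(x[:-1] * x[1:] < 0))        # sign changes = zero crossings
    return ste, zc

def recover_lost_packet(prev_ste, prev_zc, packet_len):
    """Synthesize a stand-in for a lost packet: a sinusoid whose
    zero-crossing rate and energy match the stored parameters."""
    freq = prev_zc * FS / (2.0 * packet_len)    # ~2 crossings per period
    t = np.arange(packet_len) / FS
    wave = np.sin(2 * np.pi * freq * t)
    return np.sqrt(2.0 * prev_ste) * wave       # mean square of sin is 1/2
```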

  • 952.
    Azam, Muhammad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Ahmad, Luqman
    Blekinge Institute of Technology, School of Computing.
    A Comparative Evaluation of Usability for the iPhone and iPad (2011). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Many everyday systems and products seem to be designed with little regard to usability. This leads to frustration, wasted time and errors, so the usability of a product is important for its survival in the market. In many previous studies the usability of the iPhone and the iPad was evaluated individually, and very little work has been done on comparative usability evaluation. In particular, no study had compared the usability and measured the performance of the iPhone versus the iPad in a controlled environment. In this research work, the authors performed a comparative usability evaluation and measured the performance of the iPhone and iPad on selected applications, considering young users as well as elderly users. Another objective of this study was to identify usability issues in the performance of the iPhone and iPad. Survey and experiment techniques were used to achieve the defined objectives. The survey questionnaire consisted of 42 statements covering different usability aspects. The objectives of the survey study were to validate the issues identified in the literature study, identify new issues, and measure significant differences in user opinions of the iPhone and iPad. The experiment studies helped to measure the performance differences between the devices for three user groups (novice users, experienced users, elderly users) and among the groups over the devices. A further objective was to measure the satisfaction level of the participating users with the iPhone and iPad. The experiment was performed in a controlled environment. In total, six tasks (two tasks per application) were defined, and each participant performed the same task on both devices. Generally, the authors found that the participants performed better on the iPad, with lower error rates, compared to the iPhone.

  • 953.
    Azam, Muhammad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Hussain, Izhar
    Blekinge Institute of Technology, School of Computing.
    The Role of Interoperability in eHealth (2009). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    The lack of interoperability in systems and services has long been recognized as one of the major challenges to the wider implementation of eHealth applications. The opportunities and potential benefits of achieving interoperability are considerable, whereas various barriers and challenges act as impediments. The purpose of this study was to investigate interoperability among different health care organizations. The knowledge from this study should help health care organizations understand their interoperability problems. In the first phase, a literature review identified interoperability challenges in Sweden and other EU countries. On the basis of these findings, interviews were conducted to learn about the strategies and planning regarding interoperability in health care organizations. After analysis of the interviews, questionnaires were administered to gather the opinions of different medical IT administrators and health professionals. From the analysis of the interviews and questionnaires, the authors find that adopting an eHealth standard, a common system, a common medical language, and measures ensuring the security of patients' health record information could be implemented in health organizations in Sweden and other EU countries.

  • 954. Azhar, Damir
    et al.
    Riddle, Patricia
    Mendes, Emilia
    Blekinge Institute of Technology, School of Computing.
    Mittas, Nikolaos
    Angelis, Lefteris
    Using ensembles for web effort estimation (2013). Conference paper (Refereed)
    Abstract [en]

    Background: Despite the number of Web effort estimation techniques investigated, there is no consensus as to which technique produces the most accurate estimates, an issue shared by effort estimation in the general software estimation domain. A previous study in this domain has shown that ensembles of estimation techniques can be used to address this issue. Aim: The aim of this paper is to investigate whether ensembles of effort estimation techniques are similarly successful when used on Web project data. Method: The previous study built ensembles using solo effort estimation techniques that were deemed superior. In order to identify these superior techniques, two approaches were investigated: the first replicated the methodology used in the previous study, while the second used the Scott-Knott algorithm. Both approaches were applied to the same 90 solo estimation techniques on Web project data from the Tukutuku dataset. The replication identified 16 solo techniques that were deemed superior and were used to build 15 ensembles, while the Scott-Knott algorithm identified 19 superior solo techniques that were used to build two ensembles. Results: The ensembles produced by both approaches performed very well against solo effort estimation techniques. With the replication, the top 12 techniques were all ensembles, with the remaining 3 ensembles falling within the top 17 techniques. These 15 effort estimation ensembles, along with the 2 built by the second approach, were grouped into the best cluster of effort estimation techniques by the Scott-Knott algorithm. Conclusion: While it may not be possible to identify a single best technique, the results suggest that ensembles of estimation techniques consistently perform well even when using Web project data.
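The abstract does not spell out the rule used to combine solo estimates into an ensemble; a common choice, sketched below under that assumption, simply takes the mean or the median of the solo predictions.

```python
import statistics

def ensemble_estimate(solo_estimates):
    """Combine the predictions of several solo effort estimation
    techniques; mean and median are two common combination rules
    (the paper's own scheme may differ)."""
    return {"mean": statistics.mean(solo_estimates),
            "median": statistics.median(solo_estimates)}

# Three hypothetical solo techniques predicting effort in person-hours:
print(ensemble_estimate([120.0, 150.0, 138.0]))
# {'mean': 136.0, 'median': 138.0}
```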

  • 955.
    Azhar, Muhammad Saad Bin
    et al.
    Blekinge Institute of Technology, School of Computing.
    Aslam, Ammad
    Blekinge Institute of Technology, School of Computing.
    Multiple Coordinated Information Visualization Techniques in Control Room Environment (2009). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Presenting a large amount of multivariate data is not a simple problem. When there are multiple correlated variables involved, it becomes difficult to comprehend the data using traditional methods. Information Visualization techniques provide an interactive way to present and analyze such data. This thesis was carried out at ABB Corporate Research, Västerås, Sweden. The use of Parallel Coordinates and Multiple Coordinated Views has been suggested to realize interactive reporting and trending of multivariate data for ABB's Network Manager SCADA system. A prototype was developed, and an empirical study was conducted to evaluate the suggested design and test its usability from an actual industry perspective. With the help of this prototype and the evaluations carried out, we are able to present stronger results regarding the effectiveness and efficiency of the visualization techniques used. The results confirm that such interfaces are more effective, efficient and intuitive for filtering and analyzing multivariate data.

  • 956.
    Aziz, Faisal
    Blekinge Institute of Technology, School of Management.
    Contextual Analysis of IT-Business Alignment (2011). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    IT-Business alignment is an imperative process for any business seeking to exploit the competitive advantage of information technology. In this work, the influence of contextual information on IT-Business alignment has been studied. For this purpose, a conceptual model has been proposed which combines an IT-Business alignment model with national culture as contextual information. For the alignment model, the strategic alignment maturity model (SAMM) by Luftmann has been used, while Hofstede's model has been followed for the analysis of national culture. The conceptual model considers the six maturity assessment elements and analyzes them from the perspective of four national cultural factors. A survey methodology employing a questionnaire was used to compute the SAMM elements. National culture scores were obtained from Hofstede's empirical study. The empirical results for the SAMM elements and cultural factors have been used to support the analysis of the conceptual model.

  • 957.
    Aziz, Hussein
    Blekinge Institute of Technology, School of Computing.
    Streaming Video over Unreliable and Bandwidth Limited Networks (2013). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    The main objective of this thesis is to provide a smooth video playout on the mobile device over wireless networks. The parameters that specify the wireless channel include bandwidth variation, frame losses, and outage time. These parameters may affect the quality of the video negatively, and the mobile users may notice sudden stops during video playout, i.e., the picture is momentarily frozen, followed by a jump from one scene to a different one. This thesis focuses on eliminating frozen pictures and reducing the amount of video data that needs to be transmitted. In order to eliminate frozen scenes on the mobile screen, we propose three different techniques. In the first technique, the video frames are split into sub-frames, and these sub-frames are streamed over different channels. In the second technique the sub-frames are "crossed" and sent together with other sub-frames that come from different positions in the streaming video sequence. If some sub-frames are lost during the transmission, a reconstruction mechanism is applied on the mobile device to recreate the missing sub-frames. In the third technique, we propose a Time Interleaving Robust Streaming (TIRS) technique to stream the video frames in a different order. The benefit of this is to avoid losing a sequence of neighbouring frames; a missing frame from the streaming video is reconstructed based on the surrounding frames on the mobile device. In order to reduce the amount of video data streamed over limited-bandwidth channels, we propose two further techniques, both based on identifying and extracting a high-motion region of the video frames. We call this the Region Of Interest (ROI); the other parts of the video frames are called the non-Region Of Interest (non-ROI). The ROI is transmitted with high quality, whereas the non-ROI is interpolated from a number of reference frames. In the first technique the ROI is a region of fixed size; we considered four different types of ROI and three different scenarios. The scenarios are based on the position of the reference frames in the streaming frame sequence. In the second technique the ROI is identified based on the motion in the video frames; therefore the size, position, and shape of the ROI differ from one video to another according to the video characteristics. The videos are coded using ffmpeg to study the effect of the proposed techniques on the encoding size. Subjective and objective metrics are used to measure the quality level of the reconstructed videos obtained from the proposed techniques. Mean Opinion Score (MOS) measurements are used as a subjective metric based on human opinions, while as an objective metric the Structural Similarity (SSIM) index is used to compare the similarity between the original frames and the reconstructed frames.
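A recurring building block in this line of work is rebuilding a missing frame from its surviving neighbours. The sketch below is an illustrative stand-in (simple pixel-wise blending), not the thesis's actual reconstruction mechanism.

```python
import numpy as np

def interpolate_frame(prev_frame, next_frame, alpha=0.5):
    """Rebuild a missing frame as a weighted blend of its surviving
    neighbours; alpha weights the earlier frame. Frames are assumed
    to be equally sized numpy arrays (e.g., uint8 grayscale)."""
    blend = alpha * prev_frame.astype(np.float32) \
          + (1.0 - alpha) * next_frame.astype(np.float32)
    return blend.astype(prev_frame.dtype)
```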

  • 958. Aziz, Hussein Muzahim
    Enhancing the Smoothness of Streaming Video for Mobile Users over Unreliable Networks (2010). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Real-time video streaming over wireless networks is an increasingly important and attractive service for mobile users. Video streaming involves a large amount of data to be transmitted in real time, while wireless channel conditions may vary from time to time. It is hard to guarantee a reliable transmission over the wireless network, where the parameters specifying the transmissions are bandwidth, packet loss, packet delays, and outage times. The quality of the video is affected negatively when network packets are lost, and the mobile users may notice sudden stops during video playout; the picture is momentarily frozen, followed by a jump from one scene to a totally different one. The main objective of this thesis is to provide a smooth video playback on the mobile device over unreliable networks with a satisfactory video quality. Three different techniques are proposed to achieve this goal. The first technique streams duplicate gray scale frames over multiple channels; if frames are lost in one channel, they can be recovered from another channel. In the second technique, each video frame is split into sub-frames, which are streamed over multiple channels. If a sub-frame is missing after the transmission, a reconstruction mechanism is applied in the mobile device to recreate the missing sub-frames. In the third technique, we propose a time interleaving robust streaming (TIRS) technique to stream the video frames in a different order. The benefit of this is to avoid the loss of a sequence of neighbouring frames. A missing frame from the streaming video is then reconstructed based on the surrounding frames. The mean opinion score (MOS) metric is used to evaluate the video quality. The experienced quality of a video is subject to personal opinion, and satisfying the average human watching the contents of the video is the ultimate goal.

  • 959.
    Aziz, Hussein Muzahim
    et al.
    Blekinge Institute of Technology, School of Computing.
    Fiedler, Markus
    Blekinge Institute of Technology, School of Computing.
    Grahn, Håkan
    Blekinge Institute of Technology, School of Computing.
    Lundberg, Lars
    Blekinge Institute of Technology, School of Computing.
    Compressing Video Based on Region of Interest (2013). Conference paper (Refereed)
    Abstract [en]

    Real-time video streaming suffers from bandwidth limitations that cannot accommodate the high amount of video data. To reduce the amount of data to be streamed, we propose an adaptive technique that crops the important part of the video frames and drops the part outside it; this important part is called the Region of Interest (ROI). The Sum of Absolute Differences (SAD) is computed between consecutive video frames on the server side to identify and extract the ROI. The ROIs are extracted from the frames that lie between reference frames, based on three scenarios. The scenarios were designed to position the reference frames in the video frame sequence. Linear interpolation from the reference frames is performed on the mobile side to reconstruct the part outside the ROI. We evaluate the proposed approach for the three scenarios by looking at the size of the compressed videos and measuring the quality of the videos using the Mean Opinion Score (MOS). The results show that our technique significantly reduces the amount of data to be streamed over wireless networks while providing acceptable video quality to the mobile viewers.
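SAD-based motion detection is straightforward to sketch. In the fragment below, the block size and threshold are assumed values, and marking high-SAD blocks as the ROI is an illustration; the paper's actual ROI extraction may differ in detail.

```python
import numpy as np

def sad(frame_a, frame_b):
    """Sum of Absolute Differences between two equally sized
    grayscale frames; large values indicate high motion."""
    return int(np.sum(np.abs(frame_a.astype(np.int32)
                             - frame_b.astype(np.int32))))

def high_motion_mask(frame_a, frame_b, block=16, threshold=1000):
    """Mark block x block regions whose SAD exceeds a threshold;
    the union of marked blocks approximates a region of interest."""
    h, w = frame_a.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            a = frame_a[by*block:(by+1)*block, bx*block:(bx+1)*block]
            b = frame_b[by*block:(by+1)*block, bx*block:(bx+1)*block]
            mask[by, bx] = sad(a, b) > threshold
    return mask
```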

  • 960. Aziz, Hussein Muzahim
    et al.
    Fiedler, Markus
    Blekinge Institute of Technology, School of Computing.
    Grahn, Håkan
    Blekinge Institute of Technology, School of Computing.
    Lundberg, Lars
    Blekinge Institute of Technology, School of Computing.
    Eliminating the Effects of Freezing Frames on User Perceptive by Using a Time Interleaving Technique (2012). In: Multimedia Systems, ISSN 0942-4962, E-ISSN 1432-1882, Vol. 18, no. 3, p. 251-262. Article in journal (Refereed)
    Abstract [en]

    Streaming video over a wireless network faces several challenges such as high packet error rates, bandwidth variations, and delays, which can have negative effects on the video streaming; the viewer will perceive a frozen picture for certain durations due to loss of frames. In this study, we propose a Time Interleaving Robust Streaming (TIRS) technique to significantly reduce the frozen video problem and provide a satisfactory quality for the mobile viewer. This is done by reordering the streaming video frames into groups of even and odd frames. The objective of streaming the video in this way is to avoid the loss of a sequence of neighbouring frames in case of a long sequence interruption. We evaluate our approach by using a user panel and mean opinion score (MOS) measurements, where the users observe three levels of frame losses. The results show that our technique significantly improves the smoothness of the video on the mobile device in the presence of frame losses, while the transmitted data increase by only about 9% (due to reduced time locality).
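The even/odd reordering at the heart of TIRS is easy to sketch; the fragment below illustrates the idea and its inverse at the receiver, not the paper's exact implementation.

```python
def tirs_order(frames):
    """Reorder a frame sequence as all even-indexed frames followed
    by all odd-indexed frames, so a burst loss on the channel never
    wipes out a run of neighbouring frames; a frame lost from one
    half can be interpolated from its surviving neighbours."""
    return frames[0::2] + frames[1::2]

def tirs_restore(interleaved):
    """Invert tirs_order on the receiving side."""
    half = (len(interleaved) + 1) // 2
    evens, odds = interleaved[:half], interleaved[half:]
    return [evens[i // 2] if i % 2 == 0 else odds[i // 2]
            for i in range(len(interleaved))]

assert tirs_restore(tirs_order(list(range(9)))) == list(range(9))
```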

  • 961. Aziz, Hussein Muzahim
    et al.
    Fiedler, Markus
    Grahn, Håkan
    Lundberg, Lars
    Streaming Video as Space-Divided Sub-Frames over Wireless Networks (2010). Conference paper (Refereed)
    Abstract [en]

    Real-time video streaming suffers from lost, delayed, and corrupted frames due to transmission over error-prone channels. As a result, the user may notice a frozen picture on the screen. In this work, we propose a technique to eliminate frozen video and provide satisfactory quality to the mobile viewer by splitting the video frames into sub-frames. Multiple description coding (MDC) is used to generate multiple bitstreams based on frame splitting, transmitted over multiple channels. We evaluate our approach by using mean opinion score (MOS) measurements in scenarios where the users observe three levels of frame losses for real-time video streaming. The results show that our technique significantly improves the video smoothness on the mobile device in the presence of frame losses during transmission.
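The papers in this series do not fix a single split pattern here; the sketch below assumes a 2x2 pixel decimation into four sub-frames (descriptions), with a lost sub-frame filled in from a surviving one at the receiver.

```python
import numpy as np

def split_subframes(frame):
    """Split a frame (even height/width) into four spatially
    decimated sub-frames, one per 2x2 pixel phase, each of which
    can be sent on its own channel/description."""
    return [frame[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)]

def merge_subframes(subs, shape):
    """Reassemble the received sub-frames; a lost sub-frame (None)
    is filled from a surviving one, so the viewer sees a slightly
    blurred frame instead of a frozen one. Assumes at least one
    sub-frame arrived."""
    fallback = next(s for s in subs if s is not None)
    out = np.empty(shape, dtype=fallback.dtype)
    phases = [(dy, dx) for dy in (0, 1) for dx in (0, 1)]
    for (dy, dx), s in zip(phases, subs):
        out[dy::2, dx::2] = s if s is not None else fallback
    return out
```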

  • 962. Aziz, Hussein Muzahim
    et al.
    Grahn, Håkan
    Lundberg, Lars
    Eliminating the Freezing Frames for the Mobile User over Unreliable Wireless Networks (2009). Conference paper (Refereed)
    Abstract [en]

    The main challenge of real-time video streaming over a wireless network is to provide good quality of service (QoS) to the mobile viewer. However, wireless networks have a limited bandwidth that may not be able to handle the continuous video frame sequence, and video frames may be dropped or corrupted during transmission, which can severely affect the video quality. In this study we propose a mechanism to eliminate frozen video and provide satisfactory quality to the mobile viewer. This is done by splitting the video frames into sub-frames that are transmitted over multiple channels. We present a subjective test based on the Mean Opinion Score (MOS), used to evaluate scenarios where the users observe three levels of frame losses for real-time video streaming. The results indicate that our technique significantly improves the perceived video quality.

  • 963. Aziz, Hussein Muzahim
    et al.
    Grahn, Håkan
    Lundberg, Lars
    Sub-Frame Crossing for Streaming Video over Wireless Networks (2010). Conference paper (Refereed)
    Abstract [en]

    Transmitting real-time streaming video over a wireless network cannot guarantee that all frames will be received by the mobile devices. The characteristics of a wireless network in terms of available bandwidth, frame delay, and frame losses cannot be known in advance. In this work, we propose a new mechanism for streaming video over a wireless channel that prevents frozen frames on mobile devices. This is done by splitting each video frame into two sub-frames and combining each with a sub-frame from a different position in the streaming video sequence. If a frame is lost or dropped, there is still a possibility that the other half (sub-frame) will be received by the mobile device. The received sub-frames are reconstructed to their original shape. A rate adaptation mechanism is also highlighted in this work. We show that the server can skip up to 50% of the sub-frames while we are still able to reconstruct the received sub-frames and eliminate the frozen picture on the mobile device.

  • 964. Aziz, Hussein Muzahim
    et al.
    Lundberg, Lars
    Graceful degradation of mobile video quality over wireless network (2009). Conference paper (Refereed)
    Abstract [en]

    Real-time video transmission over wireless channels has become an important topic in wireless communication because the limited bandwidth of wireless networks has to handle a high amount of video frames. Video frames must arrive at the client before the playout time, with enough time to display the contents of the frames. Real-time video transmission is particularly sensitive to delay as it has a strict bounded end-to-end delay constraint; video applications impose stringent requirements on communication parameters, and frames lost or dropped due to excessive delay are the primary factors affecting the user-perceived quality. In this study we investigate ways of obtaining a graceful and controlled degradation of the quality by introducing redundancy in the frame sequence and compensating for this by limiting colour coding and resolution. The effect is a double streaming mechanism: we obtain less freezing at the expense of limited colours and resolution. Our experiments apply to scenarios where users observe three types of dropping load for real-time video streaming. The measurement used in this study to evaluate the video quality is the mean opinion score, and we demonstrate and argue that the proposed technique improves the user-perceived video quality.

  • 965. Aziz, Maryam
    et al.
    Masum, M. E.
    Babu, M. J.
    Rahman, Suhaimi Ab
    Nordberg, Jörgen
    Blekinge Institute of Technology, School of Computing.
    Mobility impact on the end-to-end delay performance for VoIP over LTE (2012). In: Procedia Engineering, Coimbatore: Elsevier, 2012, Vol. 30, p. 491-498. Conference paper (Refereed)
    Abstract [en]

    Long Term Evolution (LTE) is the last step towards the 4th generation of cellular networks. This evolution is necessitated by the unceasing increase in demand for high-speed connections on cellular networks. This paper focuses on the evaluation of end-to-end (E2E) delay under variable mobility speed for VoIP (Voice over IP) in the LTE network. For the E2E performance evaluation, three scenarios have been modeled in a simulation approach using OPNET 16.0. The first is the baseline network; of the other two, one consists of VoIP traffic solely and the other consists of FTP along with VoIP. E2E delay has been measured for these scenarios in various cases under varying mobility speed of the node. The simulation results have been studied and presented as a comparative performance analysis of the three network scenarios. In light of the result analysis, the performance quality of a VoIP network (with and without the presence of additional network traffic) in LTE has been determined and discussed. The simulation results for the baseline (non-congested) VoIP network, the congested VoIP network, and the congested VoIP with FTP network show that as the speed of the node is gradually increased, the E2E delay slightly increases.

  • 966.
    Aziz, Md. Tariq
    et al.
    Blekinge Institute of Technology, School of Computing.
    Islam, Mohammad Saiful
    Blekinge Institute of Technology, School of Computing.
    Performance Evaluation of Real-Time Applications over DiffServ/MPLS in IPv4/IPv6 Networks (2011). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Over the last years, we have witnessed a rapid deployment of real-time applications on the Internet, as well as much research on Quality of Service (QoS), particularly in IPv4 (Internet Protocol version 4). The inevitable exhaustion of the remaining IPv4 address pool has become progressively evident. As the evolution of the Internet Protocol (IP) continues, the deployment of IPv6 QoS is underway. Today, there is limited experience in the deployment of QoS for IPv6 traffic in MPLS backbone networks in conjunction with DiffServ (Differentiated Services) support. DiffServ itself does not have the ability to control the traffic on an end-to-end path when a number of links of the path are congested. In contrast, MPLS Traffic Engineering (TE) is able to control the traffic and can set up an end-to-end routing path before data is forwarded. From the evolution of IPv4 QoS solutions, we know that the integration of DiffServ and MPLS TE satisfies the guaranteed QoS requirements for real-time applications. This thesis presents a QoS performance study of real-time applications, such as voice and video conferencing, over DiffServ with or without MPLS TE in IPv4/IPv6 networks using the Optimized Network Engineering Tool (OPNET). The thesis also studies the interaction of Expedited Forwarding (EF) and Assured Forwarding (AF) traffic aggregation and link congestion, as well as the effect on various performance metrics such as packet end-to-end delay, packet delay variation, queuing delay, throughput and packet loss. The effectiveness of DiffServ and MPLS TE integration in IPv4/IPv6 networks is illustrated and analyzed. The thesis shows that IPv6 experiences higher delay and loss than its IPv4 counterpart.

  • 967.
    Aziz, Reuben
    Blekinge Institute of Technology, School of Management.
    THE REDENOMINATION OF THE GHANAIAN CURRENCY (2007): A STUDY OF ITS IMPACT ON THE BUSINESS OF THE FINANCIAL INSTITUTIONS IN GHANA (2009). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    The cedi is Ghana's official currency, introduced on July 19, 1965. The loss in the value of the cedi since its introduction cannot be overestimated. Owing to the low values of the notes and coins (due to persistent loss in value), huge quantities had to be printed and minted, resulting in huge costs to the central bank. Meanwhile, commercial banks were facing high cash-based transaction costs because relatively large quantities of notes were needed for transactions. There was also high risk involved in cash-based transactions for the banks and their clients. Bank customers were becoming more uncomfortable carrying huge quantities of cash to and from the banks, and they also had to spend more time in the banking halls to get served. These and other factors reduced the interest and confidence of the general public in the financial sector, affecting banking businesses. The redenomination of the cedi was carried out in 2007 to deal with this 'huge dead-weight burden' on the banks and the entire economy of Ghana. The objective of this study is to explore its impact on some variables affecting commercial banking businesses: cost, operational risk, deposit mobilization, ATM operations, and the reliability and convenience of banking services. A pluralistic approach was adopted for this research, and the results of both the quantitative and the qualitative study done in analyzing my hypothesis reaffirmed each other and provided valuable findings and a deeper understanding of the impact of the redenomination on the business of the financial institutions. The main findings showed the following: 1) the redenomination had not affected costs in banks, 2) the redenomination has generally reduced operational loss risks, 3) the redenomination has generally improved deposit mobilization, 4) the redenomination has improved the reliability and profitability of ATMs, 5) the redenomination has improved the reliability and convenience of banking services. With this study I hope to provide new insights on how redenomination affects the business of financial institutions, which are key partners to the central banks in the successful implementation of such an exercise. I also hope to provide valuable recommendations on how banks can deal with the challenges that may be presented by a redenomination.

  • 968.
    Aziz, Yassar
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Aslam, Muhammad Naeem
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Traffic Engineering with Multi-Protocol Label Switching, Performance Comparison with IP networks (2008). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Traffic Engineering (TE) deals with geometric design planning and traffic operation of networks, network devices, and the relationship of routers for the transportation of data. TE is the branch of network engineering that concentrates on performance optimization of operational networks. It involves techniques and applied knowledge to achieve performance objectives, including the movement of data through the network, reliability, planning of network capacity, and efficient use of network resources. This thesis addresses the problems of traffic engineering and suggests a solution using the concept of Multi-Protocol Label Switching (MPLS). We have performed simulations in the Matlab environment to compare the performance of MPLS against an IP network. MPLS is a modern technique for forwarding network data; it broadens routing with path control and packet forwarding. In this thesis MPLS is assessed on the basis of its performance and efficiency in sending data from source to destination. A MATLAB-based simulation tool was developed to compare MPLS with an IP network in a simulated environment. The results show the performance of the MPLS network in comparison with the IP network.

  • 969.
    Azizi, Rokneddin
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Amiryousefi, Hossein
    Blekinge Institute of Technology, School of Engineering.
    Estimating of Work Hardening in Bent Sheet Metal Products at an Early Stage of Virtual Product Development (2012). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    The thesis has two aims: firstly, to investigate the influence of the work hardening captured in a forming simulation on subsequent test simulation. The investigated question was: is it worth the effort to include the work hardening effect in test simulation? The second aim was to propose a quicker method to estimate work hardening without forming simulation. The first aim was investigated by two methods. In Method A, a forming simulation was carried out on the steel blank sheet and followed by test simulation. In Method B, the test was simulated using just the CAD geometry of the same component with virgin material properties. The 2D and 3D test simulations were performed, and it was found that the result of Method A is about 30% higher than that of Method B, which suggests that it is essential to include work hardening. The problem is that at an early stage of product development the data of the forming process is not available. To solve this problem, an approach is proposed in the thesis based on estimating the hardening effect analytically, calculating the strain from geometrical parameters alone. This calculated, so-called postulated strain was then transferred to the test simulation (Method C). Validation of the test simulation results of Method C against those obtained from Method A showed that they were almost identical. The results of the study show that the proposed quick method of including the forming history in test simulation can be used instead of very complex and time-consuming FEM simulations of the forming process.
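The abstract does not give the analytical strain formula; the standard textbook estimate for the strain at the outer fibre of a bent sheet, shown below for illustration, depends only on sheet thickness and bend radius and is an assumption here, not necessarily the thesis's exact "postulated strain".

```python
def outer_fiber_strain(thickness_mm, bend_radius_mm):
    """Textbook estimate of the strain at the outer fibre of a bent
    sheet, with the neutral axis assumed at mid-thickness:
    strain = t / (2R + t), where R is the inside bend radius."""
    t, r = thickness_mm, bend_radius_mm
    return t / (2.0 * r + t)

print(outer_fiber_strain(2.0, 10.0))  # 2 mm sheet, 10 mm radius -> ~0.091
```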

  • 970.
    Azuma, Chieko
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Coletinha, Elvio
    Blekinge Institute of Technology, Faculty of Engineering, Department of Strategic Sustainable Development.
    Villoch, Pablo
    An Exploratory Journey into Sustainability Changemakers Learning Programs (2010). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Humanity is facing highly complex challenges at a global scale. A new sort of conscious sustainability changemakers is needed to face the sustainability challenge. However, mainstream entrepreneurship education tends to focus on business-as-usual skills, with a significant lack of comprehensive understanding of the whole system and of the inner work needed to overcome the mental barriers to becoming sustainability changemakers. While the Framework for Strategic Sustainable Development was used as a structured approach to the topic, the research design was based on a dynamic interactive research model. Theory U guided the data gathering process, which included participatory observation and dialogues with the organizers and participants of seven progressive schools in Europe. The research aims to identify the common assumptions that guide the design of leading-edge learning programs for sustainability changemakers. Building on the findings, the authors present a prototype of a learning tool in the form of a self-reflection card game, with the intention of helping the next generation of changemakers in their learning journey towards sustainability. The conclusions detail specific guidelines for designing a learning program for changemakers towards sustainability.

  • 971.
    Baan, Christopher
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Long, Phil
    Blekinge Institute of Technology, School of Engineering.
    Pearlman, Dana
    Blekinge Institute of Technology, School of Engineering.
    Cultivating personal leadership capacities to facilitate collaboration in Strategic Sustainable Development (2011). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    The complex, multi-faceted sustainability challenge that society faces calls for a strategic approach to sustainable development. Strategic planning processes towards sustainability in organisations and communities are often led by a facilitator or facilitative leader. We argue that planning processes for complex and transformational change call for collaboration among stakeholders and for highly skilled facilitative leaders who are committed to the development of self, others and society. This thesis explores the 'interior state' of facilitative leaders as a high-leverage point in moving society towards sustainability. We identify nine personal capacities that enable leaders to facilitate collaboration in Strategic Sustainable Development: (1) Being Present, (2) Whole Self-Awareness, (3) Suspension & Letting Go, (4) Compassion, (5) Intention Aligned with Higher Purpose, (6) Whole System Awareness, (7) Personal Power, (8) Sense of Humour, and (9) Holding Dualities and Paradoxes. We identify a range of personal and collective practices that help develop these personal capacities. We propose that these capacities are the foundation for a more holistic and authentic facilitation approach applied to strategic sustainable development.

  • 972.
    Babaeeghazvini, Parinaz
    Blekinge Institute of Technology, School of Engineering.
    EEG enhancement for EEG source localization in brain-machine speller (2013). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    A Brain-Computer Interface (BCI) is a system for communicating with the external world through brain activity. The brain activity is measured by electroencephalography (EEG) and then processed by a BCI system. EEG source reconstruction can be a way to improve the accuracy of EEG classification in an EEG-based brain-computer interface (BCI). In this thesis, BCI methods were applied to derived sources whose EEG enhancement made it possible to obtain a more accurate EEG detection, bringing a new application to BCI technology: the recognition of imagined letter writing from brain waves. The BCI system enables people to write and type letters through their brain activity (EEG). To this end, the first part of the thesis is dedicated to EEG source reconstruction techniques used to select the most optimal EEG channels for task classification. For this purpose, the change in EEG signal power from the rest state to the motor imagery task was used to find the location of an active single equivalent dipole. Applying an inverse problem solution to the power changes with the Multiple Sparse Priors (MSP) method generated a scalp map whose fit showed the localization of the EEG electrodes. Given the optimized locations, the secondary objective was to choose the most optimal EEG features and rhythm for an efficient classification. This was achieved by feature ranking with the 1-Nearest Neighbor leave-one-out method. The feature vectors were computed by applying the combined methods of the multitaper method and Pwelch. The features were classified by several methods: a normal-densities-based quadratic classifier (qdc), a k-nearest neighbor classifier (knn), Mixture of Gaussians classification, and a neural network classifier trained using back-propagation. The results show that the selected features and classifiers are able to recognize the imagination of writing letters of the alphabet with high accuracy.
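The feature-ranking step can be sketched as follows. Scoring each feature in isolation by leave-one-out 1-NN accuracy is one common variant and is an assumption here, since the thesis's exact ranking criterion is not stated in the abstract.

```python
import numpy as np

def loo_1nn_accuracy(X, y):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier:
    each sample is labelled by its closest *other* sample."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # a sample may not match itself
    return float(np.mean(y[np.argmin(d, axis=1)] == y))

def rank_features(X, y):
    """Score each feature by the LOO 1-NN accuracy it achieves alone,
    then rank features from most to least discriminative."""
    scores = [loo_1nn_accuracy(X[:, [j]], y) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1], scores
```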

  • 973.
    Babar, Shahzad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mehmood, Aamer
    Blekinge Institute of Technology, School of Computing.
    Enhancing Accessibility of Web Based GIS Applications through User Centered Design (2010). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Web accessibility emerged as a problem when disabled and elderly people started interacting with web contents soon after the inception of the World Wide Web. When web-based GIS applications appeared on the web and the number of users of these kinds of applications increased, the applications faced a similar accessibility problem. The intensity of web accessibility problems in GIS-based applications has increased rapidly during recent years due to the extensive interaction of the user with maps. Web accessibility problems faced by users of GIS applications are identified by content evaluation and user interaction. Users are involved in the identification of accessibility problems because guidelines and automated tools are not sufficient for that purpose. A User Centered Design (UCD) approach is used to include users in the development process, which also helps in identifying the accessibility problems of the users at early stages. This thesis identifies the accessibility issues in web-based GIS applications by content evaluation and user interaction evaluation. MapQuest, a web-based GIS application, is taken as a case study to identify the web accessibility problems in GIS applications. The thesis also studies how the accessibility of web-based GIS applications can be enhanced by using a UCD approach in the development process of GIS applications.

  • 974. Baca, Dejan
    Automated static code analysis: A tool for early vulnerability detection (2009). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Software vulnerabilities are added into programs during development. Architectural flaws are introduced during planning and design, while implementation faults are created during coding. Penetration testing is often used to detect these vulnerabilities. This approach is expensive because it is performed late in development, and any correction would increase lead-time. An alternative would be to detect and correct vulnerabilities in the phase of development where they are the least expensive to detect and correct. Source code audits have often been suggested and used to detect implementation vulnerabilities. However, manual audits are time-consuming and require extended expertise to be efficient. A static code analysis tool could achieve the same results as a manual audit, but at a fraction of the time. Through a set of case studies and experiments at Ericsson AB, this thesis investigates the technical capabilities and limitations of using a static analysis tool as an early vulnerability detector. The investigation is extended to the human factor by examining how the developers interact with and use the static analysis tool. The contributions of this thesis include the identification of the tool's capabilities, so that further security improvements can focus on other types of vulnerabilities. Possible cost-saving measures from using static analysis early in development are identified. Additionally, the thesis presents the limitations of static code analysis, the most important being the incorrect warnings that are reported by static analysis tools. In addition, a development process overhead was deemed necessary to successfully use static analysis in an industry setting.

  • 975.
    Baca, Dejan
    Blekinge Institute of Technology, School of Computing.
    Developing Secure Software: in an Agile Process (2012). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Background: Software developers are facing increased pressure to lower development time, release new software versions more frequently to customers, and adapt to a faster market. This new environment forces developers and companies to move from a plan-based waterfall development process to a flexible agile process. By minimizing the pre-development planning and instead increasing the communication between customers and developers, the agile process tries to create a new, more flexible way of working. This new way of working allows developers to focus their efforts on the features that customers want. With increased connectivity and faster feature releases, the security of the software product is stressed. To develop secure software, many companies use security engineering processes that are plan-heavy and inflexible. These two approaches are each other's opposites, and they directly contradict each other. Objective: The objective of the thesis is to evaluate how to develop secure software in an agile process. In particular, which existing best practices can be incorporated into an agile project and still provide the same benefit as if the project were using a waterfall process, and how the best practices can be incorporated and adapted to fit the process while still measuring the improvement. Some security engineering concepts are useful, but the best practice is not agile-compatible and would require extensive adaptation to integrate with an agile project. Method: The primary research method used throughout the thesis is case studies conducted in a real industry setting. As secondary methods for data collection, a variety of approaches have been used, such as semi-structured interviews, workshops, study of literature, and use of historical data from the industry. Results: The security engineering best practices were investigated through a series of case studies. The basic compatibility of agile and security engineering was assessed in the literature, by developers, and in practical studies. The security engineering best practices were grouped based on their purpose and their compatibility with the agile process. One well-known and popular best practice, automated static code analysis, was thoroughly investigated for its usefulness, deployment and risks of use as part of the process. For the risk analysis practices, a novel approach was introduced and improved. In this way, a means of adapting existing practices to agile is proposed. Conclusion: With regard to agile and security engineering, we did not find that any of the investigated processes was agile-compatible. Agile is reaction-driven and adapts to change, while the security engineering processes are proactive and try to prevent threats before they happen. To develop secure software in an agile process, the developers should adopt and adapt key concepts from security engineering. These changes will affect the flexibility of the agile process, but they are a necessity if developers want the same software security state that security engineering processes can provide.

  • 976. Baca, Dejan
    Identifying Security Relevant Warnings from Static Code Analysis Tools through Code Tainting (2010). Conference paper (Refereed)
    Abstract [en]

    Static code analysis tools are often used by developers as early vulnerability detectors. Due to their automation they are less time-consuming and error-prone than manual reviews. However, they produce large quantities of warnings that developers have to manually examine and understand. In this paper, we look at a solution that makes static code analysis tools more useful as early vulnerability detectors. We use flow-sensitive, interprocedural and context-sensitive data flow analysis to determine the point of user input and its migration through the source code to the actual exploit. By determining a vulnerability's point of entry, we lower the number of warnings a tool produces and we provide the developer with more information about why a warning could be a real security threat. We use our approach in three different ways, depending on the tool examined. First, with the commercial static code analysis tool Coverity, we reanalyze its results and create a set of warnings that are specifically relevant from a security perspective. Second, we alter the open source analysis tool FindBugs to only analyze code that has been tainted by user input. Third, we create our own analysis tool that focuses on XSS vulnerabilities in Java code.
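The filtering idea can be sketched independently of any particular tool. Everything in the fragment below (the warning format, the taint sources, reaches_user_input) is an illustrative assumption, not the paper's implementation.

```python
# Illustrative taint sources; real tools use curated source lists.
TAINT_SOURCES = {"request.getParameter", "scanner.nextLine", "recv"}

def reaches_user_input(def_use_chain):
    """True if any definition feeding the warned statement originates
    at a known user-input source (data-flow reachability)."""
    return any(site in TAINT_SOURCES for site in def_use_chain)

def security_relevant(warnings):
    """Keep only warnings whose flagged expression is reachable from
    user input; untainted warnings are far less likely to be
    exploitable and are filtered out."""
    return [w for w in warnings if reaches_user_input(w["def_use_chain"])]

# Example warning records (hypothetical shape):
warnings = [
    {"id": 1, "kind": "XSS",
     "def_use_chain": ["request.getParameter", "concat", "println"]},
    {"id": 2, "kind": "XSS",
     "def_use_chain": ["constant", "println"]},
]
print(security_relevant(warnings))   # only warning 1 survives
```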

  • 977. Baca, Dejan
    et al.
    Boldt, Martin
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Carlsson, Bengt
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Jacobsson, Andreas
    A Novel Security-Enhanced Agile Software Development Process Applied in an Industrial Setting (2015). In: Proceedings 10th International Conference on Availability, Reliability and Security ARES 2015, IEEE Computer Society Digital Library, 2015. Conference paper (Refereed)
    Abstract [en]

    A security-enhanced agile software development process, SEAP, is introduced in the development of a mobile money transfer system at Ericsson Corp. A specific characteristic of SEAP is that it includes a security group consisting of four different competences, i.e., security manager, security architect, security master and penetration tester. Another significant feature of SEAP is an integrated risk analysis process. In analyzing risks in the development of the mobile money transfer system, a general finding was that SEAP either solves risks that were previously postponed or solves a larger proportion of the risks in a timely manner. The previous software development process, i.e., the baseline process of the comparison outlined in this paper, required 2.7 employee hours for every risk identified in the analysis process, compared to an average of 1.5 hours for SEAP. The baseline development process left 50% of the risks unattended in the software version being developed, while SEAP reduced that figure to 22%. Furthermore, SEAP increased the proportion of risks that were corrected from 12.5% to 67.1%, i.e., more than a five-fold increase. This is important, since an early correction may avoid severe attacks in the future. The security competence in SEAP accounts for 5% of the personnel cost in the mobile money transfer system project. As a comparison, the corresponding figure for security was 1% in the previous development process.

  • 978. Baca, Dejan
    et al.
    Carlsson, Bengt
    Agile development with security engineering activities (2011). Conference paper (Refereed)
    Abstract [en]

    Agile software development has been used by industry to create a more flexible and lean software development process, i.e., making it possible to develop software at a faster rate and with more agility during development. There are, however, concerns that the higher development pace and lack of documentation create less secure software. We have therefore looked at three known security engineering processes, Microsoft SDL, Cigital's Touchpoints and Common Criteria, and identified which specific security activities they perform. We then compared these activities with an agile development process that is used in industry. Developers from a large telecommunication manufacturer were interviewed to learn their impressions of using these security activities in an agile development process. We produced a security-enhanced agile development process that we present in this paper. This new agile process uses activities from already established security engineering processes that provide the benefits the developers wanted, while not hindering or obstructing the agile process in a significant way.

  • 979. Baca, Dejan
    et al.
    Carlsson, Bengt
    Lundberg, Lars
    Evaluating the Cost Reduction of Static Code Analysis for Software Security (2008). Conference paper (Refereed)
    Abstract [en]

    Automated static code analysis is an efficient technique to increase the quality of software during early development. This paper presents a case study in which mature software with known vulnerabilities is subjected to a static analysis tool. The value of the tool is estimated based on reported failures from customers. An average of 17% cost savings would have been possible if the static analysis tool had been used. The tool also had a 30% success rate in detecting known vulnerabilities and at the same time found 59 new vulnerabilities in the three examined products.

  • 980.
    Baca, Dejan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Carlsson, Bengt
    Blekinge Institute of Technology, School of Computing.
    Petersen, Kai
    Blekinge Institute of Technology, School of Computing.
    Lundberg, Lars
    Blekinge Institute of Technology, School of Computing.
    Improving software security with static automated code analysis in an industry setting (2013). In: Software, Practice & Experience, ISSN 0038-0644, E-ISSN 1097-024X, Vol. 43, no. 3, p. 259-279. Article in journal (Refereed)
    Abstract [en]

    Software security can be improved by identifying and correcting vulnerabilities. In order to reduce the cost of rework, vulnerabilities should be detected as early and efficiently as possible. Static automated code analysis is an approach for early detection. So far, only few empirical studies have been conducted in an industrial context to evaluate static automated code analysis. A case study was conducted to evaluate static code analysis in industry, focusing on defect detection capability, deployment, and usage of static automated code analysis with a focus on software security. We identified that the tool was capable of detecting memory-related vulnerabilities, but few vulnerabilities of other types. The deployment of the tool played an important role in its success as an early vulnerability detector, as did the developers' perception of the tool's merit. Classifying the warnings from the tool was harder for the developers than correcting them. The correction of false positives in some cases created new vulnerabilities in previously safe code. With regard to defect detection ability, we conclude that static code analysis is able to identify vulnerabilities in different categories. In terms of deployment, we conclude that the tool should be integrated with bug reporting systems, and developers need to share the responsibility for classifying and reporting warnings. With regard to tool usage by developers, we propose to use multiple persons (at least two) when classifying a warning. The same goes for making the decision on how to act based on the warning.

  • 981.
    Baca, Dejan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Petersen, Kai
    Blekinge Institute of Technology, School of Computing.
    Countermeasure graphs for software security risk assessment: An action research (2013). In: Journal of Systems and Software, ISSN 0164-1212, Vol. 86, no. 9, p. 2411-2428. Article in journal (Refereed)
    Abstract [en]

    Software security risk analysis is an important part of improving software quality. In previous research we proposed countermeasure graphs (CGs), an approach to risk analysis that combines the ideas of different risk analysis approaches. The approach was designed for reuse and easy evolvability to support agile software development. CGs had not been evaluated in industry practice in agile software development. In this research we evaluate the ability of CGs to support practitioners in identifying the most critical threats and countermeasures. The research method used is participatory action research, where CGs were evaluated in a series of risk analyses on four different telecom products. With Peltier (used prior to the use of CGs at the company) the practitioners identified attacks with low to medium risk levels. CGs allowed practitioners to identify more serious risks (in the first iteration, 1 serious threat, 5 high-risk threats, and 11 medium threats). The need for tool support was identified very early; tool support allowed the practitioners to play through scenarios of which countermeasures to implement, and supported reuse. The results indicate that CGs support practitioners in identifying high-risk security threats, work well in an agile software development context, and are cost-effective.
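A countermeasure graph can be sketched as a small bipartite structure linking threats to the countermeasures that mitigate them. The risk-per-cost scoring rule below is illustrative, not necessarily the scoring used in the paper, and all names and numbers are made up.

```python
# Threat -> risk score (hypothetical values)
threats = {"T1": 9, "T2": 5, "T3": 3}

# Countermeasure -> (threats it mitigates, implementation cost)
mitigates = {
    "input validation": ({"T1", "T2"}, 2),
    "rate limiting":    ({"T3"}, 1),
    "code signing":     ({"T1"}, 4),
}

def prioritize(threats, mitigates):
    """Rank countermeasures by total mitigated risk per unit cost."""
    scored = [(sum(threats[t] for t in ts) / cost, name)
              for name, (ts, cost) in mitigates.items()]
    return [name for _score, name in sorted(scored, reverse=True)]

print(prioritize(threats, mitigates))
# ['input validation', 'rate limiting', 'code signing']
```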

  • 982. Baca, Dejan
    et al.
    Petersen, Kai
    Prioritizing Countermeasures through the Countermeasure Method for Software Security (CM-Sec), 2010. Conference paper (Refereed)
    Abstract [en]

    Software security is an important quality aspect of a software system. Therefore, it is important to integrate software security touch points throughout the development life-cycle. So far, the focus of touch points in the early phases has been on the identification of threats and attacks. In this paper we propose a novel method focusing on the end product by prioritizing countermeasures. The method provides an extension to attack trees and a process for the identification and prioritization of countermeasures. The approach has been applied to an open-source application, showing that countermeasures could be identified. Furthermore, an analysis of the effectiveness and cost-efficiency of the countermeasures could be provided.
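
    The effectiveness and cost-efficiency analysis mentioned above can be illustrated by ranking countermeasures on risk reduction per unit cost. The figures and the simple ratio below are assumptions made for the sketch, not the paper's actual model:

        # Hypothetical candidates: (risk reduction, implementation cost in person-days).
        candidates = {
            "input_validation": (20, 5),
            "web_app_firewall": (32, 40),
            "append_only_logs": (6, 2),
        }

        # Cost-efficiency: risk reduction gained per person-day spent.
        ranked = sorted(candidates.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
        for name, (reduction, cost) in ranked:
            print(f"{name}: {reduction / cost:.2f} risk reduction per person-day")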

  • 983. Baca, Dejan
    et al.
    Petersen, Kai
    Carlsson, Bengt
    Lundberg, Lars
    Static Code Analysis to Detect Software Security Vulnerabilities: Does Experience Matter? 2009. Conference paper (Refereed)
    Abstract [en]

    Code reviews with static analysis tools are today recommended by several security development processes. Developers are expected to use the tools' output to detect the security threats they themselves have introduced in the source code. This approach assumes that all developers can correctly identify a warning from a static analysis tool (SAT) as a security threat that needs to be corrected. We have conducted an industry experiment with a state-of-the-art static analysis tool and real vulnerabilities. We found that average developers do not correctly identify the security warnings and that only developers with specific experience are better than chance in detecting the security vulnerabilities. Specific SAT experience more than doubled the number of correct answers, and a combination of security experience and SAT experience almost tripled the number of correct security answers.
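
    The claim "better than chance" can be made concrete with a one-sided binomial test: given n warnings with two equally likely classifications, how unlikely is it to get k or more correct by guessing? The counts below are invented for illustration; the experiment's actual data are in the paper:

        from math import comb

        def binomial_p_value(k, n, p=0.5):
            # One-sided probability of k or more successes out of n at chance level p.
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        # Hypothetical: a developer classifies 20 warnings and gets 15 right.
        print(f"p-value: {binomial_p_value(15, 20):.4f}")  # ~0.0207, better than chance at the 5% level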

  • 984.
    Bachu, Rajesh
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    A framework to migrate and replicate VMware Virtual Machines to Amazon Elastic Compute Cloud: Performance comparison between on premise and the migrated Virtual Machine, 2015. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context: Cloud computing is the new trend in the IT industry. Traditionally, obtaining servers was quite time-consuming for companies: the whole process of researching what kind of hardware to buy, getting budget approval, purchasing the hardware and gaining access to the servers could take weeks or months. In order to save time and reduce expenses, most companies are moving towards the cloud. One well-known cloud provider is Amazon Elastic Compute Cloud (EC2). Amazon EC2 makes it easy for companies to obtain virtual servers (known as compute instances) in the cloud quickly and inexpensively. Another advantage of Amazon EC2 is its flexibility: companies can import into Amazon EC2, and export again, the Virtual Machines (VMs) they have built that meet their IT security, configuration, management and compliance requirements.

    Objectives: In this thesis, we investigate importing a VM running on VMware into Amazon EC2. In addition, we make a performance comparison between a VM running on VMware and a VM created from the same image running on Amazon EC2.

    Methods: A case study was conducted to select a persistent method for migrating VMware VMs to Amazon EC2. In addition, an experiment was conducted to measure the performance of the virtual machine running on VMware and compare it with the same virtual machine running on EC2. We measured performance in terms of CPU and memory utilization as well as disk read/write speed, using well-known open-source benchmarks from the Phoronix Test Suite (PTS).

    Results: The investigation covered importing VM snapshots (VMDK, VHD and RAW formats) into EC2 using three methods provided by AWS. The performance comparison was carried out by running each benchmark 25 times on each virtual machine.

    Conclusions: Importing the VM to EC2 was successful only with the RAW format, and exact replication was not achieved because AWS installs some software and drivers while importing the VM to EC2. The migrated EC2 VM performs better than the on-premise VMware VM in terms of CPU and memory utilization and disk read/write speed.
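
    For reference, the AWS import path evaluated in the thesis can be sketched with boto3 (the AWS SDK for Python). The bucket and key names below are hypothetical, and the account must have the VM Import/Export service role configured; treat this as a minimal sketch rather than the thesis's exact procedure:

        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        # Import a RAW disk image previously uploaded to S3 as an Amazon Machine Image.
        response = ec2.import_image(
            Description="On-premise VMware VM exported as a RAW disk image",
            DiskContainers=[{
                "Description": "root volume",
                "Format": "raw",  # the thesis found only the RAW format imported successfully
                "UserBucket": {"S3Bucket": "my-vm-images", "S3Key": "exported-vm.raw"},
            }],
        )

        # The import runs asynchronously; poll its progress via the returned task id.
        print(ec2.describe_import_image_tasks(ImportTaskIds=[response["ImportTaskId"]]))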

  • 985.
    Backlund, Malin
    et al.
    Blekinge Institute of Technology, Faculty of Health Sciences, Department of Health.
    Fransson, Caroline
    Blekinge Institute of Technology, Faculty of Health Sciences, Department of Health.
    Kvinnors upplevelser av vardagen i det icke akuta skedet efter en hjärtinfarkt - En litteraturstudie, 2015. Student thesis
    Abstract [sv]

    Background: Myocardial infarction is one of the most common causes of death among women worldwide. There are factors indicating that women are more prone to both mental and physical ill health after a myocardial infarction. One reason could be that some women experience diffuse symptoms at the onset of illness, which might lead to a fear of not being able to recognize a new myocardial infarction if one were to occur. Another reason why women may experience ill health after a myocardial infarction could be the higher demands placed on women today: since they often work full time while also managing the home, women may experience reduced quality of life when they can no longer meet the demands that society places on them. Aim: To illuminate women's experiences of everyday life in the non-acute phase after a myocardial infarction. Method: A literature study based on seven scientific articles with a qualitative approach. Olsson and Sörensen's assessment template for qualitative studies was used in the quality review of the scientific articles, and Graneheim and Lundman's interpretation of content analysis was used to analyse them. Results: The results showed that the women experienced a changed everyday life after the myocardial infarction, in which lifestyle changes were perceived as difficult to carry out and maintain despite feeling motivated. The women also experienced various limitations, which manifested themselves in emotional reactions. The women furthermore highlighted the feeling of having been close to death, where they felt both fear of how fragile life is and gratitude for being alive. Conclusion: The women had difficulties adapting to life after the myocardial infarction. In the absence of support and with insufficient knowledge, it became difficult for the women to cope with the new life situation, which resulted in both mental and physical ill health. Even though they were grateful to have survived, the women experienced difficulties in coping with their new life. With greater knowledge of women's experiences in the non-acute phase after a myocardial infarction, nurses could provide better care at an earlier stage.

  • 986.
    Backman, Elin
    et al.
    Blekinge Institute of Technology, School of Management.
    Ronnerhall, Josefine
    Blekinge Institute of Technology, School of Management.
    Finansieringsresonemang: en studie av fyra småföretag, 2012. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    Purpose: The purpose of this thesis is to examine how small businesses choose between different forms of financing. Method: To fulfil the purpose of the thesis, a qualitative method was applied, in which a study was carried out through semi-structured interviews with the owners of four selected small businesses. Conclusions: This study shows that, among the various financing alternatives, small businesses know bank financing best, which is why this method feels safe and is therefore often used by small-business owners. Knowledge of other financing alternatives is insufficient, and acquiring knowledge deep enough to base financing decisions on is considered too time-consuming and costly.

  • 987. Backman, Mikaela
    et al.
    Karlsson, Charlie
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Determinants of self-employment among commuters and non-commuters, 2016. In: Papers in regional science (Print), ISSN 1056-8190, E-ISSN 1435-5957, Vol. 95, no 4, p. 755-. Article in journal (Refereed)
    Abstract [en]

    We analyse the determinants of self-employment with a focus on the contextual environment. By distinguishing between commuters and non-commuters we are able to analyse the influence of the work and home environment, respectively. Our results indicate a significant difference between non-commuters and commuters in terms of the role of networks in becoming self-employed: it is the business networks where people work, rather than where they live, that exert a positive influence on the probability of becoming self-employed. These effects are robust across educational and occupational categories. © 2015 RSAI.

  • 988.
    Backman, Mikaela
    et al.
    Jönköping International Business School, SWE.
    Karlsson, Charlie
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Entrepreneurship and Age Across Time and Space, 2017. In: Tijdschrift voor economische en sociale geografie, ISSN 0040-747X, E-ISSN 1467-9663. Article in journal (Refereed)
    Abstract [en]

    Studies confirm an inverted U-shaped relationship between age and entrepreneurship. This paper deepens the understanding of this relationship by analysing how it varies across time and across different types of regions, aspects often overlooked in the current literature. An individual perspective is taken, and the probability of starting a firm is expected to increase as individuals age, but at a decreasing rate. The results show significant differences in the relationship between the age of individuals and the rate of entrepreneurship across time and space. The age-entrepreneurship profile has shifted to the left over time, such that individuals are younger when they start firms. © 2017 Royal Dutch Geographical Society KNAG.

  • 989.
    Backman, Mikaela
    et al.
    Jönköping International Business School, SWE.
    Karlsson, Charlie
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Location of New Firms: Influence of Commuting Behaviour, 2017. In: Growth and Change, ISSN 0017-4815, E-ISSN 1468-2257, Vol. 48, no 4, p. 682-699. Article in journal (Refereed)
    Abstract [en]

    In the entrepreneurship literature, it is generally assumed that an individual establishes a new firm in a location in which they have strong ties, normally in the municipality of residence or employment. We scrutinise this general assumption and show that firm location depends on individual characteristics, such as the commuting experience. Our results show that commuting influences the firm location choice. The probability of establishing a firm in the work municipality increases if the entrepreneur is a commuter, holding constant the type of region and unobservable and observable individual features.

  • 990.
    Backström, Eva
    et al.
    Blekinge Institute of Technology, School of Health Science.
    Petersson, Martin
    Blekinge Institute of Technology, School of Health Science.
    Kommunikationens betydelse i mötet mellan sjuksköterska och en patient med diagnosen stroke, 2007. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    Every year, approximately 25,000-30,000 people in Sweden suffer a stroke. The nurse must take into account that these patients may have communication difficulties of varying degrees. The aim of the study was to examine and describe how the nurse should proceed when communicating with a patient diagnosed with stroke. The results show that nurses often lack knowledge of how to act and behave when communicating with a patient diagnosed with stroke. Lack of time is also a common problem, meaning that the nurse simply does not have the time required to communicate with the person who has suffered a stroke; as a consequence, the nurse feels unable to do her job in a correct and satisfactory manner. The conclusion of our study is that, among other things, more time needs to be spent on developing nurses' communication skills, so that a good and secure relationship can be established with the person who has suffered a stroke. The study is a literature review with a qualitative starting point, and Joyce Travelbee's nursing theory has been used as the theoretical frame of reference; Travelbee emphasizes the importance of communication between nurse and patient. Polit & Hungler's model for literature searching has been used, and we were inspired by Graneheim & Lundman's compilation of content analysis when analysing the scientific articles.

  • 991.
    Bacou, Patrick
    et al.
    Blekinge Institute of Technology, School of Health Science.
    Lönnberg, Oskar
    Blekinge Institute of Technology, School of Health Science.
    Att vårda patienter med HIV/AIDS: Sjuksköterskors upplevelser, 2012. Student thesis
    Abstract [sv]

    Background: Human Immunodeficiency Virus (HIV) and Acquired Immunodeficiency Syndrome (AIDS) constitute an incurable disease whose prevalence has increased in recent years. The disease attacks the cells of the immune system and, without treatment, leads to certain death. It spreads fear and is associated with stigmatization and discrimination around the world. Studies have shown that patients felt discriminated against by healthcare staff, while other studies have shown that nurses felt afraid and were reluctant to provide care. In good care and a good care relationship, nurses must safeguard the patient's well-being. Aim: The aim was to describe nurses' experiences of caring for patients with HIV/AIDS. Method: The study was based on a qualitative literature review of nine articles. Results: The results were divided into three categories and seven subcategories. The categories that emerged were: being emotionally involved, experience increased understanding, and fear of infection. The study revealed various experiences that affected the care relationship both negatively and positively. Conclusion: The nurses experienced a strong commitment to caring for patients with HIV/AIDS. It was also important to form a good relationship with the patients in order to enable good care. The close relationship could give rise to suffering when patients became seriously ill or died. The nurses also felt that experience increased their knowledge of, and improved their attitude towards, the disease. Experiences of fear emerged and gave rise to poorer care.

  • 992. Badampudi, Deepika
    Decision-making support for choosing among different component origins, 2018. Doctoral thesis, comprehensive summary (Other academic)
  • 993.
    Badampudi, Deepika
    Blekinge Institute of Technology, School of Computing.
    Factors Affecting Efficiency of Agile Planning: A Case Study, 2012. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Context: Planning in software projects is a difficult problem due to the uncertainty associated with it. There are many factors that cause difficulty in formulating a plan, and few factors that influence the efficiency of planning have been identified in previous studies. The literature focuses only on technical aspects such as requirements selection and estimation in order to plan a release or iteration. Objectives: The objective of this study is to identify factors that affect planning efficiency. The context in which the objective is achieved is large-scale complex projects that are distributed across multiple teams in multiple global sites. The motivation for selecting a large-scale context is that most of the existing release planning approaches discussed in the literature were investigated in small-scale projects; this context therefore allows studying the planning process in large-scale industry. Methods: A case study was conducted at Siemens' Development Centre in Bangalore, India. A total of 15 interviews were conducted to investigate the planning process adopted by Siemens. To achieve triangulation, process documents such as release planning documents were studied and direct observation of the planning meetings was performed; multiple sources were thus used to collect evidence. Results: The identified challenges are grouped into technical and non-technical categories. In total, 9 technical factors and 11 non-technical factors were identified. The identified factors are also classified based on the context in which they affect the planning. In addition, 6 effects of the factors are identified, and improvements perceived by the participants are discussed in this study.

  • 994.
    Badampudi, Deepika
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Reporting Ethics Considerations in Software Engineering Publications, 2017. In: 11th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2017), IEEE, 2017, p. 205-210. Conference paper (Refereed)
    Abstract [en]

    Ethical guidelines of software engineering journals require authors to provide statements related to conflicts of interest and the process of obtaining consent (if human subjects are involved). The objective of this study is to review the reporting of ethical considerations in Empirical Software Engineering - An International Journal. The results indicate that two out of seven studies reported some ethical information, although not explicitly. The ethical discussions focused on anonymity and confidentiality. Ethical aspects such as competence, comprehensibility and vulnerability of the subjects were not discussed in any of the papers reviewed in this study. It is important not only to state that consent was obtained but also to report the procedure of obtaining consent, to improve accountability and trust.

  • 995.
    Badampudi, Deepika
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Towards decision-making to choose among different component origins, 2016. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Context: The amount of software in solutions provided in various domains is continuously growing. These solutions are a mix of hardware and software, often referred to as software-intensive systems. Companies seek to improve the software development process to avoid delays or cost overruns related to software development.

    Objective: The overall goal of this thesis is to improve the software development/building process so as to provide timely, high-quality and cost-efficient solutions. The objective is to select the origin of the components (in-house, outsourcing, components off-the-shelf (COTS) or open source software (OSS)) that facilitates this improvement. The system can be built from components of one origin or a combination of two or more (or even all) origins. Selecting a proper origin for a component is important in order to get the most out of the component and to optimize the development.

    Method: It is necessary to investigate the component origins in order to decide among them. We conducted a case study to explore the existing challenges in software development. The next step was to identify factors that influence the choice among different component origins, through a systematic literature review using a snowballing (SB) strategy and a database (DB) search. Furthermore, a Bayesian synthesis process is proposed to integrate the evidence from the literature into practice.

    Results: The results of this thesis indicate that the context of software-intensive systems, such as domain regulations, hinders improvement of the software development process. In addition to in-house development, alternative component origins (outsourcing, COTS, and OSS) are being used for software development. Several factors, such as time, cost and license implications, influence the selection of component origins. Solutions have been proposed to support the decision-making; however, these solutions consider only a subset of the factors identified in the literature.

    Conclusions: Each component origin has some advantages and disadvantages. Depending on the scenario, one component origin is more suitable than the others. It is important to investigate the different scenarios and suitability of the component origins, which is recognized as future work of this thesis. In addition, the future work is aimed at providing models to support the decision-making process.

  • 996.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, School of Computing.
    Fricker, Samuel
    Blekinge Institute of Technology, School of Computing.
    Moreno, Ana
    Perspectives on Productivity and Delays in Large-Scale Agile Projects, 2013. Conference paper (Refereed)
    Abstract [en]

    Many large and distributed companies run agile projects in development environments that are inconsistent with the original agile ideas. Problems that result from these inconsistencies can affect the productivity of development projects and the timeliness of releases. To be effective in such contexts, the agile ideas need to be adapted. We take an inductive approach for reaching this aim by basing the design of the development process on observations of how context, practices, challenges, and impacts interact. This paper reports the results of an interview study of five agile development projects in an environment that was unfavorable for agile principles. Grounded theory was used to identify the challenges of these projects and how these challenges affected productivity and delays according to the involved project roles. Productivity and delay-influencing factors were discovered that related to requirements creation and use, collaboration, knowledge management, and the application domain. The practitioners’ explanations about the factors' impacts are, on one hand, a rich empirical source for avoiding and mitigating productivity and delay problems and, on the other hand, a good starting point for further research on flexible large-scale development.

  • 997.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Bayesian Synthesis for Knowledge Translation in Software Engineering: Method and Illustration, 2016. In: 2016 42nd Euromicro Conference on Software Engineering and Advanced Applications (SEAA), IEEE, 2016. Conference paper (Refereed)
    Abstract [en]

    Systematic literature reviews in software engineering are necessary to synthesize evidence from multiple studies and to provide knowledge and decision support. However, synthesis methods are underutilized in software engineering research. Moreover, translation of synthesized data (the outcomes of a systematic review) into recommendations for practitioners is seldom practiced. The objective of this paper is to introduce the use of Bayesian synthesis in software engineering research, in particular to translate research evidence into practice by providing the possibility to combine contextualized expert opinions with research evidence. We adopted the Bayesian synthesis method from health research and customized it for use in software engineering research. The proposed method is described and illustrated using an example from the literature. Bayesian synthesis provides a systematic approach to incorporating subjective opinions in the synthesis process, thereby making the synthesis results more suitable to the context in which they will be applied and facilitating the interpretation and translation of knowledge into action. None of the synthesis methods used in software engineering allows for the integration of subjective opinions; hence, Bayesian synthesis can add a new dimension to the synthesis process in software engineering research.
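
    The core mechanic of Bayesian synthesis, combining a prior (here, contextualized expert opinion) with research evidence, can be illustrated with a conjugate normal update. The effect sizes and variances below are invented, and the paper's actual procedure is more elaborate:

        def combine(prior_mean, prior_var, evidence_mean, evidence_var):
            # Precision-weighted (1/variance) combination: the more certain
            # source pulls the posterior mean towards itself.
            w_prior, w_evidence = 1 / prior_var, 1 / evidence_var
            post_var = 1 / (w_prior + w_evidence)
            post_mean = post_var * (w_prior * prior_mean + w_evidence * evidence_mean)
            return post_mean, post_var

        # Hypothetical: experts expect a 10% improvement (uncertain, variance 25),
        # while the synthesized studies suggest 20% (more precise, variance 4).
        print(combine(prior_mean=10.0, prior_var=25.0, evidence_mean=20.0, evidence_var=4.0))
        # -> posterior mean ~18.6, posterior variance ~3.4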

  • 998.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Gorschek, Tony
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Guidelines for Knowledge Translation in Software Engineering. Article in journal (Refereed)
  • 999.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Petersen, Kai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Experiences from Using Snowballing and Database Searches in Systematic Literature Studies, 2015. Conference paper (Refereed)
    Abstract [en]

    Background: Systematic literature studies are commonly used in software engineering. There are two main ways of conducting the searches for these types of studies: snowballing and database searches. In snowballing, the reference lists (backward snowballing - BSB) and citations (forward snowballing - FSB) of relevant papers are reviewed to identify new papers, whereas in a database search, different databases are searched using predefined search strings to identify new papers. Objective: Snowballing has not been used as extensively as database search. Hence, it is important to evaluate its efficiency and reliability when used as a search strategy in literature studies, and to compare it to database searches. Method: In this paper, we applied snowballing in a literature study and reflected on the outcome. We also compared database search with backward and forward snowballing. Database search and snowballing were conducted independently by different researchers. The searches of our literature study were compared with respect to the efficiency and reliability of the findings. Results: Out of the total number of papers found, snowballing identified 83% of the papers, in comparison to 46% for the database search. Snowballing failed to identify a few relevant papers, which potentially could have been addressed by identifying a more comprehensive start set. Conclusion: The efficiency of snowballing is comparable to database search. It can potentially be more reliable than a database search; however, the reliability is highly dependent on the creation of a suitable start set.
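
    The snowballing procedure itself is easy to sketch: starting from a start set, backward snowballing follows reference lists and forward snowballing follows citations, iterating until no new relevant papers appear. The toy citation network and relevance rule below are invented for illustration:

        def snowball(start_set, references, citations, is_relevant):
            # Iterate backward (what a paper cites) and forward (what cites it)
            # snowballing from the start set until a fixed point is reached.
            included, frontier = set(), set(start_set)
            while frontier:
                paper = frontier.pop()
                included.add(paper)
                for candidate in references.get(paper, []) + citations.get(paper, []):
                    if candidate not in included and is_relevant(candidate):
                        frontier.add(candidate)
            return included

        # Hypothetical toy citation network: A cites B, B cites C, D cites B.
        refs = {"A": ["B"], "B": ["C"]}
        cites = {"B": ["D"]}
        print(snowball({"A"}, refs, cites, is_relevant=lambda p: p != "C"))
        # A, B and D are included; C is excluded by the relevance check.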

  • 1000.
    Badampudi, Deepika
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wohlin, Claes
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Petersen, Kai
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Software Component Decision-making: In-house, OSS, COTS or Outsourcing: A Systematic Literature Review, 2016. In: Journal of Systems and Software, ISSN 0164-1212, E-ISSN 1873-1228, Vol. 121, p. 105-124. Article in journal (Refereed)
    Abstract [en]

    Component-based software systems require decisions on component origins for acquiring components. A component origin is an alternative for where to get a component from. Objective: To identify factors that could influence the decision to choose among different component origins, and solutions for decision-making (for example, optimization) in the literature. Method: A systematic review of peer-reviewed literature has been conducted. Results: In total we included 24 primary studies. The component origins compared were mainly in-house vs. COTS and COTS vs. OSS. We identified 11 factors affecting or influencing the decision to select a component origin. When component origins were compared, there was little evidence on the relative (either positive or negative) effect of a component origin on the factors. Most of the solutions were proposed for the in-house vs. COTS selection, and time, cost and reliability were the factors most often considered in the solutions. Optimization models were the technique most commonly used in the solutions. Conclusion: The topic of choosing component origins is a green field for research, in great need of empirical comparisons between the component origins, as well as of how to decide between different combinations of them.
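
    Since many of the identified solutions are optimization or weighted-scoring models, a minimal weighted-sum sketch over three of the review's eleven factors may help fix ideas. The weights and scores below are entirely invented, and a real model would cover more factors and constraints:

        # Hypothetical factor weights and per-origin scores on a 1-5 scale.
        weights = {"time": 0.4, "cost": 0.3, "reliability": 0.3}
        origins = {
            "in-house":  {"time": 2, "cost": 2, "reliability": 5},
            "COTS":      {"time": 4, "cost": 3, "reliability": 4},
            "OSS":       {"time": 4, "cost": 5, "reliability": 3},
            "outsource": {"time": 3, "cost": 3, "reliability": 3},
        }

        def weighted_score(scores):
            return sum(weights[f] * s for f, s in scores.items())

        # Rank the component origins by their weighted score.
        for origin, scores in sorted(origins.items(), key=lambda kv: -weighted_score(kv[1])):
            print(f"{origin}: {weighted_score(scores):.2f}")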
