  • 51.
    Ali, Wajahat
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Muhammad, Asad
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Response Time Effects on Quality of Security Experience (2012). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [en]

    The recent decade has witnessed enormous worldwide development in internet technology. Initially, the internet was designed for applications such as electronic mail and file transfer. As the technology evolved and became popular, people began using the internet for e-banking, e-shopping, social networking, e-gaming, voice and many other applications. Most internet traffic is generated by the activities of end users requesting a specific web page or web-based application. The high demand for internet applications has driven service operators to provide reliable services to the end user, and user satisfaction has now become a major challenge. Quality of Service is a measure of the performance of a particular service. Quality of Experience is a subjective measure of the user's perception of the overall performance of the network. The high demand for internet usage in everyday life has made people concerned about the security of information on web pages that require authentication. User-perceived Quality of Security Experience depends on Quality of Experience and on the Response Time for web page authentication. Different factors such as jitter, packet loss, delay, network speed, supply chains and the type of security algorithm play a vital role in the response time for authentication. In this work we carry out a qualitative and quantitative analysis of user-perceived security and Quality of Experience as Response Times towards web page authentication increase and decrease, and we derive a relationship between Quality of Experience of security and Response Time.

  • 52.
    Ali, Waqas
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Case Study Of Mobile Internet User Experience (2012). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [en]

    Mobile internet is currently considered the future of the Internet. The number of mobile handsets sold compared to desktop PCs is striking; these figures show the potential of the mobile internet, and the future market relies strongly on mobile devices. At the same time, however, the number of mobile internet users is growing more slowly. Particularly in markets where internet access through computers is very simple, mobile internet users seem unenthusiastic about using the internet on mobile phones. On the basis of literature findings, the author of this study supposes that this lack of interest is due to an unsatisfactory mobile internet user experience. This thesis is an effort in the complex area of the mobile internet and sheds some light on how to improve the mobile internet user experience. The main focus of this research is to identify hurdles and challenges for mobile internet user experience and to explore the concepts present in academia. To understand the area properly, the author performed a systematic literature review (SLR), whose overall objective is to examine existing work on the thesis topic. This in-depth study of the literature revealed that different researchers categorize mobile internet user experience into aspects, elements and factors, which are considered central to mobile internet user experience. A few other factors, such as usage context and user expectations, further complicate this task. In this work, current problems of mobile internet user experience are identified systematically, which had not been done before, and then discussed in a way that provides academia with a better understanding of mobile internet user experience. To fulfil the aim and objectives, the author conducted a detailed systematic review of empirical studies from 1998 to 2012. The studies were identified from authoritative, scientifically and technically peer-reviewed databases such as Scopus, Engineering Village, IEEE Xplore and the ACM Digital Library. From the SLR results, we found different aspects, elements, factors and challenges of mobile internet user experience. The most common challenges faced by users and reported in academia were screen size, input facilities, usability of services, and data traffic costs. The information obtained from the literature during this thesis is presented descriptively and reflects an emerging trend of using the internet on mobile devices. Through this study, the author presents the influential perspectives on mobile internet user experience that need to be considered for the advancement of the mobile internet. The work contributes in the sense that, to the best of our knowledge, no systematic review effort had previously been made in this area.

  • 53.
    Ali, Zahoor
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Arfeen, Muhammad Qummer ul
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    The role of Machine Learning in Predicting CABG Surgery Duration (2011). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [en]

    Context. The operating room (OR) is one of the most expensive resources of a hospital, and its mismanagement carries high costs and lost revenue. Various factors may cause OR mismanagement; one of them is wrong estimation of surgery duration. Surgeons underestimate or overestimate surgery duration, which causes underutilization or overutilization of the OR and medical staff. Resolving the issue of wrong estimates can improve overall OR planning. Objectives. In this study we investigate two different techniques of feature selection and compare different regression-based modeling techniques for surgery duration prediction. The modeling technique with the lowest mean absolute error is used for building a model. We further propose a framework for implementing this model in a real-world setup. Results. In our case the selected feature-selection technique (correlation-based feature selection with best-first search in the backward direction) could not produce better results than the expert-opinion-based approach to feature selection. Linear regression outperformed the alternatives on both data sets; comparatively, the mean absolute error of linear regression on the experts'-opinion-based data set was the lowest. Conclusions. We conclude that patterns exist in the relationship between the predicted outcome (surgery duration) and other important features related to patient characteristics; thus, machine learning tools can be used for predicting surgery duration. We also conclude that the proposed framework may be used as a decision support tool to facilitate surgery duration prediction, which can improve the planning of ORs and their resources.
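The regression-and-MAE comparison described above can be sketched in a few lines. This is a minimal illustration, not the thesis's model: the single feature (a patient "complexity score"), the data, and the fixed expert estimate are all invented.

```python
# Minimal sketch: predicting surgery duration with simple linear regression
# and comparing its mean absolute error (MAE) against a fixed expert estimate.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def mae(actual, predicted):
    return sum(abs(y - p) for y, p in zip(actual, predicted)) / len(actual)

# Invented training data: complexity score -> surgery duration in minutes.
scores = [1, 2, 3, 4, 5, 6]
durations = [95, 110, 128, 139, 155, 171]

a, b = fit_linear(scores, durations)
model_preds = [a + b * x for x in scores]
expert_preds = [130] * len(scores)  # a fixed "surgeon's estimate"

print(mae(durations, model_preds), mae(durations, expert_preds))
```

On data with any linear trend, the fitted model's MAE falls well below the constant estimate's, which is the kind of comparison the thesis reports.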

  • 54.
    Alipour, Philip Baback
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Ali, Muhammad
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    An Introduction and Evaluation of a Lossless Fuzzy Binary AND/OR Compressor (2010). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [sv]

    We report a new lossless data compression (LDC) algorithm for implementing predictably fixed compression values. The fuzzy binary AND/OR algorithm (FBAR) aims, among other things, to introduce a new model for regular and superdense coding in classical and quantum information theory. Classical coding on x86 machines would not be a sufficient technique for maximum LDC to create fixed values of Cr >= 2:1. However, the present model is evaluated to serve multidimensional LDC with fixed-value generation, in contrast to the popular methods used in probabilistic LDC, such as Shannon entropy. The entropy presented is of a 'fuzzy binary' kind in a 4D bit-flag cube model, with a product value of at least 50% compression. We implemented the compression and simulated the decompression phase for lossless versions of the FBAR logic. We further compared our algorithm with the results of other compressors. Our statistical test shows that the presented algorithm mutably and significantly competes with other LDC algorithms on both temporal and spatial factors of compression. The present algorithm is a stepping stone to quantum information models solving complex negative entropies, giving a doubly-efficient LDC of > 87.5% space savings.

  • 55.
    Allahyari, Hiva
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    On the Concept of Understandability as a Property of Data Mining Quality (2010). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [en]

    This paper reviews methods for evaluating and analyzing the comprehensibility and understandability of models generated from data in the context of data mining and knowledge discovery. The motivation for this study is the fact that the majority of previous work has focused on increasing the accuracy of models, ignoring user-oriented properties such as comprehensibility and understandability. Approaches for analyzing the understandability of data mining models have been discussed on two levels: one concerns the type of the models' presentation and the other the structure of the models. In this study, we present a summary of existing assumptions regarding both approaches, followed by empirical work examining understandability from the user's point of view through a survey. From the results of the survey, we find that models represented as decision trees are more understandable than models represented as decision rules. Using the survey results on the understandability of a number of models, in conjunction with quantitative measurements of the models' complexity, we are able to establish a correlation between the complexity and the understandability of the models.

  • 56. Allahyari, Hiva
    et al.
    Lavesson, Niklas
    User-oriented Assessment of Classification Model Understandability (2011). Conference paper (peer-reviewed).
    Abstract [en]

    This paper reviews methods for evaluating and analyzing the understandability of classification models in the context of data mining. The motivation for this study is the fact that the majority of previous work has focused on increasing the accuracy of models, ignoring user-oriented properties such as comprehensibility and understandability. Approaches for analyzing the understandability of data mining models have been discussed on two levels: one concerns the type of the models' presentation and the other the structure of the models. In this study, we present a summary of existing assumptions regarding both approaches, followed by empirical work examining understandability from the user's point of view through a survey. The results indicate that decision tree models are more understandable than rule-based models. Using the survey results on the understandability of a number of models, in conjunction with quantitative measurements of the models' complexity, we are able to establish a correlation between the complexity and the understandability of the models.
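The complexity-versus-understandability correlation mentioned above can be illustrated with a rank correlation. This is a minimal sketch: the figures are hypothetical, and the paper's actual data and statistic may differ.

```python
# Sketch: Spearman rank correlation between model complexity and mean
# survey understandability score (no tied values assumed).

def ranks(values):
    """1-based ranks, assuming no ties."""
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

complexity = [4, 9, 15, 23, 31]              # e.g. rules or nodes per model
understandability = [4.6, 4.1, 3.5, 2.9, 2.2]  # mean survey rating, 1-5

print(spearman(complexity, understandability))
```

A strongly negative coefficient here would correspond to the paper's finding that more complex models are rated as less understandable.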

  • 57. Al-Mamun, Abdullah
    et al.
    Ullah, Mohammad Rafiq
    Multitaper spectrum: A promising method in spectrum sensing cognitive radio (2010). Conference paper (peer-reviewed).
    Abstract [en]

    Cognitive radio (CR) has been proposed as a promising and effective technology to improve radio spectrum utilization. The primary objective of CR is to observe non-interference rules with respect to any primary users (PUs). Highly sensitive and optimal spectrum sensing detectors are required in order to avoid harmful interference to PUs. The multitaper spectrum estimate appears the most appealing for spectrum sensing CR because of its accurate identification and estimation and low computational complexity. The multitaper method uses a small set of tapers, multiple orthogonal prototype filters, to reduce the variance of the estimate. This paper investigates the Fourier transform of Slepian sequences, originally known as discrete prolate spheroidal sequences (DPSS), which yields maximum energy density inside a given bandwidth and less spectral leakage, and shows that no other window in signal processing satisfies this property.
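The variance-reduction idea behind the multitaper estimate can be sketched as follows. True Slepian/DPSS tapers require solving an eigenvalue problem, so simple orthogonal sine tapers stand in for them here; the test signal is invented.

```python
# Sketch of the multitaper idea: average periodograms over several
# orthogonal tapers to reduce variance. Sine tapers are a simple
# substitute for the Slepian/DPSS tapers used in the real method.
import cmath
import math

def sine_tapers(n, k):
    """k orthogonal sine tapers of length n."""
    return [[math.sqrt(2.0 / (n + 1)) * math.sin(math.pi * (j + 1) * (t + 1) / (n + 1))
             for t in range(n)] for j in range(k)]

def periodogram(x):
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))) ** 2 for f in range(n)]

def multitaper_psd(x, k=4):
    n = len(x)
    spectra = [periodogram([w[t] * x[t] for t in range(n)]) for w in sine_tapers(n, k)]
    return [sum(s[f] for s in spectra) / k for f in range(n)]

# A pure tone in bin 8 of a 64-sample signal: the PSD should peak near bin 8.
signal = [math.cos(2 * math.pi * 8 * t / 64) for t in range(64)]
psd = multitaper_psd(signal)
```

Each taper yields one "eigenspectrum"; averaging them trades a slight widening of the peak for a large reduction in estimator variance, which is what makes the method attractive for sensing weak primary-user signals.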

  • 58.
    Almrot, Emil
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Andersson, Sebastian
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    A study of the advantages & disadvantages of mobile cloud computing versus native environment (2013). Independent thesis, basic level (Bachelor's degree), student thesis (degree project).
    Abstract [sv]

    A study of the advantages and disadvantages of software that uses mobile cloud computing compared with a traditional (native) environment. The work includes a technical experiment in which the power consumption of applications in each category was measured. The results indicate that mobile cloud computing technology has matured and become a much more viable solution.

  • 59. Al-Qahtani, Fawaz S.
    et al.
    Duong, Quang Trung
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Zhong, Caijun
    Qaraqe, Khalid A.
    Alnuweiri, Hussein
    Performance Analysis of Dual-Hop AF Systems with Interference in Nakagami-m Fading Channels (2011). In: IEEE Signal Processing Letters, ISSN 1070-9908, Vol. 18, no. 8, pp. 454-457. Journal article (peer-reviewed).
    Abstract [en]

    In this paper, we investigate the performance of dual-hop channel state information-assisted amplify-and-forward relaying systems over Nakagami-m fading channels in the presence of multiple interferers at the relay. Assuming an integer fading parameter m, we derive closed-form expressions for the exact outage probability and an accurate approximation for the symbol error rate of the system. Furthermore, we examine the asymptotically high signal-to-noise ratio regime and characterize the diversity order achieved by the system. All analytical results are validated via Monte Carlo simulations.
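The Monte Carlo validation mentioned above can be sketched as follows. This is a minimal illustration, not the paper's simulation: the fading and SNR parameters are invented, and interference at the relay is omitted; g1*g2/(g1+g2+1) is the standard end-to-end SNR form for CSI-assisted AF relaying.

```python
# Monte Carlo sketch: outage probability of a dual-hop amplify-and-forward
# link over Nakagami-m fading (no interferers in this simplified version).
import random

def nakagami_power_gain(m, omega=1.0):
    """Squared Nakagami-m envelope |h|^2 ~ Gamma(shape=m, scale=omega/m)."""
    return random.gammavariate(m, omega / m)

def outage_probability(m1, m2, snr_db, threshold_db, trials=20000):
    snr = 10 ** (snr_db / 10)
    gamma_th = 10 ** (threshold_db / 10)
    outages = 0
    for _ in range(trials):
        g1 = snr * nakagami_power_gain(m1)   # first-hop instantaneous SNR
        g2 = snr * nakagami_power_gain(m2)   # second-hop instantaneous SNR
        g_end = g1 * g2 / (g1 + g2 + 1.0)    # end-to-end SNR, CSI-assisted AF
        if g_end < gamma_th:
            outages += 1
    return outages / trials

random.seed(7)
print(outage_probability(2, 2, 5, 0), outage_probability(2, 2, 20, 0))
```

Plotting such estimates against average SNR and reading off the slope of the outage curve on a log-log scale is how the diversity order is checked against the closed-form analysis.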

  • 60.
    Al-Refai, Ali
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Pandiri, Srinivasreddy
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Cloud Computing: Trends and Performance Issues (2011). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [en]

    Context: Cloud computing is a fascinating concept these days; it is attracting many organizations to move their utilities and applications into dedicated data centers so that they can be accessed from the Internet. This allows users to focus solely on their businesses while cloud computing providers handle the technology. Choosing the best provider is a challenge for organizations that are willing to step into the cloud computing world. A single cloud center generally cannot deliver resources at large scale for its tenants; therefore, multiple cloud centers need to collaborate to achieve business goals and to provide the best possible services at the lowest possible costs. However, a number of aspects, legal issues, challenges and policies should be taken into consideration when moving a service into the cloud environment. Objectives: The aim of this research is to identify and elaborate the major technical and strategic differences between cloud computing providers, in order to give organizations' management, system designers and decision makers better insight into the strategies of the different providers. It also aims to understand the risks and challenges of adopting cloud computing, and how those issues can be moderated. The study tries to define multi-cloud computing by studying the pros and cons of this new domain, and to study the concept of load balancing in the cloud in order to examine performance over multiple cloud environments. Methods: In this master thesis a number of research methods are used, including a systematic literature review, interviews with experts from the relevant field, and a quantitative methodology (an experiment). Results: Based on the findings of the literature review, interviews and experiment, we obtained the following answers to the research questions: 1) a comprehensive study identifying and comparing the major cloud computing providers; 2) a list of impacts of cloud computing (legal aspects, trust and privacy); 3) a definition of multi-cloud computing and its benefits and drawbacks; 4) performance results for the cloud environment, obtained by experimenting with a load balancing solution. Conclusions: Cloud computing is a central interest for many organizations nowadays. More and more companies are stepping into cloud computing service technologies; Amazon, Google, Microsoft, SalesForce and Rackspace are the top five providers in the market today. However, no cloud is perfect for all services. The legal framework is very important for the protection of users' private data, and it is a key factor for the safety of users' personal and sensitive information. Privacy threats vary according to the nature of the cloud scenario: some clouds and services face very low privacy threats compared to others, and the public cloud accessed through the Internet is among the most exposed to growing privacy concerns. Lack of visibility into the provider's supply chain leads to suspicion and ultimately distrust. The evolution of cloud computing suggests that, in the near future, the so-called cloud will in fact be a multi-cloud environment composed of a mixture of private and public clouds forming an adaptive environment. Load balancing in the cloud computing environment differs from typical load balancing: cloud load balancing architectures use a number of commodity servers to perform the balancing, and the performance of the cloud differs depending on the cloud's location, even for the same provider. The HAProxy load balancer shows a positive effect on the cloud's performance under high load; the effect is unnoticeable at lower loads. These effects can vary depending on the location of the cloud.
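The load-balancing idea discussed above can be illustrated with two classic selection policies that an HAProxy-style balancer supports. This is a minimal sketch; the server names are placeholders.

```python
# Sketch of two balancing policies: round-robin and least-connections.
import itertools

class RoundRobin:
    """Hand out servers in a fixed rotation."""
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def pick(self):
        return next(self._cycle)

class LeastConnections:
    """Send each request to the server with the fewest active connections."""
    def __init__(self, servers):
        self.active = {s: 0 for s in servers}

    def pick(self):
        server = min(self.active, key=self.active.get)
        self.active[server] += 1      # caller releases on completion
        return server

    def release(self, server):
        self.active[server] -= 1

rr = RoundRobin(["node1", "node2"])
print([rr.pick() for _ in range(4)])  # ['node1', 'node2', 'node1', 'node2']
```

Round-robin is oblivious to load, which works well when requests are uniform; least-connections adapts when request costs vary, which matches the observation that balancing effects show up mainly under high load.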

  • 61.
    Amiri, Javad Mohammadian
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Padmanabhuni, Venkata Vinod Kumar
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    A Comprehensive Evaluation of Conversion Approaches for Different Function Points (2011). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [en]

    Context: Software cost and effort estimation are important activities in planning software projects. One major input to cost and effort estimation is the functional size of the software, which can be measured with a variety of methods. With several methods measuring the same entity, converting between their outputs becomes important. Objectives: In this study we investigate techniques that have been proposed for conversion between different Functional Size Measurement (FSM) methods. We address conceptual similarities and differences between methods, empirical approaches proposed for conversion, evaluation of the proposed approaches, and improvement opportunities available for current approaches. Finally, we propose a new conversion model based on accumulated data. Methods: We conducted a systematic literature review to investigate the similarities and differences between FSM methods and the proposed conversion approaches, and identified some improvement opportunities for current conversion approaches. Sources for articles were IEEE Xplore, Engineering Village, Science Direct, ISI, and Scopus; we also performed snowball sampling to decrease the chance of missing relevant papers. We evaluated the existing conversion models after merging the data from publicly available datasets and, building on the identified suggestions for improvement, developed and validated a new model. Results: Conceptual similarities and differences between methods are presented, along with all methods and models that exist for conversion between different FSM methods. We also make three major contributions to existing empirical methods: for one existing method (piecewise linear regression) we used a systematic and rigorous way of finding the discontinuity point. We also evaluated several existing models to test their reliability on a merged dataset, and finally we accumulated all data from the literature in order to find the nature of the relation between IFPUG and COSMIC using the LOESS regression technique. Conclusions: We conclude that many concepts used by different FSM methods are common, which enables conversion. In addition, statistical results show that the proposed enhancement of the piecewise linear regression model slightly improves the model's test results; even this small improvement can affect project costs considerably. The evaluation of models shows that it is not possible to say which method predicts unseen data better than the others; which model should be used depends on the concerns of the practitioner. Finally, the accumulated data confirm that the empirical relation between IFPUG and COSMIC is not linear and is better represented by two separate lines than by other models. We also note that, unlike the COSMIC manual's claim that the discontinuity point should be around 200 FP, in the merged dataset the discontinuity point is around 300 to 400. Finally, we propose a new conversion approach using a systematic procedure and piecewise linear regression; tested on new data, this model shows improvement in MMRE and Pred(25).
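The systematic discontinuity-point search for the piecewise linear regression model can be sketched as an exhaustive breakpoint scan: fit one least-squares line on each side of every candidate split and keep the split with the smallest total squared error. The data below are invented stand-ins for (IFPUG FP, COSMIC CFP) pairs, not the thesis's dataset.

```python
# Sketch: find the breakpoint of a two-segment piecewise linear fit.

def fit_line(points):
    """Least-squares line through (x, y) points; returns (intercept, slope)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - b * sx) / n, b

def sq_error(points, a, b):
    return sum((y - (a + b * x)) ** 2 for x, y in points)

def best_breakpoint(points, min_segment=3):
    """Scan candidate splits; return the x-value where the slope changes."""
    pts = sorted(points)
    best = None
    for i in range(min_segment, len(pts) - min_segment + 1):
        left, right = pts[:i], pts[i:]
        (a1, b1), (a2, b2) = fit_line(left), fit_line(right)
        total = sq_error(left, a1, b1) + sq_error(right, a2, b2)
        if best is None or total < best[0]:
            best = (total, pts[i - 1][0])
    return best[1]

# Synthetic sizes: slope 1 below x = 10, slope 2 above it.
data = [(x, x if x < 10 else 2 * x - 10) for x in range(20)]
print(best_breakpoint(data))
```

On real conversion data the same scan would place the discontinuity wherever the IFPUG-to-COSMIC slope changes, which is how a 300-400 FP breakpoint could be detected in the merged dataset.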

  • 62.
    Ande, Rama kanth
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Amarawadi, Sharath Chandra
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Evaluation of ROS and Arduino Controllers for the OBDH Subsystem of a CubeSat (2012). Independent thesis, advanced level (Master's degree), student thesis (degree project).
    Abstract [sv]

    One of the trend-setting research areas in future CubeSat projects is the implementation of ROS in a CubeSat. The Robot Operating System (ROS) aims to capture the future of many embedded systems, including robotics. In this thesis, an attempt is made to understand the challenges faced when implementing ROS in a CubeSat, to provide a foundation for the OBDH subsystem, and to provide important guidelines for future developers relying on ROS-run CubeSats. Since using traditional transceivers and power supplies would be expensive, we have simulated Arduino acting as the transceiver and power supply subsystems. Arduino is an open-source physical computing platform based on a simple microcontroller board, together with a development environment for writing software for the board, designed to make the use of electronics in major embedded projects more accessible and inexpensive.

  • 63.
    Andersson, Alve
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Att sticka ut i mängden: En studie av tekniker för variation av instansierade modeller [Standing out in the crowd: a study of techniques for varying instanced models] (2013). Independent thesis, basic level (Bachelor's degree), student thesis (degree project).
    Abstract [sv]

    Despite recent hardware developments, real-time rendering of large crowds is still no trivial task. This task is known as crowd rendering. Efficient crowd rendering is often based on instancing, but instancing comes with a problem: it creates clones. This thesis aims to examine and evaluate a number of techniques used to create diversity among instanced models; together, these techniques will be referred to as varied instancing. Another goal is to determine how many models are needed for varied instancing to pay off compared with non-instanced rendering. The method used is to measure the time of each update on the GPU for each technique using a measurement instrument. Each technique was implemented in an application created specifically for this purpose. The analysis of the measurements resulted in three categories of GPU percentage workload: rising with instance count and falling with polygon count; falling with instance count and falling with polygon count; and even across both instance and polygon counts. The number of instances needed for varied instancing to pay off compared with non-instanced rendering was determined to be somewhere between 100 and 300 models, depending on the polygon count.

  • 64. Andersson, Emma
    et al.
    Peterson, Anders
    Törnquist Krasemann, Johanna
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Quantifying railway timetable robustness in critical points (2013). In: Journal of Rail Transport Planning and Management, ISSN 2210-9706, Vol. 3, no. 3, pp. 95-110. Journal article (peer-reviewed).
    Abstract [en]

    Several European railway traffic networks experience high capacity consumption during large parts of the day, resulting in delay-sensitive traffic systems with insufficient robustness. One fundamental challenge is therefore to assess robustness and find strategies to decrease the sensitivity to disruptions. Accurate robustness measures are needed to determine whether a timetable is sufficiently robust and to suggest where improvements should be made. Existing robustness measures are useful when comparing different timetables with respect to robustness; they are, however, not as useful for suggesting precisely where and how robustness should be increased. In this paper, we propose a new robustness measure that incorporates the concept of critical points. This concept can be used in the practical timetabling process to find weaknesses in a timetable and to provide suggestions for improvements. In order to quantitatively assess how crucial a critical point may be, we define the measure robustness in critical points (RCP). We present results from an experimental study benchmarking several existing measures as well as RCP. The results demonstrate the relevance of the concept of critical points and of RCP, and how RCP contributes to the set of already defined robustness measures.

  • 65.
    Andersson, Filip
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Norberg, Simon
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Scalable applications in a distributed environment (2011). Independent thesis, basic level (Bachelor's degree), student thesis (degree project).
    Abstract [en]

    As the number of simultaneous users of distributed systems increases, scalability is becoming an important factor to consider during software development. Without sufficient scalability, systems may struggle to manage high loads and may not be able to support a large number of users. We have determined how scalability can best be implemented, and what extra costs this leads to. Our research is based both on a literature review, in which we looked at what others in the field of computer engineering think about scalability, and on implementing a highly scalable system of our own. In the end we arrived at a couple of general pointers that can help developers determine whether they should focus on scalable development, and what they should consider if they choose to do so.

  • 66.
    Andersson, Lars
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Inblick i fenomenet webbskrapning [An insight into the phenomenon of web scraping] (2013). Independent thesis, basic level (Bachelor's degree), student thesis (degree project).
    Abstract [sv]

    This Bachelor's thesis examines the phenomenon of web scraping. Web scraping programs (also known as web wanderers, crawlers, spiders or scrapers) are programs that automatically search the web to extract information from web pages. One example of web scraping is when a company collects data about the pricing of a product or a service and then uses the information to produce cheaper offers. This gives the company an advantage, letting it focus more on marketing its site and services. In addition, the targeted companies' servers are heavily loaded with traffic (scraping) from non-customers. After searching both academic and public sources and analyzing the collected information, the conclusion is that the scraping of websites cannot be fully prevented, in the same way that no IT attack can be fully prevented; there are no 100% watertight systems. Of the hundred or so works examined in the information search, only one academic work focused on preventing scraping bots.
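One common partial countermeasure in this area is per-client request rate limiting, which throttles the burst traffic typical of scraping bots. This is a minimal sketch with illustrative thresholds; as the thesis concludes, no such measure stops scraping completely.

```python
# Sketch: sliding-window rate limiter, one window of timestamps per client.
import time
from collections import defaultdict, deque

class RateLimiter:
    def __init__(self, max_requests=10, window_s=1.0, clock=time.monotonic):
        self.max_requests = max_requests
        self.window_s = window_s
        self.clock = clock
        self._hits = defaultdict(deque)   # client id -> recent timestamps

    def allow(self, client_id):
        now = self.clock()
        hits = self._hits[client_id]
        while hits and now - hits[0] > self.window_s:
            hits.popleft()                # forget requests outside the window
        if len(hits) < self.max_requests:
            hits.append(now)
            return True
        return False                      # over the limit: throttle or block

# Fake clock so the example is deterministic.
t = [0.0]
limiter = RateLimiter(max_requests=3, window_s=1.0, clock=lambda: t[0])
print([limiter.allow("10.0.0.1") for _ in range(4)])  # [True, True, True, False]
```

A determined scraper can evade this by slowing down or rotating IP addresses, which illustrates why the thesis finds no watertight defense.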

  • 67.
    Andersson, Måns
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Data Compression for use in the Short Messaging System (2010). Independent thesis, basic level (Bachelor's degree), student thesis (degree project).
    Abstract [sv]

    Data compression is a broad field with a large number of different algorithms. Not all algorithms are suitable for all situations, and this report mainly looks at the compression of small files in the range of 100-300 bytes, intended to be sent compressed over SMS. The compression ratios of a number of well-known algorithms are tested, and two of them, Algorithm Λ and adaptive arithmetic coding, are selected, studied more closely and implemented in Java. These implementations are then tested together with previously tested implementations, and one of the algorithms is selected to answer the question "Which compression algorithm is best suited to compressing data for use in SMS messages?".
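The difficulty of the 100-300 byte range can be illustrated with any general-purpose compressor: fixed header and model start-up overhead dominate at these sizes. Here zlib stands in purely for illustration; the thesis itself compares Algorithm Λ and adaptive arithmetic coding.

```python
# Why small files are hard to compress: compare the compressed/original
# size ratio of a single short message against the same text repeated.
import zlib

def ratio(data: bytes) -> float:
    """Compressed size / original size (lower is better)."""
    return len(zlib.compress(data, 9)) / len(data)

sms = ("Meet me at the station at five. Bring the tickets and the keys, "
       "and text me when you leave the office.").encode()  # ~100 bytes
large = sms * 100                                          # ~10 kB

print(len(sms), ratio(sms), ratio(large))
```

The short message barely compresses (its ratio stays near 1), while the repetitive large input shrinks dramatically; adaptive coders with low start-up cost are attractive for SMS precisely because of this small-input effect.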

  • 68.
    Andersson, Patrik
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Voxelbaserad rendering med "Marching Cubes"-algoritmen2009Självständigt arbete på grundnivå (kandidatexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    There are several methods and techniques for three-dimensional rendering, each with different advantages and disadvantages suited to different applications. Voxel-based rendering has been used extensively in scientific fields, primarily in medicine for visualizing volumetric data. The technique is now used in several areas of three-dimensional rendering, e.g. in computer games, in mathematical applications and in geological reconstruction. This report examines voxel-based rendering with the Marching Cubes algorithm to see how well it suits real-time applications. The subject is treated both theoretically and practically: an implementation of Marching Cubes was made and a number of tests were run to see how performance was affected. The tests clearly showed that the algorithm is well suited for real-time applications and today's graphics cards, although some optimization is required to make the best use of it.
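    As an illustration of the algorithm's core step (a hypothetical sketch, not the thesis implementation): Marching Cubes classifies each of a cube's eight corners against an iso-value, producing an 8-bit case index that selects one of 256 precomputed triangulations. The sphere field below is an invented example.

    ```python
    def sphere_field(x, y, z, r=1.0):
        """Scalar field: negative inside a sphere of radius r, positive outside."""
        return x * x + y * y + z * z - r * r

    # Corner offsets of a unit cube, in the conventional Marching Cubes order.
    CORNERS = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
               (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]

    def case_index(field, origin, size, iso=0.0):
        """Return the 0..255 case index for one cube of the voxel grid."""
        index = 0
        for bit, (dx, dy, dz) in enumerate(CORNERS):
            x = origin[0] + dx * size
            y = origin[1] + dy * size
            z = origin[2] + dz * size
            if field(x, y, z) < iso:   # this corner lies inside the surface
                index |= 1 << bit
        return index

    # A cube entirely inside the sphere yields 255, entirely outside yields 0;
    # anything in between intersects the iso-surface and needs triangles.
    ```

    The full algorithm then looks the index up in an edge/triangle table and interpolates vertex positions along the crossed edges; the classification above is the part that runs once per cube and dominates the grid traversal cost.
    
    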

  • 69.
    Andersson, Patrik
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Johansson, Sakarias
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Rendering with Marching Cubes, looking at Hybrid Solutions2012Självständigt arbete på grundnivå (kandidatexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Marching Cubes is a rendering technique that has advantages in many areas. It represents scalar fields as a three-dimensional mesh and is used in geographical as well as scientific applications, mainly in the medical industry to render medical data of the human body. It is also an interesting technique to explore for use in computer games and other real-time applications, since it can produce some very interesting renderings. The main focus of this paper is to present a novel hybrid solution using marching cubes and heightmaps to render terrain, and to determine whether it is suitable for real-time applications. The paper takes a theoretical as well as an implementational approach to the hybrid solution. The results across several tests for different scenarios show that the hybrid solution works well for today's real-time applications using a modern graphics card and CPU (Central Processing Unit).

  • 70.
    Andersson, Petter
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Ericsson, Eric
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Jämförelse av funktionsbibliotek för JavaScript2011Självständigt arbete på grundnivå (kandidatexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    As traditional desktop applications are turned into web applications, demands are placed on JavaScript, which is used more and more to produce responsive interfaces on the web. To ease the development of JavaScript applications, a number of function libraries have been created. In our study we therefore investigate which of the two most popular JavaScript libraries today, jQuery and Prototype, performs best in today's most used browsers. The tests were run in a test framework that we developed ourselves to be browser-independent and not to depend on any of the libraries under test. The tests are divided into four test cases, each run 20 times to give a more reliable result. We tested how each library handles traversal and manipulation of the DOM tree, sets and gets styles and attributes on elements in the DOM tree, and handles events on elements in the DOM tree. The tests showed that Prototype performed better on all but one test case in the majority of our selected browsers; the only test case where jQuery performed better than Prototype was manipulation of the DOM tree. Although Prototype is not nearly as widely discussed as jQuery, it appears to be the better library for web applications with interactive interfaces, as it performed better in most of our tests.
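    The measurement methodology the abstract describes (several test cases, each repeated 20 times, results compared per implementation) can be sketched generically; this is an illustrative harness in Python, not the authors' browser framework, and the two compared functions are invented stand-ins.

    ```python
    import time
    from statistics import median

    def benchmark(fn, runs=20):
        """Time fn() over a fixed number of runs; return the median in ms.

        The median is less sensitive than the mean to one-off stalls
        (GC pauses, warm-up), which matters for small, noisy timings like
        those produced by browser micro-benchmarks.
        """
        samples = []
        for _ in range(runs):
            start = time.perf_counter()
            fn()
            samples.append((time.perf_counter() - start) * 1000.0)
        return median(samples)

    # Example: compare two candidate implementations of the same operation.
    build_comprehension = lambda: [i * i for i in range(10_000)]
    build_map = lambda: list(map(lambda i: i * i, range(10_000)))

    print(f"list comprehension: {benchmark(build_comprehension):.3f} ms")
    print(f"map/lambda:         {benchmark(build_map):.3f} ms")
    ```

    Keeping the harness independent of the code under test, as the thesis does with its library-free framework, avoids the measured libraries influencing their own timings.
    
    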

  • 71.
    Ansari, Rehan Javed.
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Dodda, Sandhya Rani.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    The Use of SCRUM in Global Software Development – An Empirical Study2010Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    The trend toward global software development is increasing day by day. Global software development involves developing software across globally distributed sites, bringing in knowledge about the market. Several challenges have an impact on developing software globally. In this study we investigate the management challenges faced in globally distributed projects and the Scrum practices implemented by organizations to address them. We also study the benefits of implementing Scrum. For our research, we performed a literature review to identify the various challenges in managing globally distributed software projects and the Scrum practices discussed in the literature. We conducted industrial case studies to find out the challenges organizations face in globally distributed projects, the Scrum practices they follow to overcome those challenges, and the benefits of implementing Scrum in GSD. To provide quantitative support for the management challenges and Scrum practices discussed in the literature review, surveys were conducted. We used grounded theory to analyze the data gathered during the study. Several challenges facing organizations developing software globally, as well as several Scrum practices, were identified from the interviews. A few challenges remain to be addressed in future research.

  • 72.
    Antkowiak, Łukasz
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Parallel algorithms of timetable generation2013Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Context: Most school timetable generation problems belong to the class of NP-hard problems. Their complexity and practical value make this type of problem a research subject of particular interest for parallel processing. Objectives: This paper focuses on the class-teacher problem with weights for individual time slots, and on showing that this problem is NP-complete. Branch-and-bound schemes and two methods for distributing a simulated annealing algorithm are presented. An empirical analysis of the described methods was carried out in a primary school computer laboratory. Methods: An implementation of a simulated annealing algorithm described in the literature was adapted to the selected problem and to distributed systems. The empirical evaluation was conducted with real data from a Polish primary school. Results: The proposed branch-and-bound scheme scales nearly logarithmically with the number of nodes in a computer cluster. The proposed simulated annealing algorithm improves the quality of the solutions. Conclusions: Despite a considerable increase in computing power, school computer rooms are not suited to advanced computation, and applying the proposed branch-and-bound method to practical problems is infeasible in practice. Of the other proposed methods, Parallel Moves gives better results early in the execution, but Multiple Independent Runs finds better solutions after a certain time.

  • 73.
    Anwar, Khurshid
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Khan, Asad
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    AC and QAR for Provisioning of QoS in MANETs2010Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    The literature study shows that the performance of network-layer best-effort protocols has been improved by using QAR and AC protocols to sustain the QoS requirements of applications. In the current literature, AC and QAR protocols satisfy only a single QoS metric, while applications such as multimedia require several types of Quality of Service (QoS) assurances from the network. The simulation results show that DSR performs better than AODV under lower traffic loads.

  • 74.
    Anwar, Mahwish
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Virtual Firewalling For Migrating Virtual Machines In Cloud Computing2013Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Context. Cloud Computing (CC) uses virtualization to provide computing resources on demand via the Internet. Small and large organizations benefit from CC because of reduced operating costs and increased business agility. A migrating Virtual Machine (VM) is vulnerable to attacks such as fake migration initiations, service interruptions, manipulation of data and other network attacks. During live migration, any laxity in the VM firewall policy can put the VM's data, OS and applications at risk. A malicious VM can pose a threat to other VMs on its host and, consequently, to VMs in the LAN. Hardware firewalls only protect a VM before and after migration, and they are blind to virtual traffic. Hence, virtual firewalls (VFs) are used to secure VMs. Mostly, they are deployed at the Virtual Machine Monitor (VMM) level, under the Cloud provider's control. A source VMM-level VF secures the VM before migration begins, and the destination VMM-level VF starts securing the VM after migration is completed. It thus becomes possible for an attacker to use the intermediate migration window to launch attacks on the VM. Considering the potential of VFs, there should be great value in using open source VFs at the VM level to protect VMs during migration, thereby reducing the attacker's window for gaining access to the VM. This would enable hardened security for the overall VM migration. Objectives. The aim is to investigate VM-level firewalling using an open source firewall as a complementary security layer to VMM-level firewalling, to secure a migrating VM in the CC domain. The first objective is to identify how virtual firewalls secure a migrating VM in CC and to propose VM-level open source virtual firewalling for protecting the VM during migration. The VF is then implemented to validate and evaluate its intactness, or activeness, during migration in a real Cloud data center. Methods. In the literature review, 9 electronic libraries are used, including IEEE Xplore, ACM Digital Library, SCOPUS, Engineering Village and Web of Knowledge. Studies are selected after querying the libraries for the two key terms 'virtual machine' and 'migration' (along with other variations/synonyms) in the abstract. Relevant papers on the subject are read and analyzed, and the information gaps are identified. Based on one such gap, the experimental solution is designed. To test the potential of a VM-level VF for the migrating VM's security, experimental validation is performed using stratified samples of firewall rules. The VF evaluation is done using continuous ICMP echo packet transmission; the packets are analyzed to determine firewall behavior during migration. To evaluate validity, the VM migration is performed 8 times in the City Network data center. Results. The literature review identified the widespread use of VMM-level firewalling for migrating VMs' security in CC; VM-level VFs had been neither researched nor evaluated for intactness during migration. The experiment performed at City Network demonstrated that the VM-level VF secures the VM, on average, for 96% of the migration time, thereby reducing the attack window during VM mobility. According to the results, the average total migration time (TMT) was 16.6 s and the average downtime (DT) of the firewall was as low as 0.47 s, which means that a VF at the VM level protects the VM during the entire migration span except while the VM is down (4% of the migration time). Conclusions. The research concludes that VM-level firewalling using an open source VF as an additional security layer for VM migrations in CC is feasible to employ and will enhance the migrating machine's security by providing hardened firewall service during the migration process, thus reducing the potential attack window. The VMM-level VF provides security in the pre- and post-migration phases; using a VM-level VF as a complementary measure enables additional protection during the migration process itself, thereby reducing the chances for an attacker to attack the VM in transition.

  • 75.
    Anwar, Naveed
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Kwoka, Adam
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Web Site Usability, Technical and Social Perspectives2012Självständigt arbete på avancerad nivå (magisterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Human factors and usability issues have traditionally played a limited role in security research and secure systems development. System designers have disregarded usability concerns either because they are not acquainted with them or because they do not take the importance of human factors into account. Addressing issues of usability and human factors could be an important part of the solution to today's security problems. There is increasing agreement that we need to design secure systems that people can actually use, but less agreement about how to reach this goal.

  • 76. Apell, Maria
    et al.
    Erman, David
    Popescu, Adrian
    Testbed for Advanced Mobile Solutions2010Konferensbidrag (Refereegranskat)
    Abstract [en]

    This paper describes the implementation of an IMS testbed, based on open source technologies and operating systems. The testbed provides rich communication services, i.e., Instant Messaging, Network Address Book and Presence, as well as VoIP and PSTN interconnectivity. Our validation tests indicate that the performance of the testbed is comparable to similar testbeds, but that operating system virtualization significantly affects signalling delays.

  • 77. Arkoulis, Stamatios
    et al.
    Marias, Giannis
    Frangoudis, Pantelis
    Oberender, Jens
    Popescu, Alexandru
    Fiedler, Markus
    Meer, Hermann de
    Polyzos, George
    Misbehaviour Scenarios in Cognitive Radio Networks2010Ingår i: Future Internet, ISSN 1999-5903, E-ISSN 1999-5903, Vol. 2, nr 3-4, s. 212-237Artikel, forskningsöversikt (Refereegranskat)
    Abstract [en]

    Recent advances in the fields of Cognitive Radio and the proliferation of open spectrum access promise that spectrum-agile wireless communication will be widespread in the near future, and will bring significant flexibility and potential utility improvements for end users. With spectrum efficiency being a key objective, most relevant research focuses on smart coexistence mechanisms. However, wireless nodes may behave selfishly and should be considered as rational autonomous entities. Selfishness, pure malice or even faulty equipment can lead to behavior that does not conform to sharing protocols and etiquette. Thus, there is a need to secure spectrum sharing mechanisms against attacks in the various phases of the sharing process. Identifying these attacks and possible countermeasures is the focus of this work.

  • 78. Arlos, Patrik
    Application Level Measurement2011Ingår i: Network Performance Engineering: A Handbook on Convergent Multi-Service Networks and Next Generation Internet, Berlin / Heidelberg: Springer , 2011, s. 14-36Kapitel i bok, del av antologi (Övrigt vetenskapligt)
    Abstract [en]

    In some cases, application-level measurements can be the only way for an application to get an understanding of the performance offered by the underlying network(s). It can also be that an application-level measurement is the only practical solution to verify the availability of a particular service. Hence, as more and more applications perform measurements of various networks, be they fixed or mobile, it is crucial to understand the context in which application-level measurements operate, together with their capabilities and limitations. To this end, in this paper we discuss some of the fundamentals of computer network performance measurements, and in particular the key aspects to consider when using application-level measurements to estimate network performance properties.

  • 79. Arlos, Patrik
    et al.
    Fiedler, Markus
    Influence of the Packet Size on the One-Way Delay in 3G Networks2010Konferensbidrag (Refereegranskat)
    Abstract [en]

    We currently observe a rising interest in mobile broadband, which users expect to perform in a similar way as its fixed counterpart. On the other hand, the capacity allocation process on mobile access links is far less transparent to the user; still, its properties need to be known in order to minimize the impact of the network on application performance. This paper investigates the impact of the packet size on the minimal one-way delay for the uplink in third-generation mobile networks. For interactive and real-time applications such as VoIP, one-way delays are of major importance for user perception; however, they are challenging to measure due to their sensitivity to clock synchronization. Therefore, the paper applies a robust and innovative method to assure the quality of these measurements. Results from measurements from several Swedish mobile operators show that applications can gain significantly in terms of one-way delay from choosing optimal packet sizes. We show that, in certain cases, an increased packet size can improve the one-way delay performance at best by several hundred milliseconds.

  • 80. Arlos, Patrik
    et al.
    Fiedler, Markus
    Influence of the Packet Size on the One-Way Delay on the Down-link in 3G Networks2010Konferensbidrag (Refereegranskat)
    Abstract [en]

    We evaluate how the downlink OWD in 3G (WCDMA/HSDPA) networks is affected by the packet size. We report data for three Swedish operators.

  • 81. Arlos, Patrik
    et al.
    Kommalapati, Ravichandra
    Fiedler, Markus
    Evaluation of Protocol Treatment in 3G Networks2011Konferensbidrag (Refereegranskat)
    Abstract [en]

    In this work, we present a systematic study of how the traffic of different transport protocols (UDP, TCP and ICMP) is treated in three operational Swedish 3G networks. This is done by studying the impact that protocol and packet size have on the one-way delay (OWD) across the networks. We do this using a special method that allows us to calculate the exact OWD without having to face the usual clock synchronization problems normally associated with OWD calculations. From our results we see that all three protocols are treated similarly by all three operators when we consider packet sizes smaller than 250 bytes and larger than 1100 bytes. We also show that larger packet sizes are given preferential treatment, with both a smaller median OWD and a smaller standard deviation. It is also clear that ICMP is given better performance than TCP and UDP.
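    For contrast with the paper's finding (a hypothetical baseline, not the authors' measurement method; all link parameters are invented): the textbook first-order model predicts that OWD *grows* with packet size through per-hop serialization, which makes the measured preferential treatment of large packets in 3G networks notable.

    ```python
    def naive_owd_ms(packet_bytes, link_capacity_bps, propagation_ms, hops=1):
        """First-order OWD estimate: propagation plus per-hop serialization.

        This is only the textbook baseline; the paper's measurements show
        that 3G networks deviate from it, with larger packets seeing a
        *smaller* median OWD due to how transport channels are allocated.
        """
        serialization_ms = hops * (packet_bytes * 8 / link_capacity_bps) * 1000.0
        return propagation_ms + serialization_ms

    # Baseline expectation on an assumed 384 kbit/s channel, 40 ms propagation:
    for size in (64, 250, 1100, 1400):
        print(size, "bytes ->", round(naive_owd_ms(size, 384_000, 40.0), 2), "ms")
    ```

    Under this model OWD is strictly increasing in packet size, so any measured decrease points to scheduling or channel-allocation effects rather than pure transmission physics.
    
    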

  • 82.
    Arslan, Muhammad
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Riaz, Muhammad Assad
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    A Roadmap for Usability and User Experience Measurement during early phases of Web Applications Development2010Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Usability and User Experience (UX) play a crucial role in the success or failure of web applications. However, usability and UX measurement during the software life cycle poses many challenges. Based on a systematic literature review, this thesis discusses current usability and user experience evaluation and measurement methods and defined measures, as well as their applicability during the software development life cycle. The challenges in using these methods were also identified. To elaborate further on the challenges, we conducted informal interviews within a software company. Based on the results, we defined a usability and user experience measurement and evaluation roadmap for web application development companies. The roadmap contains a set of usability evaluation and measurement methods, as well as the measures we found suitable for use during the early phases (requirements, design and development) of the web application development life cycle. To validate the applicability of the defined roadmap, a case study was performed on a real-time, market-oriented real-estate web application. The results, a discussion of the results, and future research directions are presented.

  • 83.
    Aruchamy, Logabharathi
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Analysis of Radio Access Network Buffer Filling Based on Real Network Data2012Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    The 3G and 4G networks have drastically improved availability and quality in data transmission for bandwidth-hungry services such as video streaming and location-based services. As 3G networks are very widely deployed, capacity requirements increase and transport channels must be allocated to simultaneous users within a particular cell. As a result, adequate resources are not always available, which in turn degrades both service quality and user-experienced quality. This research aims at understanding the characteristics of buffer filling during dedicated channel (DCH) transmission under fixed bit-rate assumptions on a per-user level, taking different services into consideration. Furthermore, the resource utilisation in terms of empty-buffer durations and the user throughput achieved during dedicated channel transmission are analysed for different data services existing in mobile networks. The traces are collected from a real network, and the characteristics of the traffic are analysed prior to studying its buffer filling in the Radio Network Controller (RNC) during downlink data transmission. Furthermore, the buffer is modelled under a series of assumptions on channel bit-rates, and simulations are performed for a single-user scenario, for different services, using the obtained traces as input to the buffer. This research helps in understanding the RNC buffer filling for different services, in turn shedding light on the existing transport channel switching scenario. By analysing the buffer filling for different services and the transport channel utilisation, we learn that most data services show a low DCH utilisation of approximately 20%, with an empty buffer for 80% of the total DCH session duration, indicating sub-optimal radio resource utilisation.
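    The buffer behaviour the abstract reports (around 20% DCH utilisation, 80% empty-buffer time for bursty services) can be reproduced qualitatively with a toy single-user model; this is an assumed sketch, not the thesis simulator, and all traffic figures below are invented.

    ```python
    def simulate_dch_buffer(arrivals, rate_bytes_per_tick, ticks):
        """Toy single-user RNC downlink buffer drained at a fixed channel rate.

        arrivals: dict mapping tick -> bytes arriving at that tick.
        Returns (fraction of ticks the buffer was empty, channel utilisation).
        """
        buffered = 0
        empty_ticks = 0
        sent_total = 0
        for t in range(ticks):
            buffered += arrivals.get(t, 0)
            if buffered == 0:
                empty_ticks += 1          # channel allocated but nothing to send
            sent = min(buffered, rate_bytes_per_tick)
            buffered -= sent
            sent_total += sent
        utilisation = sent_total / (rate_bytes_per_tick * ticks)
        return empty_ticks / ticks, utilisation

    # Bursty web-like traffic on an over-dimensioned channel: two 3000-byte
    # bursts over 100 ticks at 1000 bytes/tick give a mostly-empty buffer and
    # low utilisation -- the pattern the thesis reports for most data services.
    empty_frac, util = simulate_dch_buffer({0: 3000, 50: 3000}, 1000, 100)
    ```

    A model like this makes the channel-switching question concrete: long empty-buffer stretches are exactly the periods in which a dedicated channel could be released or downgraded.
    
    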

  • 84.
    Arvidsson, Mattias
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Säkerhetsmedvetenhet: En kvantitativ studie beträffande mätning av säkerhetsmedvetenhet hos medarbetare2016Självständigt arbete på grundnivå (kandidatexamen), 10 poäng / 15 hpStudentuppsats (Examensarbete)
    Abstract [en]

    This report aims to answer three questions: can security awareness be measured, which variables affect security awareness, and are the differences in those variables significant. The study was carried out in one region, within the occupational category of postal carriers, at the company Posten Meddelande AB. To answer the three questions, an extensive questionnaire survey was conducted at ten workplaces, yielding a data set of 164 responses. The data was aggregated into an index intended to describe how good the respondents' security awareness is. The security awareness index was then analyzed with various correlation analyses to see whether the independent variables affect the result. In total, seven independent variables were analyzed, five of which affected the result. Finally, these five were tested with mean and variance analyses to see whether the differences between the groups are significant. The results show that the differences are significant, essentially all of them with high reliability. The conclusion of these analyses is that older employees who have worked at Posten Meddelande for a long time and have undergone security training have statistically significantly better security awareness. Since age and experience cannot be obtained in the short term, the author considers the most important success factor for achieving good security awareness to be that everyone in the company should undergo security training.

  • 85.
    Aryal, Dhiraj
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Shakya, Anup
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    A Taxonomy of SQL Injection Defense Techniques2011Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Context: SQL injection attack (SQLIA) poses a serious threat to web applications by allowing attackers to gain unhindered access to the underlying databases containing potentially sensitive information. A lot of methods and techniques have been proposed by different researchers and practitioners to mitigate the SQL injection problem. However, deploying those methods and techniques without a clear understanding can induce a false sense of security. Classification of such techniques would provide great assistance in getting rid of that false sense of security. Objectives: This paper is focused on classifying such techniques by building a taxonomy of SQL injection defense techniques. Methods: A systematic literature review (SLR) is conducted using five reputed and familiar e-databases: IEEE, ACM, Engineering Village (Inspec/Compendex), ISI Web of Science and Scopus. Results: 61 defense techniques are found, and based on these techniques a taxonomy of SQL injection defense techniques is built. Our taxonomy consists of various dimensions which can be grouped under two higher-order terms: detection method and evaluation criteria. Conclusion: The taxonomy provides a basis for comparison among different defense techniques. Organizations can use our taxonomy to choose suitable ones depending on their available resources and environments. Moreover, this classification can lead toward a number of future research directions in the field of SQL injection.
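    To make the threat concrete (an illustrative sketch, not drawn from the thesis; the table and payload are invented), here is the canonical attack against string-concatenated SQL, alongside parameterized queries, one widely known technique from the defensive-coding family such taxonomies cover.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    attacker_input = "' OR '1'='1"

    # Vulnerable: string concatenation lets the input rewrite the WHERE clause,
    # turning it into  name = '' OR '1'='1'  which matches every row.
    vulnerable = conn.execute(
        "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
    ).fetchall()

    # Defense: a parameterized query keeps the input as data, never as SQL,
    # so the payload is compared literally against the name column.
    safe = conn.execute(
        "SELECT secret FROM users WHERE name = ?", (attacker_input,)
    ).fetchall()

    print(vulnerable)  # the injected predicate leaks the row
    print(safe)        # no match: the payload is treated as a literal name
    ```

    Parameterization is only one dimension of defense; taxonomies like the one above also distinguish detection-time techniques (static analysis, runtime monitoring) from such prevention-time coding practices.
    
    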

  • 86.
    Asghar, Gulfam
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Azmi, Qanit Jawed
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Security Issues of SIP2010Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Voice over IP (VoIP) services based on the Session Initiation Protocol (SIP) have gained much attention compared to other protocols like H.323 or MGCP over the last decade. SIP is the favorite signaling protocol for current and future IP telephony services, and it is also becoming a real competitor to traditional telephony services. However, the open architecture of SIP leaves the provided services vulnerable to different types of security threats, similar in nature to those currently existing on the Internet. For this reason, there is an obvious need to provide some kind of security mechanism for SIP-based VoIP implementations. In this research, we discuss the security threats to SIP and highlight the related open issues. Although there are many threats to SIP security, we focus mainly on session hijacking and DoS attacks. We demonstrate these types of attacks by introducing a model/practical test environment. We also analyze the effect and performance of some of the proposed solutions, namely the use of Network Address Translation (NAT), IPSec, Virtual Private Networks (VPNs) and firewalls (IDS/IPS), with the help of a test scenario.

  • 87.
    Asghari, Negin
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Evaluating GQM+ Strategies Framework for Planning Measurement System2012Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Context. Most organizations are aware of the significance of software measurement programs in helping them assess and improve the way they develop software. Measurement plays a vital role in improving software processes and products. However, the number of failing measurement programs is high, and the reasons vary. A recent approach for planning measurement programs is GQM+Strategies, which makes an important extension to existing approaches: it links measurements and improvement activities to strategic goals and to the ways these goals are to be achieved. However, concrete guidance on how to collect the information needed to use GQM+Strategies has not yet been provided in the literature. Objectives. The contribution of this research is to propose and assess an elicitation approach (the Goal Strategy Elicitation (GSE) approach) for the information needed to apply GQM+Strategies in an organization, which also leads to a partial evaluation of GQM+Strategies as such. In this thesis, the initial focus is placed on eliciting the goals and strategies in the most efficient way. Methods. The primary research approach used is action research, which allows a new method or technique to be assessed flexibly and iteratively, with the feedback from one iteration taken into the next, thus improving the proposed method or technique. Complementary to that, we used a literature review with the primary focus of positioning the work, exploring GQM+Strategies, and determining which elicitation approaches for the support of measurement programs have been proposed. Results. The Goal Strategy Elicitation (GSE) approach, a tool for eliciting goals and strategies within a software organization to support the planning of a measurement program, has been developed. The iterations showed that the elicitation approach should not be too structured (e.g. template/notation based), but should rather support the stakeholders in expressing their thoughts relatively freely. Hence, the end result was an interview guide, not based on notations (as in the first iteration), that asks questions in a way that lets interviewees express themselves easily without having to, for example, distinguish definitions of goals from strategies. Conclusions. We conclude that the GSE approach is a strong tool for a software organization to elicit the goals and strategies needed to support GQM+Strategies. The GSE approach evolved in each iteration; the latest iteration, together with the guideline, is still used within the studied company for eliciting goals and strategies, and the organization has acknowledged that it will continue to do so. Moreover, we conclude that there is a need for further empirical validation of the GSE approach in full-scale industry trials.

  • 88.
    Ashfaq, Rana Aamir Raza
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Khan, Mohammad Qasim
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Analyzing Common Criteria Shortcomings to Improve its Efficacy2009Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Information security has become a key concern for organizations conducting business in the current electronic era. Rapid technological development continuously creates novel security threats, making IT an uncertain infrastructure. Security is thus an important factor for vendors as well as consumers. To fulfill security needs, IT companies have to adopt standards that assure certain levels of security in their products. The Common Criteria (CC) is one of the standards that maintain and control the security of IT products. Many other standards are also available to assure security in products, but like them, CC has its own pros and cons. It does not impose predefined security rules that a product should exhibit, but rather provides a language for security evaluation. CC has certain advantages due to its ability to address all three dimensions: a) it provides an opportunity for users to specify their security requirements, b) it serves as an implementation guide for developers, and c) it provides comprehensive criteria for evaluating the security requirements. On the downside, it requires a considerable amount of resources and is quite time-consuming. Another drawback is that the security requirements it evaluates must be defined before the project starts, which is in direct conflict with the rapidly changing security threat environment. In this research thesis we analyze the core issues and find the major causes of the criticism. Many IT users in the USA and UK have reservations about CC evaluation because of its limitations. We analyze and document the CC shortcomings, which will be useful for researchers seeking an overview of the shortcomings associated with CC. This study will potentially strengthen CC usage with a more effective and responsive evaluation methodology for the IT community.

  • 89.
    Ashraf, Imran
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Khokhar, Amir Shahzed
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Principles for Distributed Databases in Telecom Environment2010Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [sv]

    Centralized databases become a bottleneck for organizations that are physically distributed and access data remotely. Data management is easy in centralized databases, but they carry high communication costs and, most importantly, high response times. The concept of distributing data across different sites is very attractive for such organizations. In this case the database is split into fragments that are distributed to the sites where they are needed. This kind of distribution provides local control of the data, and data access is also very fast in such databases. However, concurrency control, query optimization and data allocation are factors that affect response time and must be investigated before implementing distributed databases. This thesis uses a mixed-method approach to reach its goal. In the quantitative part, we performed an experiment comparing the response times of two databases, one centralized and one fragmented/distributed. The experiment was performed at Ericsson. A literature review was conducted to identify other important issues related to response time, such as query optimization, concurrency control and data allocation. The literature review showed that these factors can further improve response time in a distributed environment. The results of the experiment showed a significant reduction in response time due to fragmentation and distribution.

  • 90.
    Ataeian, Seyed Mohsen
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Darbandi, Mehrnaz Jaberi
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Analysis of Quality of Experience by applying Fuzzy logic: A study on response time2011Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    To be successful in today's competitive market, service providers should regard user satisfaction as a critical key. In order to gain a better understanding of customers' expectations, a proper evaluation that considers the intrinsic characteristics of perceived quality of service is needed. Due to the subjective nature of quality, the vagueness of human judgment and the uncertainty about the degree of users' linguistic satisfaction, fuzziness is associated with quality of experience. Considering the capability of fuzzy logic in dealing with imprecision and qualitative knowledge, it is well suited as a powerful mathematical tool for analyzing the quality of experience (QoE). This thesis proposes a fuzzy procedure to evaluate the quality of experience. In our proposed methodology, we provide a fuzzy relationship between QoE and Quality of Service (QoS) parameters. To identify this fuzzy relationship, a new term called the Fuzzified Opinion Score (FOS), representing a fuzzy quality scale, is introduced. A fuzzy data mining method is applied to construct the required number of fuzzy sets. Then, the appropriate membership functions describing the fuzzy sets are modeled and compared with each other. The proposed methodology will assist service providers in better decision-making and resource management.
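The mapping from a measured QoS parameter (here: response time) to fuzzy quality sets can be illustrated with triangular membership functions. This is only a sketch of the general idea, not the FOS construction from the thesis; the breakpoints (0 s "good", 2 s "fair", 5 s "poor") and the 1-5 anchor scores are invented for illustration:

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def qoe_memberships(response_time_s: float) -> dict[str, float]:
    """Degree of membership of a response time in three fuzzy quality sets.
    The breakpoints are illustrative assumptions, not the thesis' values."""
    return {
        "good": tri(response_time_s, -1.0, 0.0, 2.0),
        "fair": tri(response_time_s, 0.5, 2.0, 5.0),
        "poor": tri(response_time_s, 2.0, 5.0, 100.0),
    }

def fuzzified_opinion_score(response_time_s: float) -> float:
    """Collapse the memberships into a crisp 1-5 opinion score
    (weighted mean -- a simple stand-in for the FOS construction)."""
    anchors = {"good": 5.0, "fair": 3.0, "poor": 1.0}
    m = qoe_memberships(response_time_s)
    total = sum(m.values())
    return sum(anchors[k] * v for k, v in m.items()) / total if total else 1.0
```

A fast response (0.2 s) is fully "good" and scores near 5, while a slow one (4 s) sits between "fair" and "poor" and scores below 2.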

  • 91.
    Awan, Nasir Majeed
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Alvi, Adnan Khadem
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Predicting software test effort in iterative development using a dynamic Bayesian network2010Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    It is important to manage iterative projects in a way that maximizes quality and minimizes cost. To achieve high quality, accurate project estimates are of high importance. It is challenging to predict the effort required to perform test activities in iterative development: if testers put extra effort into testing, the schedule might be delayed, while if they spend less effort, quality could suffer. Currently there is no model for test effort prediction in iterative development to overcome such challenges. This paper introduces and validates a dynamic Bayesian network to predict test effort in iterative software development. In this research work, the proposed framework is evaluated in a number of ways: first, the framework behavior is observed by considering different parameters and performing an initial validation; second, the framework is validated by incorporating data from two industrial projects. The accuracy of the results has been verified through different prediction accuracy measurements and statistical tests. The results of the verification confirmed that the framework has the ability to predict test effort in iterative projects accurately.
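The core mechanism of a dynamic Bayesian network, a belief over the required test effort that is updated as evidence arrives in each iteration, can be reduced to the discrete Bayes rule. A minimal sketch; the states, the evidence variable and all numbers are illustrative assumptions, not the framework from the paper:

```python
def bayes_update(prior: dict[str, float], likelihood: dict[str, float]) -> dict[str, float]:
    """Posterior proportional to prior times likelihood, renormalised."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Belief about the test effort the next iteration will need.
prior = {"low": 0.3, "medium": 0.5, "high": 0.2}
# Assumed probability of observing "many defects escaped into this iteration"
# under each required-effort level (illustrative numbers only).
many_defects = {"low": 0.1, "medium": 0.4, "high": 0.7}

belief = bayes_update(prior, many_defects)
# Feeding each iteration's observation back in as the new prior is what
# makes the network "dynamic".
belief = bayes_update(belief, many_defects)
```

After each observation of escaped defects, probability mass shifts from "low" towards "high" required effort.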

  • 92. Axelsson, Stefan
    The Normalised Compression Distance as a File Fragment Classifier2010Ingår i: Digital Investigation. The International Journal of Digital Forensics and Incident Response, ISSN 1742-2876, E-ISSN 1873-202X, Vol. 7, nr Suppl 1, s. S24-S31Artikel i tidskrift (Refereegranskat)
    Abstract [en]

    We have applied the generalised and universal distance measure NCD—Normalised Compression Distance—to the problem of determining the type of file fragments. To enable later comparison of the results, the algorithm was applied to fragments of a publicly available corpus of files. The NCD algorithm, in conjunction with k-nearest-neighbour (k ranging from one to ten) as the classification algorithm, was applied to a random selection of circa 3000 512-byte file fragments from 28 different file types. This procedure was then repeated ten times. While the overall accuracy of the n-valued classification only improved the prior probability from approximately 3.5% to circa 32%–36%, the classifier reached accuracies of circa 70% for the most successful file types. A prototype of a file fragment classifier was then developed and evaluated on a new set of data (from the same corpus). Circa 3000 fragments were selected at random and the experiment repeated five times. This prototype classifier remained successful at classifying individual file types, with accuracies ranging from only slightly lower than 70% for the best class down to accuracies similar to those in the prior experiment.
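The classifier described above combines two compact ingredients: the NCD formula NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is compressed length, and a k-nearest-neighbour vote. A minimal sketch, not the paper's implementation: zlib as the compressor and the two synthetic "file types" are assumptions for illustration.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalised Compression Distance:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

def knn_classify(fragment: bytes, training: list[tuple[bytes, str]], k: int = 3) -> str:
    """Label a fragment by majority vote among its k NCD-nearest training fragments."""
    nearest = sorted(training, key=lambda item: ncd(fragment, item[0]))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

if __name__ == "__main__":
    # Two synthetic "file types": English-like text and a hex-digit stream.
    text = b"The quick brown fox jumps over the lazy dog. " * 40
    hexs = b"0123456789abcdef" * 120
    training = [(text[:512], "text"), (hexs[:512], "hex")]
    print(knn_classify(text[700:1212], training, k=1))
```

Fragments drawn from the same stream compress well together (low NCD) and so end up in the same class, which is the whole intuition behind using compression for fragment typing.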

  • 93. Axelsson, Stefan
    Using Normalized Compression Distance for Classifying File Fragments2010Konferensbidrag (Refereegranskat)
    Abstract [en]

    We have applied the generalised and universal distance measure NCD (Normalised Compression Distance) to the problem of determining the types of file fragments via example. A corpus of files that can be redistributed to other researchers in the field was developed and the NCD algorithm using k-nearest-neighbour as the classification algorithm was applied to a random selection of file fragments. The experiment covered circa 2000 fragments from 17 different file types. While the overall accuracy of the n-valued classification only improved the prior probability of the class from approximately 6% to circa 50% overall, the classifier reached accuracies of 85%-100% for the most successful file types.

  • 94. Axelsson, Stefan
    et al.
    Baca, Dejan
    Feldt, Robert
    Sidlauskas, Darius
    Kacan, Denis
    Detecting Defects with an Interactive Code Review Tool Based on Visualisation and Machine Learning2009Konferensbidrag (Refereegranskat)
    Abstract [en]

    Code review is often suggested as a means of improving code quality. Since humans are poor at repetitive tasks, some form of tool support is valuable. To that end we developed a prototype tool to illustrate the novel idea of applying machine learning (based on Normalised Compression Distance) to the problem of static analysis of source code. Since this tool learns by example, it is trivially programmer-adaptable. As machine learning algorithms are notoriously difficult to understand operationally (they are opaque), we applied information visualisation to the results of the learner. In order to validate the approach we applied the prototype to source code from the open-source project Samba and from an industrial, telecom software system. Our results showed that the tool did indeed correctly find and classify problematic sections of code based on training examples.

  • 95.
    Axelsson, Stefan
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Bajwa, Kamran Ali
    Srikanth, Mandhapati Venkata
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    File Fragment Analysis Using Normalized Compression Distance2013Konferensbidrag (Refereegranskat)
    Abstract [en]

    The first step when recovering deleted files using file carving is to identify the file type of a block, also called file fragment analysis. Several researchers have demonstrated the applicability of Kolmogorov complexity methods such as the normalized compression distance (NCD) to this problem. NCD methods compare the results of compressing a pair of data blocks with the compressed concatenation of the pair. One parameter that is required is the compression algorithm to be used. Prior research has identified the NCD compressor properties that yield good performance. However, no studies have focused on its applicability to file fragment analysis. This paper describes the results of experiments on a large corpus of files and file types with different block lengths. The experimental results demonstrate that, in the case of file fragment analysis, compressors with the desired properties do not perform statistically better than compressors with less computational complexity.
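One of the compressor properties the NCD literature asks for is idempotency: C(xx) should be close to C(x). Whether a real compressor satisfies it can be probed directly; in the sketch below, zlib/bz2/lzma stand in for the compressors that were compared (the paper's actual compressor set is not reproduced here):

```python
import bz2
import lzma
import zlib

# Each entry maps a compressor name to a function returning compressed length.
COMPRESSORS = {
    "zlib": lambda b: len(zlib.compress(b, 9)),
    "bz2":  lambda b: len(bz2.compress(b, 9)),
    "lzma": lambda b: len(lzma.compress(b)),
}

def idempotency_gap(compressed_len, data: bytes) -> float:
    """Relative growth of C(xx) over C(x); close to 0 for a 'normal' compressor."""
    cx = compressed_len(data)
    return (compressed_len(data + data) - cx) / cx

if __name__ == "__main__":
    sample = b"Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 64
    for name, c in COMPRESSORS.items():
        print(f"{name}: idempotency gap = {idempotency_gap(c, sample):.3f}")
```

On compressible input all three gaps are small; the interesting cases in fragment analysis are short, high-entropy blocks, where block size and compressor overhead start to dominate.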

  • 96.
    Ayalew, Tigist
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Kidane, Tigist
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Identification and Evaluation of Security Activities in Agile Projects: A Systematic Literature Review and Survey Study2012Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    Context: Today’s software development industry requires high-speed software delivery from the development team. In order to do this, organizations make transformation from their conventional software development method to agile development method while preserving customer satisfaction. Even though this approach is becoming popular development method, from security point of view, it has some disadvantage. Because, this method has several constraints imposed such as lack of a complete overview of a product, higher development pace and lack of documentation. Although security-engineering (SE) process is necessary in order to build secure software, no SE process is developed specifically for agile model. As a result, SE processes that are commonly used in waterfall model are being used in agile models. However, there is a clash or disparity between the established waterfall SE processes and the ideas and methodologies proposed by the agile manifesto. This means that, while agile models work with short development increments that adapt easily to change, the existing SE processes work in plan-driven development setting and try to reduce defects found in a program before the occurrence of threats through heavy and inflexible process. This study aims at bridging the gap in agile model and security by providing insightful understanding of the SE process that are used in the current agile industry. Objectives: The objectives of this thesis are to identify and evaluate security activities from high-profile waterfall SE-process that are used in the current agile industry. Then, to suggest the most compatible and beneficial security activities to agile model based on the study results. Methods: The study involved two approaches: systematic literature review and survey. The systematic literature review has two main aims. 
The first aim is to gain a comprehensive understanding of security in an agile process model; the second one is to identify high-profile SE processes that are commonly used in waterfall model. Moreover, it helped to compare the thesis result with other previously done works on the area. A survey is conducted to identify and evaluate waterfall security activities that are used in the current agile industry projects. The evaluation criteria were based on the security activity integration cost and benefit provides to agile projects. Results: The results of the systematic review are organized in a tabular form for clear understanding and easy analysis. High-profile SE processes and their activities are obtained. These results are used as an input for the survey study. From the survey study, security activities that are used in the current agile industry are identified. Furthermore, the identified security activities are evaluated in terms of benefit and cost. As a result the best security activities, that are compatible and beneficial, are investigated to agile process model. Conclusions: To develop secure software in agile model, there is a need of SE-process or practice that can address security issues in every phase of the agile project lifecycle. This can be done either by integrating the most compatible and beneficial security activities from waterfall SE processes with agile process or by creating new SE-process. In this thesis, it has been found that, from the investigated high-profile waterfall SE processes, none of the SE processes was fully compatible and beneficial to agile projects.

  • 97.
    Ayichiluhm, Theodros
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Mohan, Vivek
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    IPv6 Monitoring and Flow Detection2013Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    IPv6 privacy extensions, implemented in major operating systems, hide the user's identity by using temporary, randomly generated IPv6 addresses rather than the former EUI-64 format, where the MAC address is part of the IPv6 address. This solution for privacy has created a problem for network administrators who need to back-trace an IPv6 address to a specific MAC address, since the temporary IP address used by the node is removed from the interface after a period of time. An IPv6 Ethernet test bed was set up to investigate the dynamics of the IPv6 implementations in the Windows 7 and Ubuntu 10.04 operating systems. The test bed was extended to investigate the effects of temporary IPv6 addresses, due to IPv6 privacy extensions, on the ongoing sessions of different applications, including ping, File Transfer Protocol (FTP) and video streaming (HTTP and RTP). On the basis of the knowledge obtained from these investigations, this work proposes Internet Protocol version 6 Host Tracking (IPv6HoT), a web-based IPv6-to-MAC mapping solution. IPv6HoT uses the Simple Network Management Protocol (SNMP) to forward the IPv6 neighbor tables from routers to Network Management Stations (NMS). This thesis provides guidelines for configuring IPv6 privacy extensions in Ubuntu 10.04 and Windows 7; the implementation differences between the two operating systems are also presented. The results show that temporary IPv6 addressing has a definite effect on the ongoing sessions of video streaming and FTP applications. Applications running as servers on temporary IPv6 addresses encountered more frequent session interruptions than those running on public IPv6 addresses; when temporary IPv6 addresses were configured to host FTP and video streaming applications, their ongoing sessions were permanently interrupted. It was also observed that LFTP, an FTP client application, resumes an interrupted session.
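The privacy-extensions behaviour that the thesis configures can be toggled with standard OS knobs. These are the commonly documented commands for the two platforms studied; exact defaults vary by OS version, so treat this as a sketch of the configuration rather than the thesis' exact setup:

```shell
# Linux (e.g. Ubuntu): 0 = EUI-64 addresses only, 1 = also generate RFC 4941
# temporary addresses, 2 = generate them AND prefer them for outgoing traffic
sudo sysctl -w net.ipv6.conf.all.use_tempaddr=2
sudo sysctl -w net.ipv6.conf.default.use_tempaddr=2

# Windows 7: toggle privacy extensions from an elevated prompt
netsh interface ipv6 set privacy state=enabled
netsh interface ipv6 set global randomizeidentifiers=enabled
```

Setting `use_tempaddr=0` (or `state=disabled` on Windows) restores the traceable EUI-64 behaviour that IPv6HoT's address-to-MAC mapping otherwise has to work around.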

  • 98.
    Ayub, Yasir
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Faruki, Usman
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Container Terminal Operations Modeling through Multi agent based Simulation2009Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [sv]

    Simulation is a good technique that helps analysts make decisions while considering each factor of the problem being simulated. Simulation in a multi-agent environment provides a better understanding of how to model the entities. The complexity of the container terminal (CT) environment and the simultaneous involvement of multiple agents make the CT a suitable domain for multi-agent systems. We have modeled the four CT operations that are carried out at each terminal. These operations are modeled in a hierarchical sequence: berth allocation, QC allocation, transport vehicle allocation and YC allocation. The most important aspect of the simulation is the measurement of the dynamic time of each operation. We have simulated and compared the active time and service time of different agents together with the associated cost. Berth allocation is the most important of the operations carried out at a CT, and effective use of an FCFS (first-come-first-served) berth allocation policy reduces the vessels' time in the waiting queue. The developed terminal simulator allocates all resources dynamically based on the number of containers to be loaded and unloaded at the quay side and the yard storage area. The results of the simulation tool show good dynamic allocation of transport vehicles. Dynamic resource allocation helps to minimize the congestion and bottlenecks that may occur at CTs. The results of three experiments show that berth and agent allocation is improved and that vessel service time at the berth side is reduced, which automatically reduces the vessels' waiting time in the queue. In addition, transport vehicle and YC allocation are assigned dynamically according to the number of containers in the vessels. The terminal simulator helps managers analyze the simulated results and make better decisions.
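The effect of FCFS berth allocation on vessel waiting time is easy to reproduce in miniature. A simplified sketch, far simpler than the agent-based simulator described above: vessels are reduced to (arrival, service) pairs and berths to their next-free times.

```python
import heapq

def fcfs_berth_waits(arrivals, service_times, n_berths=2):
    """Average vessel waiting time under first-come-first-served berth
    allocation. `arrivals` must be sorted in arrival order; each berth is
    modelled only by the time at which it next becomes free."""
    free_at = [0.0] * n_berths          # next time each berth becomes free
    heapq.heapify(free_at)
    waits = []
    for arrive, service in zip(arrivals, service_times):
        berth_free = heapq.heappop(free_at)   # earliest available berth
        start = max(arrive, berth_free)       # wait if no berth is free yet
        waits.append(start - arrive)
        heapq.heappush(free_at, start + service)
    return sum(waits) / len(waits)
```

With four vessels arriving at times 0, 0, 1, 2 and needing 4 time units each, two berths give an average wait of 1.25, while four berths eliminate queueing entirely, which mirrors the berth-utilisation effect the experiments measure.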

  • 99.
    Azam, Muhammad
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Ahmad, Luqman
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    A Comparative Evaluation of Usability for the iPhone and iPad2011Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [sv]

    The aim of this study was to evaluate and measure the comparative usability performance of the iPhone and iPad. The primary focus of the study was to find the usability issues regarding both devices. Several different methods were used to investigate the answers to the predefined research questions: a literature review, a survey study and an empirical usability testing experiment. In the literature review, the authors studied different usability issues such as inconsistency in applications, application crashes and accidental errors; a detailed list of the issues is presented in Table 2. The survey responses validated these issues and highlighted some additional ones, such as low battery time, Bluetooth connectivity problems, wireless connectivity problems, weak signal strength and missing help. Two participants commented on the absence of a swyping feature; according to them, if the iPhone and iPad had such a feature, their typing performance would improve. A survey study was conducted to evaluate and measure the significance of the four main parts of the devices, i.e. system, touch screen, keypad and applications. Each part contained multiple related statements that explored different usability aspects; the details and results of each statement are given in Table 4. After the statistical analysis, the authors did not find a significant difference between the statements regarding the iPhone and iPad apart from four statements. In two statements (finding a new application in the Apple store and replacing the content's location on the interface) the users agreed that the iPad's performance was better than the iPhone's. However, in the other two statements (use of the system in sunlight and zooming gestures) they preferred the iPhone over the iPad. This means that 90.47% of the users reported the same issues on the iPhone and iPad, with no difference in their preferences between the devices; 4.76% of the users preferred the iPhone and 4.46% preferred the iPad under the conditions mentioned above.
    The empirical usability testing experiment focused on studying the usability performance of the iPhone and iPad across three target groups; the performance comparisons were conducted by comparing the different participant groups and both devices. In total, 60 users participated in the empirical usability performance testing study. The selection of participants was based on their earlier experience with mobile phone usage; details of the participants are given in Figure 13. The comparison results for the participant groups (Table 7), the devices (Table 8) and the errors on both devices (Table 9) showed significant differences in performance. The comparison of novice versus experienced users across the iPhone (all six tasks) and the iPad (the first five tasks) showed that the experienced users performed faster than the novice users, with a lower error rate. The comparison of novice versus elderly users showed that the novice users performed faster, with a lower error rate, on the iPhone (the first four tasks) and on the iPad (Facebook login, location close view, new note and Evernote logout). Similarly, the comparison of experienced versus elderly users showed that the experienced users performed all six tasks faster than the elderly users, with a lower error rate. Based on these findings, the authors conclude that the experienced users' performance was better than that of the other two groups on both devices, and that someone who can use the iPhone could easily use the iPad.
    In the device comparison results, the performance of the iPad was better than the iPhone's, with a lower error rate, across the novice users (in the tasks Facebook login, send a message, location close view, new note and Evernote logout), the experienced users (in the tasks Facebook login, send a message, location close view and new note) and the elderly users (the first five tasks). Based on these findings from the empirical usability testing, the authors conclude that in the controlled environment the performance of the iPad was better than that of the iPhone. The three participant groups performed each task faster on the iPad than on the iPhone apart from one task, location identification, where the two devices performed the same for the novice and experienced users while the iPad performed better for the elderly users. Table 10 presents the satisfaction levels of the participants in the controlled environment for both devices, acquired through the post-test questionnaire during the experiment. The results for the novice users showed that their satisfaction level was higher for the iPad than for the iPhone, while the experienced users' satisfaction levels were the same for both devices, although they seemed more satisfied with the keypad performance on the iPad than on the iPhone.

  • 100.
    Azam, Muhammad
    et al.
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    Hussain, Izhar
    Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation.
    The Role of Interoperability in eHealth2009Självständigt arbete på avancerad nivå (masterexamen)Studentuppsats (Examensarbete)
    Abstract [en]

    The lack of interoperability in systems and services has long been recognized as one of the major challenges to the wider implementation of eHealth applications. The opportunities and potential benefits of achieving interoperability are considerable, whereas various barriers and challenges act as impediments. The purpose of this study was to investigate interoperability among different health care organizations. The knowledge from this study should help health care organizations understand their interoperability problems. In the first phase, a literature review identified interoperability challenges in Sweden and other EU countries. On the basis of the findings, interviews were conducted to learn about health care organizations' strategies and plans regarding interoperability. After analyzing the interviews, questionnaires were administered to gather the opinions of different medical IT administrators and health professionals. From the analysis of the interviews and questionnaires, the authors find that adopting an eHealth standard, a common system, a common medical language and measures ensuring the security of patients' health record information could be implemented in the health care organizations of Sweden and other EU countries.
