51 - 100 of 1407
  • 51.
    Appuni, Bala Satish
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Vemasani, Vamsi Krishna
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Performance Evaluation of Power Control Algorithms in Cellular Radio Communication Systems, 2004, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Radio resources in wireless communication systems, implementing different multiple access techniques, must be wisely managed. This is pivotal since variations in the propagation channel are very fast and the system is highly complex due to the random, unpredictable and continuous movement of mobile users. This complexity in the cellular system periodically contributes to different interference levels, high or low, resulting in the degradation of system capacity. Transmitter power control is an efficient technique to mitigate the effect of interference under fading conditions, combat the near-far problem and conserve battery life. Thus, an effective implementation of different power control algorithms in cellular radio communication systems can offer a significant improvement in the Quality of Service (QoS) for all users. The choice of an appropriate power control algorithm is of prime importance, as it should aim at increasing the overall efficiency of the system. In this thesis different distributed power control algorithms, each suited for implementation under different cellular technologies, were studied extensively. Specifically, six distributed power control algorithms are compared through simulations on the basis of performance metrics such as Carrier-to-Interference Ratio (CIR) and outage for the downlink case. The work involves finding the link gain matrix by modeling the cellular system in MATLAB and simulating the different power control algorithms. The results obtained from the simulations are used to evaluate the efficiency of the Distributed Power Control (DPC), Fully Distributed Power Control (FDPC), Improved Fully Distributed Power Control (FDPC+) and Balanced Distributed Power Control (BDPC) algorithms on the basis of convergence speed, while at the same time evaluating the limitations of the different algorithms. Also, from the results of the outage comparison between the Fixed Step Power Control (FSPC) and Augmented Constant Improvement Power Control (ACIPC) algorithms, the quality of the active link protection and cell removal procedures is demonstrated.
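
The iterative scheme underlying several of the algorithms above can be sketched as follows. This is a minimal illustration of the classic distributed power control update (power scaled by the target-to-achieved CIR ratio), with a small hypothetical 3-user link gain matrix G and noise power nu; it is not the thesis's MATLAB code.

```python
# A minimal sketch (assumed values, not the thesis's MATLAB code) of the
# classic distributed power control iteration: each transmitter scales its
# power by the ratio of the target CIR to the CIR it currently achieves.

def cir(G, p, nu, i):
    """Carrier-to-Interference Ratio of user i for power vector p."""
    interference = sum(G[i][j] * p[j] for j in range(len(p)) if j != i) + nu
    return G[i][i] * p[i] / interference

def dpc_step(G, p, nu, target_cir):
    """One synchronous DPC update: p_i <- (target_cir / CIR_i) * p_i."""
    return [target_cir / cir(G, p, nu, i) * p[i] for i in range(len(p))]

G = [[1.0, 0.1, 0.1],   # diagonal: own-link gains; off-diagonal: interference gains
     [0.1, 1.0, 0.1],
     [0.1, 0.1, 1.0]]
nu = 0.01               # receiver noise power
p = [1.0, 1.0, 1.0]     # initial transmit powers
for _ in range(50):
    p = dpc_step(G, p, nu, target_cir=2.0)
# After the loop every user attains (approximately) the target CIR of 2.0.
```

When the target CIR is feasible, as in this weakly coupled example, the iteration converges geometrically; an infeasible target drives the powers to diverge, which is where outage and cell removal procedures such as those in FSPC/ACIPC come in.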

  • 52.
    Arenas, Miguel Tames
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Social Engineering and Internal Threats in Organizations, 2008, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Organizations are taking computer security more seriously every day, investing huge amounts of money in creating stronger defenses including firewalls, anti-virus software, biometrics and identity access badges. These measures have made the business world more effective at blocking threats from the outside, and made it increasingly difficult for hackers or viruses to penetrate systems. But there are still threats that put organizations at risk, and these threats do not necessarily come from external attackers. In this paper we analyze what the internal threats in organizations are, why we are vulnerable, and the best methods to protect our organizations from inside threats.

  • 53.
    Areskoug, Andreas
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Jämförelse av J2EE och .NET från ett Web Services perspektiv [Comparison of J2EE and .NET from a Web Services perspective], 2006, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    This thesis compares the performance of Web Services when hosted on either the J2EE or the .NET platform. It investigates which platform should be chosen to host Web Services, based mainly on performance.

  • 54.
    Arikenbi, Temitayo
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Decision Support for Multi-Criteria Energy Generation Problem, 2008, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    In this study, an attempt is made to apply Decision Support Systems (DSS) in planning for the expansion of energy generation infrastructure in Nigeria. There is an increasing demand for energy in that country, and the study tries to show that DSS modelling, using A Mathematical Programming Language (AMPL) as the modelling tool, can offer satisficing results that would be a good decision support resource when deciding how to direct investment in energy generation.

  • 55.
    Arlos, Patrik
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    On the Quality of Computer Network Measurements, 2005, Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Due to the complex diversity of contemporary Internet services, computer network measurements have gained considerable interest in recent years, since they supply network research, development and operations with data important for network traffic modelling, performance and trend analysis, etc. The quality of these measurements affects the results of these activities and thus the perception of the network and its services. This thesis contains a systematic investigation of computer network measurements and a comprehensive overview of factors influencing the quality of performance parameters obtained from computer network measurements. This is done using a novel network performance framework consisting of four modules: Generation, Measurement, Analysis and Visualization. These modules cover all major aspects controlling the quality of computer network measurements and thus the validity of all kinds of conclusions based on them. One major source of error is the timestamp accuracy obtained from measurement hardware and software. Therefore, a method is presented that estimates the timestamp accuracy obtained from measurement hardware and software. The method has been used to evaluate the timestamp accuracy of some commonly used hardware (Agilent J6800/J6830A and Endace DAG 3.5E) and software (Packet Capture Library). Furthermore, the influence of analysis on the quality of performance parameters is discussed. An example demonstrates how the quality of a performance metric (bitrate) is affected by different measurement tools and analysis methods. The thesis also contains performance evaluations of traffic generators, of how accurately application-level measurements describe network behaviour, and of the quality of performance parameters obtained from PING and J-OWAMP. The major conclusion is that measurement systems and tools must be calibrated, verified and validated for the task of interest before using them for computer network measurements. A guideline is presented on how to obtain performance parameters at a desired quality level.
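
The bitrate example mentioned above can be illustrated with a short sketch: the same packet trace reports different bitrates depending on the averaging window chosen in the analysis step. The trace and window sizes below are hypothetical, not the thesis's data.

```python
# A minimal sketch (assumed trace) showing that "the" bitrate of a packet
# trace depends on the analysis: the same (timestamp, size) records give
# different values for different averaging windows.
# Timestamps are integer microseconds to keep the binning exact.

def bitrate_bps(trace, window_us, t0_us, t1_us):
    """Per-window average bitrate [bit/s] of (time_us, size_bytes) records."""
    n_bins = (t1_us - t0_us) // window_us
    bins = [0] * n_bins
    for t_us, size_bytes in trace:
        if t0_us <= t_us < t1_us:
            bins[(t_us - t0_us) // window_us] += size_bytes * 8
    return [bits * 1_000_000 // window_us for bits in bins]

# Hypothetical trace: a dense burst of 1500-byte packets in the first half
# second, then a trickle of five packets in the second half.
trace = ([(i * 10_000, 1500) for i in range(50)]
         + [(500_000 + i * 100_000, 1500) for i in range(5)])
coarse = bitrate_bps(trace, 1_000_000, 0, 1_000_000)  # one 1 s window
fine = bitrate_bps(trace, 100_000, 0, 1_000_000)      # ten 100 ms windows
# coarse reports a single 660 kbit/s average; fine reveals 1.2 Mbit/s bursts.
```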

  • 56.
    Arlos, Patrik
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Fiedler, Markus
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    A Method to Estimate the Timestamp Accuracy of Measurement Hardware and Software Tools, 2007, Conference paper (Refereed)
    Abstract [en]

    Due to the complex diversity of contemporary Internet applications, computer network measurements have gained considerable interest in recent years. Since they supply network research, development and operations with data important for network traffic modelling, performance and trend analysis etc., the quality of these measurements affects the results of these activities and thus the perception of the network and its services. One major source of error is the timestamp accuracy obtained from measurement hardware and software. On this background, we present a method that can estimate the timestamp accuracy obtained from measurement hardware and software. The method is used to evaluate the timestamp accuracy of some commonly used measurement hardware and software. Results are presented for the Agilent J6800/J6830A measurement system, the Endace DAG 3.5E card, the Packet Capture Library (PCAP) either with PF_RING or Memory Mapping, and a RAW socket using either the kernel PDU timestamp (ioctl) or the CPU counter (TSC) to obtain timestamps.

  • 57. Arlos, Patrik
    et al.
    Fiedler, Markus
    Nilsson, Arne A.
    A Distributed Passive Measurement Infrastructure, 2005, Conference paper (Refereed)
    Abstract [en]

    In this paper we describe a distributed passive measurement infrastructure. Its goals are to reduce the cost and configuration effort per measurement. The infrastructure is scalable with regard to link speeds and measurement locations. A prototype is currently deployed at our university and a demo is online at http://inga.its.bth.se/projects/dpmi. The infrastructure differentiates between measurements and the analysis of measurements; this way the actual measurement equipment can focus on the practical issues of packet measurements. By using a modular approach the infrastructure can handle many different capturing devices. The infrastructure can also deal with the security and privacy aspects that might arise during measurements.

  • 58.
    Asim, Muhammad Ahsan
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Network Testing in a Testbed Simulator using Combinatorial Structures, 2008, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    This report covers one of the most demanding issues for network users, i.e. network testing. Network testing in this study is about the performance evaluation of networks, gradually applying traffic load to determine the queuing delay for different traffic types. Testing of such operations is becoming complex and necessary due to the use of real-time applications such as voice and video traffic, in parallel with the elastic data of ordinary applications, over WAN links. Large volumes of elastic data occupy almost 80% of the resources and cause delay for time-sensitive traffic. Performance parameters like service outage, delay, packet loss and jitter are tested to assure the reliability of the Quality of Service (QoS) committed in Service Level Agreements (SLAs). Normally these network services are tested after deployment of the physical networks. In this case customers often have to experience unavailability (outage) of network services due to increased levels of load and stress. From a user-centric point of view these outages are violations and must be avoided at the net-centric end. In order to meet these challenges, network SLAs are tested on simulators in a lab environment. This study provides a solution to this problem in the form of a testbed simulator named the Combinatorial TestBed Simulator (CTBS). A prototype of this simulator was developed for conducting the experiment. It provides a systematic approach of combinatorial structures for finding traffic patterns that exceed the queuing-delay limit committed in SLAs. Combinatorics is a branch of mathematics that deals with discrete and normally finite elements. In the design of CTBS, the technique of combinatorics is used to generate a variety of test data that cannot be generated manually for testing the given network scenario. To validate the design of CTBS, results obtained by pilot runs are compared with results calculated using a timeline. After validation of the CTBS design, the actual experiment is conducted to determine the set of traffic patterns that exceeds the threshold value of queuing delay for Voice over Internet Protocol (VoIP) traffic.
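
The combinatorial generation of test data that such a simulator relies on can be sketched as a simple enumeration: every combination of per-class load levels becomes one test case. The traffic classes, load steps and capacity below are assumed for illustration; they are not the values of the actual experiment.

```python
# A minimal sketch (hypothetical parameters, not the CTBS tool itself) of
# using combinatorial structures to enumerate traffic-load patterns: the
# full Cartesian product of load levels is impractical to write by hand.
from itertools import product

load_levels_mbps = {
    "voice": [1, 2, 4],     # assumed load steps per traffic class
    "video": [5, 10, 20],
    "data":  [10, 40, 80],
}
link_capacity_mbps = 100

classes = sorted(load_levels_mbps)
patterns = [dict(zip(classes, combo))
            for combo in product(*(load_levels_mbps[c] for c in classes))]

# Patterns whose offered load exceeds, say, 90% of capacity are the ones
# most likely to violate an SLA queuing-delay bound.
stress = [p for p in patterns if sum(p.values()) > 0.9 * link_capacity_mbps]
```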

  • 59.
    Aslam, Khurum
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Khurum, Mahvish
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    A Model for Early Requirements Triage and Selection Utilizing Product Strategies, 2007, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    In market-driven product development, large numbers of requirements flow in continuously. It is critical for product management to select the requirements aligned with overall business goals and discard others as early as possible. It has been suggested in the literature to utilize product strategies for early requirements triage and selection. However, no explicit method/model/framework has been suggested as to how to do it. This thesis presents a model for early requirements triage and selection utilizing product strategies, based on a literature study and interviews with people at two organizations about their requirements triage and selection processes and product strategies formulation. The model is validated statically within the same two organizations.

  • 60. Aspvall, Bengt
    et al.
    Pettersson, Eva
    Från datorernas värld [From the world of computers], 2007, In: Nämnaren, ISSN 0348-2723, Vol. 34, no 2, p. 44-48, Article in journal (Refereed)
  • 61.
    Asteborg, Marcus
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Svanberg, Niklas
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Implementation Considerations for Active Noise Control in Ventilation Systems, 2006, Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    The most common method to attenuate noise in ventilation systems today is passive silencers. For these to efficiently attenuate frequencies below 400 Hz, such silencers need to be large; a neater solution for attenuating low frequencies is active noise control (ANC). The use of ANC in ventilation systems is well known and there are several commercial products available. ANC is not, however, widely used, due to its often high price and poor performance. Since price is an important factor in ANC systems, the expensive laboratory filters and the amplifier currently used in the experimental setup at Blekinge Institute of Technology (BTH) need to be replaced with cheaper ones, but without too much performance loss. For easier implementation in ventilation systems the placement of the reference microphone is important: the shorter the distance from the anti-noise loudspeaker, the easier the ANC system is to implement. But if the distance is so small that the ANC system is no longer causal, performance will be degraded; performance will also suffer if the reference microphone is close enough to pick up acoustic feedback from the anti-noise loudspeaker. In this thesis the expensive laboratory filters are exchanged for cheaper alternatives, power and total harmonic distortion (THD) measurements are made on the amplifier driving the loudspeaker, and the reference microphone's position is investigated through measurements of the group delay of the system and the acoustic feedback between the loudspeaker and the reference microphone.
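
The adaptive filtering at the heart of such an ANC system can be sketched with a plain LMS loop. This is an idealized illustration, not the BTH setup: the secondary path is assumed ideal (no delay or feedback), which hides exactly the causality and acoustic-feedback issues discussed above.

```python
# A minimal sketch (idealized, not the BTH experimental setup) of the LMS
# adaptive filter at the core of feedforward ANC: the filter drives the
# anti-noise output so that the residual at the error microphone goes to zero.
import math

def lms_anc(reference, disturbance, taps=8, mu=0.01):
    w = [0.0] * taps            # adaptive filter weights
    buf = [0.0] * taps          # delay line of reference samples
    errors = []
    for x, d in zip(reference, disturbance):
        buf = [x] + buf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, buf))        # anti-noise signal
        e = d - y                                         # residual noise
        w = [wi + mu * e * xi for wi, xi in zip(w, buf)]  # LMS weight update
        errors.append(e)
    return errors

# A 100 Hz duct tone sampled at 8 kHz; reference and disturbance coincide
# in this idealized case, so the optimal filter is a pure pass-through.
tone = [math.sin(2 * math.pi * 100 * t / 8000) for t in range(4000)]
errors = lms_anc(tone, tone)
# The residual decays from the full tone amplitude towards zero.
```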

  • 62. Aurum, Aybüke
    et al.
    Wohlin, Claes
    A Value-Based Approach in Requirements Engineering: Explaining Some of the Fundamental Concepts, 2007, Conference paper (Refereed)
  • 63. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Aligning Requirements with Business Objectives: A Framework for Requirements Engineering Decisions, 2005, Conference paper (Refereed)
    Abstract [en]

    As software development continues to increase in complexity, involving far-reaching consequences, there is a need for decision support to improve the decision-making process in requirements engineering (RE) activities. This research begins with a detailed investigation of the complexity of decision making during RE activities at the organizational, product and project levels. It then presents a conceptual model which describes the RE decision-making environment in terms of stakeholders, information requirements, decision types and business objectives. The purpose of this model is to facilitate the development of decision support systems in RE and to help further structure and analyse the decision-making process in RE.

  • 64. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Petersson, Håkan
    Increasing the Understanding of Effectiveness in Software Inspections Using Published Data Sets, 2005, In: Journal of Research and Practice in Information Technology, ISSN 1443-458X, Vol. 37, no 3, p. 51-64, Article in journal (Refereed)
    Abstract [en]

    Since its inception into software engineering, software inspection has been viewed as a cost-effective way of increasing software quality. Despite this, many questions remain unanswered regarding, for example, ideal team size or cost-effectiveness. This paper addresses some of these questions by performing an analysis using 30 published data sets from empirical experiments on software inspections. The main question is concerned with determining a suitable team size for software inspections. The effectiveness of different team sizes is also studied. Furthermore, the differences in mean effectiveness between different team sizes are investigated based on the inspection environmental context, document types and reading technique. It is concluded that it is possible to choose a suitable team size based on the effectiveness of inspections. This can be used as a tool to assist in the planning of inspections. A particularly interesting result is that variation in the effectiveness between different teams is considerably higher for certain types of documents than for others. Our findings contain important information for anyone planning, controlling or managing software inspections.
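
The diminishing returns of larger inspection teams can be illustrated with a simple independence model (an illustrative assumption, not the paper's 30 data sets): if each reviewer finds a given defect with probability p, a team of n finds it with probability 1 - (1 - p)^n, so each added reviewer contributes less than the previous one.

```python
# An illustrative independence model (not the paper's data) of inspection-team
# effectiveness versus team size.

def team_effectiveness(p, n):
    """Probability that at least one of n independent reviewers finds a defect."""
    return 1 - (1 - p) ** n

p = 0.35  # assumed single-reviewer effectiveness
gains = [team_effectiveness(p, n) for n in range(1, 7)]  # teams of 1..6
marginal = [b - a for a, b in zip(gains, gains[1:])]     # gain per added reviewer
# Effectiveness rises with team size while each added reviewer contributes
# strictly less than the previous one.
```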

  • 65. Aurum, Aybüke
    et al.
    Wohlin, Claes
    Porter, A.
    Aligning Software Project Decisions: A Case Study, 2006, In: International Journal of Software Engineering and Knowledge Engineering, ISSN 0218-1940, Vol. 16, no 6, p. 795-818, Article in journal (Refereed)
  • 66.
    Awan, Rashid
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Requirements Engineering Process Maturity Model for Market Driven Projects: The REPM-M Model, 2005, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Many software projects run over budget or face failures during operation. One major reason for this is that software companies develop the wrong software due to misinterpretation of requirements. Requirements engineering (RE) is the well-known discipline within software engineering that deals with this problem: the process of eliciting, analyzing and specifying requirements so that there is no ambiguity between the development company and the customers. An emerging sub-discipline is requirements engineering for market-driven projects, which deals with the requirements engineering of a product targeting a mass market. In this thesis, a maturity model is developed which can be used to assess the maturity of the requirements engineering process for market-driven projects. The objective of this model is to provide a quick assessment tool through which a company can learn the strengths and weaknesses of its requirements engineering process.

  • 67.
    Awan, Zafar Iqbal
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Azim, Abdul
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Network Emulation, Pattern Based Traffic Shaping and KauNET Evaluation, 2008, Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Quality of Service is a major factor for a successful business in modern and future network services. Assuring a minimum service level maintains the Quality of Experience of modern real-time communication, and with it user satisfaction with the perceived service quality. Traffic engineering can be applied to provide better services, maintaining or enhancing user satisfaction through reactive and preventive traffic control mechanisms. Preventive traffic control can be more effective at managing network resources, through admission control, scheduling, policing and traffic shaping mechanisms, by maintaining a minimum level before conditions worsen and affect user perception. Accuracy, dynamicity, uniformity and reproducibility are the objectives of extensive research on network traffic, and are tested through real-time tests, simulation and network emulation. Network emulation is performed over an experimental network to test real-time applications, protocols and traffic parameters. DummyNet is a network emulator and traffic shaper which allows nondeterministic placement of packet losses, delays and bandwidth changes. The KauNet shaper is a network emulator which creates traffic patterns and applies them for exact, deterministic placement of bit errors, packet losses, delay changes and bandwidth changes. An evaluation of KauNet with different patterns for packet losses, delay changes and bandwidth changes in an emulated environment is part of this work. The main motivation for this work is to check the possibility of delaying and dropping the packets of a transfer/session in the same way as happened before (during the observation period). This goal is achieved to some extent using KauNet, but some issues with pattern repetition still need to be solved to get better results. The idea of history- and trace-based traffic shaping using KauNet is presented to make this possibility a reality.
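
The difference between probability-driven and pattern-driven loss placement can be sketched as follows; the packet counts and loss positions are hypothetical, and this is not KauNet's implementation.

```python
# A minimal sketch (not KauNet's implementation) contrasting probability-driven
# loss (DummyNet-style, nondeterministic placement) with pattern-driven loss
# (KauNet-style, exact deterministic placement).
import random

def probabilistic_drop(n_packets, loss_rate, seed=None):
    """Each packet dropped independently with probability loss_rate."""
    rng = random.Random(seed)
    return [rng.random() < loss_rate for _ in range(n_packets)]

def pattern_drop(n_packets, loss_positions):
    """Drop exactly the packets at the given 0-based positions."""
    drops = set(loss_positions)
    return [i in drops for i in range(n_packets)]

# The same pattern reproduces the same placement on every run ...
a = pattern_drop(10, [2, 5, 6])
b = pattern_drop(10, [2, 5, 6])
# ... which is what makes a previously observed loss trace replayable
# against a later transfer/session.
```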

  • 68.
    AWONIYI, OLUWASEYI
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    STRATOSPHERIC CHANNEL MODELLING, 2007, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    High Altitude Platform Stations (HAPs) are communication facilities situated at an altitude of 17 to 30 km and at a specified, nominal, fixed point relative to the Earth. They are mostly solar-powered, unmanned, and remotely operated. These platforms have the capability of carrying a multipurpose communications relay payload, which could be in the form of a full base station or, in some cases, a simple transponder as used in satellite communication systems. HAPs, when fully deployed, will have the capability of providing services and applications ranging from broadband wireless access, navigation and positioning systems, remote sensing and weather observation/monitoring systems, to future-generation mobile telephony. HAPs are also known to be low-cost to implement and are expected to be the next big provider of infrastructure for wireless communications. There is a lot of ongoing and exciting research into various aspects of this emergent technology. For radio engineers, the need to predict channel quality and evaluate the performance of such stratospheric propagation has generated quite a few models, although some of the models under consideration come from existing terrestrial and satellite communications, which are in some ways related to this new technology. This thesis provides some insight into this new aspect of wireless communications in terms of the need for a new system, its benefits, challenges, services provided and applications supported. Existing models already researched and developed for HAPs are reviewed; one of them is examined in depth with regard to propagation and channel efficiency. The analysis of the chosen model is presented using one of the performance tests for channel models, the bit error rate (BER).
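
The BER performance test mentioned above can be sketched with a generic Monte Carlo simulation of BPSK over an AWGN channel; this is a textbook baseline, not the thesis's HAP channel model.

```python
# A minimal sketch (generic BPSK-over-AWGN baseline, not the thesis's HAP
# channel model) of estimating bit error rate by Monte Carlo simulation.
import math
import random

def ber_bpsk_awgn(ebn0_db, n_bits=200_000, seed=1):
    rng = random.Random(seed)
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))   # noise std dev for unit-energy bits
    errors = 0
    for _ in range(n_bits):
        symbol = rng.choice((-1.0, 1.0))            # BPSK symbol
        received = symbol + rng.gauss(0.0, sigma)   # AWGN channel
        errors += (received > 0) != (symbol > 0)    # hard decision
    return errors / n_bits

ber_low = ber_bpsk_awgn(0.0)   # near the theoretical Q(sqrt(2)), about 0.08
ber_high = ber_bpsk_awgn(8.0)  # orders of magnitude lower
```

A channel model under test would replace the plain Gaussian term with the model's fading and shadowing statistics, and the resulting BER curves would be compared against the AWGN baseline.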

  • 69.
    Axelsson, Elinor
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Ip-telefoni med Skype som ett alternativ till PSTN för privatanvändare [IP telephony with Skype as an alternative to PSTN for private users], 2007, Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    This work is a practical and theoretical test of IP telephony with Skype, forming the basis for a comparison with telephony over PSTN (Public Switched Telephone Network), the common telephone standard most of us use today. The purpose of the work is to ease the choice between PSTN and IP telephony for private users in Sweden. The work is intended to answer the following questions: How easy is it to get started with IP telephony via Skype? How does the quality of IP telephony calls compare with PSTN? Do all the services available with PSTN also work with IP telephony? How usable and accessible are help and support for IP telephony? Is it cheaper to call with IP telephony, and if so, under what conditions? A set of practical and theoretical investigations was carried out to assess IP telephony with Skype in the following areas: installation, functionality, quality, usability, costs, availability and security. For the usability investigation, a test group of 10 people was used to evaluate the system's usability. A practical test of the Skype client's functionality and quality was performed through a number of test calls. The availability, costs and security of the Skype solution were studied in relevant literature and through facts on the Internet. The results of the investigation show that the Skype solution works as well as PSTN with respect to functionality and quality, but some computer experience is required to install and use it, which has a somewhat negative impact on usability. Price-wise it only pays off for those who make many international calls; for other users it is usually considerably more expensive than PSTN telephony. Skype itself clearly states that it does not guarantee emergency-call functionality, which is a major drawback if one wants to replace a PSTN telephone with the Skype solution. For these reasons, IP telephony with Skype is not a good alternative to PSTN for most users.

  • 70.
    Axelsson, Mattias
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Sonesson, Johan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Business Process Performance Measurement for Rollout Success, 2004, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Business process improvement for increased product quality is of continuous importance in the software industry. Quality managers in this sector need effective, hands-on tools for decision-making in engineering projects and for rapidly spotting key improvement areas. Measurement programs are a widespread approach for introducing quality improvement in software processes, yet employing all-embracing state-of-the-art quality assurance models is labor intensive. Unfortunately, these do not primarily focus on measures, revealing a need for an instant and straightforward technique for identifying and defining measures in projects without the resources for, or need of, entire measurement programs. This thesis explores and compares prevailing quality assurance models that use measures, rendering the Measurement Discovery Process constructed from selected parts of the PSM and GQM techniques. The composed process is applied to an industrial project with the given prerequisites, providing a set of measures that are subsequently evaluated. In addition, the application gives a foundation for analysis of the Measurement Discovery Process. The application and analysis of the process show its general applicability to projects with similar constraints, as well as the importance of formal target processes and exhaustive project domain knowledge among measurement implementers. Even though the Measurement Discovery Process is subject to future refinement, it is clearly a step towards rapid delivery of tangible business performance indicators for process improvement.

  • 71.
    Ayoubi, Tarek
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Distributed Data Management Supporting Healthcare Workflow from Patients’ Point of View, 2007, Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    A patient's mobility throughout his lifetime leaves a trail of information scattered in laboratories, clinical institutes, primary care units, and other hospitals. Hence, the medical history of a patient is valuable when he is referred to specialized healthcare units or undergoes home care/personal care in elderly-stage cases. Despite the rhetoric about patient-centred care, few attempts have been made to measure and improve in this arena. In this thesis, we describe and implement a high-level view of patient-centric information management, deploying at a preliminary stage the use of agent technologies and Grid computing. We develop and propose an infrastructure that allows us to monitor and survey the patient from the doctor's point of view, and investigate a Persona, from the patient's side, that functions and collaborates among different medical information structures. The Persona attempts to interconnect all the major agents (human and software) and realize a distributed grid info-structure that directly affects the patient, thereby revealing an adequate and cost-effective solution for the most critical information needs. The results of the literature survey, consolidating healthcare information management with emerging intelligent multi-agent system (MAS) technologies and Grid computing, are intended to provide a solid basis for further advancements and assessments in this field, by proposing a bridging framework between the home-care sector and the flexible agent architecture throughout the healthcare domain.

  • 72. Baca, Dejan
    Automated static code analysis: A tool for early vulnerability detection, 2009, Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Software vulnerabilities are introduced into programs during their development. Architectural flaws are introduced during planning and design, while implementation faults are created during coding. Penetration testing is often used to detect these vulnerabilities. This approach is expensive because it is performed late in development, and any correction would increase lead time. An alternative is to detect and correct vulnerabilities in the phase of development where they are the least expensive to detect and correct. Source code audits have often been suggested and used to detect implementation vulnerabilities. However, manual audits are time consuming and require extensive expertise to be efficient. A static code analysis tool could achieve the same results as a manual audit but at a fraction of the time. Through a set of case studies and experiments at Ericsson AB, this thesis investigates the technical capabilities and limitations of using a static analysis tool as an early vulnerability detector. The investigation is extended to the human factor by examining how developers interact with and use the static analysis tool. The contributions of this thesis include the identification of the tool's capabilities, so that further security improvements can focus on other types of vulnerabilities. By using static analysis early in development, possible cost-saving measures are identified. Additionally, the thesis presents the limitations of static code analysis, the most important being the incorrect warnings reported by static analysis tools. In addition, a development process overhead was deemed necessary to successfully use static analysis in an industry setting.
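
The idea of static analysis as an early vulnerability detector can be sketched with a toy text scanner for known-dangerous C calls. The call list and messages below are illustrative; real tools (including the one studied) analyze parsed code and data flow, which is also where their incorrect warnings come from.

```python
# A toy sketch of static code analysis for security (not the commercial tool
# studied in the thesis): scan C source text for calls with well-known
# vulnerability potential, without executing the program.
import re

RISKY_CALLS = {
    "strcpy": "no bounds check; consider strncpy/strlcpy",
    "gets": "unbounded read; removed in C11",
    "sprintf": "no bounds check; consider snprintf",
}

def analyze(source):
    """Return (line number, call, reason) for each risky call found."""
    warnings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for call, why in RISKY_CALLS.items():
            if re.search(rf"\b{call}\s*\(", line):
                warnings.append((lineno, call, why))
    return warnings

code = 'int main(void) {\n  char b[8];\n  gets(b);\n  strcpy(b, "hi");\n}\n'
report = analyze(code)   # flags gets on line 3 and strcpy on line 4
```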

  • 73. Baca, Dejan
    et al.
    Carlsson, Bengt
    Lundberg, Lars
    Evaluating the Cost Reduction of Static Code Analysis for Software Security2008Conference paper (Refereed)
    Abstract [en]

    Automated static code analysis is an efficient technique to increase the quality of software during early development. This paper presents a case study in which mature software with known vulnerabilities is subjected to a static analysis tool. The value of the tool is estimated based on reported failures from customers. An average of 17% cost savings would have been possible if the static analysis tool had been used. The tool also had a 30% success rate in detecting known vulnerabilities, and at the same time found 59 new vulnerabilities in the three examined products.

  • 74. Bai, Guohua
    A Sustainable Information System for e-home Services2004Conference paper (Refereed)
    Abstract [en]

    E-home related home services (including homecare and home healthcare) are urgently needed in China. The population aged over 80 is increasing by 5% every year, and by 2050 one quarter of the whole population, or 0.4 billion people, in China will be elderly living at home. Meanwhile, the government cannot afford a national elderly-care system like those of most Western countries, such as Sweden. This is because China has had a one-child-per-family policy since the 1970s, and this radical policy has made China become an aged society very quickly, within only 20 years, while the same process took 40-80 years in Western countries. Even worse, China is becoming an aged society while the country is still poor and developing, with a GDP per capita of less than 1000 $, compared to the 5000-10000 $ of Western countries when they became aged societies. E-home provides China with a unique, and maybe the most effective, solution to this problem. By applying effective IT&C at home, elderly people can manage their own daily life. If needed, they can always call for help from a collective service centre located in their residential area, which can provide both homecare (cleaning, shopping, repairs, baby care, etc.) and home healthcare (legitimate medical care). Elderly people can also be monitored (if wished by all parties), both at home and outdoors, by wearing sensors that send signals directly to the related care providers (including their children and relatives, if wished). E-home will greatly increase the security of elderly people, relieve great worry from both their children and the elderly themselves, and can be afforded by most people. However, e-home is more than just a technical problem: designing an e-home system requires a systemic approach and a social-psychological study. In the end, an e-home system must provide the needed services to residents.
I will introduce the IMIS project 'Integrated Mobile Information System for Home Healthcare', financed by the Swedish Agency for Innovation Systems (VINNOVA). The project will continue until 2006, and one of its outputs will be a sustainable software platform based on a systemic study of the social-psychological factors involved in home healthcare. I will share some Swedish experiences and the so-called 'Scandinavian Approach' to conducting such complex systems with my colleagues in China, and I hope the IMIS project will also be developed in China, based upon feasibility and desirability studies with Chinese colleagues.

  • 75. Bai, Guohua
    Integrerade Nätverk i Hemsjukvården För Personer Med Diabetes (IMIS)2005Conference paper (Refereed)
    Abstract [en]

    The presentation is an IMIS project report to VINNOVA's programme conference 'IT i Hemsjukvård'.

  • 76. Bai, Guohua
    Integrerade nätverk i hemsjukvården för personer med diabetes (IMIS)2007Report (Other academic)
    Abstract [sv]

    IT for healthcare in the home ('IT för sjukvård i hemmet') is a programme within VINNOVA's unit for Services and IT Usage ('Tjänster och IT-användning').

  • 77. Bai, Guohua
    et al.
    Malmqvist, Gustav
    Guide to REgional Good Practice eHealth2007Report (Other academic)
    Abstract [en]

    This report presents the results of the work of the IANIS+ eHealth working group (WG). The WG has collected regional eHealth experiences from around Europe through a number of activities:
    • Regional eHealth case studies, of which 17 (from 15 regions) are presented in this report
    • Four joint meetings of the group, of which one was a policy seminar with invited guests from the EU Commission, relevant organisations in the field of eHealth, and regional authorities
    • A meeting with the European Commission DG Information Society & Media, Unit H1 eHealth
    • Collaboration with the eHealth network within the Assembly of European Regions (AER)
    • Attendance at recent major eHealth conferences:
      · Personal Health Systems, arranged by the European Commission when launching the eHealth part of the 7th Framework Programme, 11-12 February 2007
      · The EU-US eHealth Policy Workshop, 10 May 2007
      · The final conference of the INTERREG IIIB project Baltic eHealth, 21-22 May 2007
    • eHealth seminars at the IANIS+ annual conferences in Blekinge 2006 and Bilbao 2007
    The innovation perspective of eHealth in the regions has been the focus of the IANIS+ eHealth WG. Regional diversity regarding strategies, policies, and action plans for eHealth can act as a driving factor for successful eHealth projects, but it also leads to challenges for interoperability, standardisation, integrity, and security. It is important to learn from others, whether about how to choose the right technology or what methods to use for implementation. Depending on the area of eHealth, there are numerous projects and up-and-running services from which we can learn, and there are also many experiences from unsuccessful trials. Even if an eHealth solution has failed in one setting, it can be a success under different circumstances. The aim of the IANIS+ eHealth Working Group was to share experience between the regions belonging to the network and to bring up issues of good practice for regional eHealth implementation.
The projects brought up in the IANIS+ working group are projects in their own right, with pros and cons. They cover different perspectives and types of eHealth. Some were difficult to evaluate, while others provide valuable comparable experiences from different settings and circumstances. In any case, we can learn something from all the cases, as examples from reality and as a complement to formal evaluations and scientific studies of eHealth. We would rather use the term good practice than best practice: there is always something good to learn from others, while there is hardly any best practice that works under every circumstance.

  • 78. Bai, Guohua
    et al.
    Zhang, Peng
    Developing a Semantic Web Services for Interoperability2005Conference paper (Refereed)
    Abstract [en]

    This paper deals with design problems in healthcare information systems: how to adapt to dynamic factors and diversities so as to achieve interoperability between healthcare actors. These dynamic factors and diversities are caused by, for example, constant suggestions from users, changes in the applied technologies, and stepwise progress as participants learn. We suggest applying Web services technology, especially the idea of the Semantic Web, to tackle the problem of diversity in the healthcare sector. To tackle the problem of dynamic changes during the design process, we suggest an evolutionary design methodology based upon an organic development metaphor, the embryonic prototyping approach (EmA). We demonstrate our idea through the IMIS project, which designs a semantic Web service for diabetic healthcare.

  • 79. Bai, Guohua
    et al.
    Zhang, Peng
    Functions For eHealth Communication Systems Design2009Report (Other academic)
    Abstract [en]

    The design of eHealth systems calls for a clear needs analysis and functions based on the users’ perspective. However, this is often the most difficult part of designing eHealth service systems, because of the diversity of users and diversified needs. This paper provides a general mapping and overview of the basic functions of eHealth systems identified through two studies: (1) a comprehensive study of the resource book of eHealth projects under the European Sixth Framework programme (FP6), and (2) a theoretical study based on the framework of Activity Theory. The first study provides the eHealth system designers with an overview of a number of important and particular functions that the users/citizens need in order to get good services; while the second study presents a fundamental ‘skeleton’ and architecture for providing those particular functions in an orderly and coherent system.

  • 80.
    Baig, Imran
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Measuring Cohesion and Coupling of Object-Oriented Systems Derivation and Mutual Study of Cohesion and Coupling2004Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Cohesion and coupling are considered among the most important properties for evaluating the quality of a design. In the context of OO software development, cohesion means the relatedness of the public functionality of a class, whereas coupling stands for the degree of dependence of a class on other classes in an OO system. In this thesis, a new metric is proposed that measures class cohesion on the basis of the relative relatedness of the public methods to the overall public functionality of a class, using a new concept of a subset tree. A set of metrics is also proposed for measuring class coupling based on three types of UML relationships, namely association, inheritance and dependency. Reasonable metrics for cohesion and coupling are expected to share the same set of input data, which suggests that mutual relationships may exist between them. Research questions were formed based on these potential relationships, and an attempt is made to answer them with the help of an experiment on the OO system FileZilla. Mutual relationships between class cohesion and class coupling are analyzed statistically while also considering OO metrics for size and reuse. Relationships among the pairs of metrics are discussed, and conclusions are drawn in accordance with the observed correlation coefficients. A study of software evolution with the help of the class cohesion and class coupling metrics has also been performed, and the observed trends are analyzed.
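    The abstract leaves the subset-tree construction to the thesis itself. As a rough illustration of what a method-level cohesion measure computes, the sketch below scores a class by the fraction of public-method pairs that touch a common attribute (an LCOM-style share ratio, not the thesis's metric; the class and attribute names are made up):

    ```python
    from itertools import combinations

    def pairwise_cohesion(method_attrs):
        """Fraction of public-method pairs sharing at least one attribute."""
        pairs = list(combinations(method_attrs.values(), 2))
        if not pairs:  # classes with fewer than two methods are trivially cohesive
            return 1.0
        shared = sum(1 for a, b in pairs if set(a) & set(b))
        return shared / len(pairs)

    # Hypothetical classes: method name -> attributes the method uses
    stack_like = {"push": ["items"], "pop": ["items"], "size": ["items"]}
    mixed_bag = {"draw": ["canvas"], "save": ["path"], "tax": ["rate"]}

    print(pairwise_cohesion(stack_like))  # 1.0 (all methods related)
    print(pairwise_cohesion(mixed_bag))   # 0.0 (no shared state at all)
    ```

    A coupling metric in the same spirit would count, per class, the distinct classes reached through association, inheritance and dependency edges.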

  • 81.
    Bajwa, Ahmer Ali
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Awan, Junaid Anwar
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Performance study of IEEE 802.16d (Fixed WiMAX) Physical layer with Adaptive Antenna System2008Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    In this thesis work, the WiMAX (IEEE 802.16d) PHY layer with underlying OFDM technology and an optional feature called the Adaptive Antenna System (AAS) has been considered. The SUI-3 channel model (Rician fading) is used for creating fading phenomena, and an AAS has been deployed at the receiver module to reduce the fading effects caused by this channel. An AAS uses beamforming techniques to focus the wireless beam between the base station and the subscriber station. In this thesis, the transmitter (SS) and receiver (BS) are fixed, and the AAS installed at the receiver is used to direct the main beam towards the desired LOS signal and nulls towards the multipath signals. A pre-FFT beamformer based on the Least Mean Square (LMS) algorithm is used. Different aspects of the complete system model were investigated, such as adaptive modulation, the angle of arrival of the incoming signals, and the number of array elements. It has been demonstrated through MATLAB simulations that the performance of the system improves significantly with AAS, where beamforming is implemented in the direction of the desired user, and improves further as the number of antennas at the receiver increases.
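    As a rough sketch of the LMS beamforming idea (not the thesis's WiMAX simulation: the array geometry, angles, step size and signal levels here are all illustrative), the following trains a 4-element uniform linear array against a known pilot while an interferer arrives from 40 degrees:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_elem, n_snap, spacing = 4, 2000, 0.5   # half-wavelength ULA (illustrative)

    def steer(theta):
        """Array response of a uniform linear array toward angle theta (rad)."""
        return np.exp(2j * np.pi * spacing * np.arange(n_elem) * np.sin(theta))

    pilot = np.sign(rng.standard_normal(n_snap)).astype(complex)  # known training signal
    interf = rng.standard_normal(n_snap).astype(complex)
    x = (np.outer(steer(0.0), pilot)                      # desired user at broadside
         + 0.5 * np.outer(steer(np.deg2rad(40)), interf)  # interferer / multipath
         + 0.05 * (rng.standard_normal((n_elem, n_snap))
                   + 1j * rng.standard_normal((n_elem, n_snap))))  # sensor noise

    mu, w = 0.01, np.zeros(n_elem, complex)
    sq_err = np.empty(n_snap)
    for k in range(n_snap):                  # LMS update: w <- w + mu * x * conj(e)
        e = pilot[k] - np.vdot(w, x[:, k])   # error vs. known pilot (output y = w^H x)
        w += mu * x[:, k] * np.conj(e)
        sq_err[k] = abs(e) ** 2
    ```

    After convergence the array gain toward broadside exceeds the gain toward the interferer, i.e. the adapted weights form the beam/null pattern the abstract describes.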

  • 82.
    Bajwa, Sohaib-Shahid
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Investigating the Nature of Relationship between Software Size and Development Effort2008Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Software effort estimation still remains a challenging and debatable research area. Most software effort estimation models take software size as the base input. Among others, the Constructive Cost Model (COCOMO II) is a widely known effort estimation model. It uses Source Lines of Code (SLOC) as the software size to estimate effort. However, many problems arise when using SLOC as a size measure, due to its late availability in the software life cycle. Therefore, a lot of research has been going on to identify the nature of the relationship between software functional size and effort, since functional size can be measured very early, when the functional user requirements are available. Many other project-related factors were found to affect effort estimation based on software size; Application Type, Programming Language and Development Type are some of them. This thesis aims to investigate the nature of the relationship between software size and development effort. It explains known effort estimation models and gives an understanding of the Function Point and Functional Size Measurement (FSM) methods. Factors affecting the relationship between software size and development effort are also identified. In the end, an effort estimation model is developed after statistical analyses. We present the results of an empirical study which we conducted to investigate the significance of different project-related factors on the relationship between functional size and effort. We used the projects data in the International Software Benchmarking Standards Group (ISBSG) dataset. We selected the projects which were measured by utilizing the Common Software Measurement International Consortium (COSMIC) Function Points. For the statistical analyses, we performed stepwise Analysis of Variance (ANOVA) and Analysis of Co-Variance (ANCOVA) techniques to build the multi-variable models. We also performed Multiple Regression Analysis to formalize the relationship.
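    A minimal sketch of the size-effort relationship such models formalise, assuming a power-law form effort = a * size^b fitted by ordinary least squares on synthetic data (not the ISBSG dataset, and not the thesis's stepwise ANOVA/ANCOVA models):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    size = rng.uniform(50, 2000, 300)        # functional size in CFP (synthetic)
    effort = 8.0 * size ** 0.95 * rng.lognormal(sigma=0.2, size=300)  # person-hours

    # Linearise: log(effort) = log(a) + b * log(size), then solve by OLS
    X = np.column_stack([np.ones_like(size), np.log(size)])
    (log_a, b), *_ = np.linalg.lstsq(X, np.log(effort), rcond=None)
    a = np.exp(log_a)

    def predict_effort(s):
        """Predicted effort for a project of functional size s."""
        return a * s ** b
    ```

    Project factors such as application type or programming language would enter this model as extra (dummy-coded) columns of X, which is essentially what an ANCOVA on log effort does.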

  • 83.
    Bakht, Syed Sikandar
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Ahmad, Qazi Sohail
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    A Multi Agent Web Based Simulation Model for Evaluating Container Terminal Management2006Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    This thesis provides a software prototype of a Container Terminal (CT) Management system built with Multi-Agent Systems technology. The goal of this research work was to solve the management issues present in a CT. The software prototype can be implemented as simulation software that will help terminal managers take the decisions necessary for better CT productivity. CTs struggle with taking proper management decisions: many policies are implemented, but choosing the right policy at the right time is the main issue. With simulation software it is possible to visualize the effects of decisions taken under a given policy and see the expected output, which can really improve the performance of a CT. The management decision problem is addressed by modeling the whole CT in a computer modeling language. The prototype represents all the actors appearing in a CT as agents, and these agents are responsible for carrying out certain tasks. The prototype, along with a partial implementation, is the final contribution. The model is proposed as a web-based system, which removes the platform-dependence problem and provides online availability.

  • 84.
    Bakht, Syed Sikandar
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Ahmad, Qazi Sohail
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    A Multi Agent Web Based Simulation Model for Evaluating Container Terminal Management2008Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    This thesis provides a software prototype of a Container Terminal (CT) Management system built with Multi-Agent Systems technology. The goal of this research work was to solve the management issues present in a CT. The software prototype can be implemented as simulation software that will help terminal managers take the decisions necessary for better CT productivity. CTs struggle with taking proper management decisions: many policies are implemented, but choosing the right policy at the right time is the main issue. With simulation software it is possible to visualize the effects of decisions taken under a given policy and see the expected output, which can really improve the performance of a CT. The management decision problem is addressed by modeling the whole CT in a computer modeling language. The prototype represents all the actors appearing in a CT as agents, and these agents are responsible for carrying out certain tasks. The prototype, along with a partial implementation, is the final contribution. The model is proposed as a web-based system, which removes the platform-dependence problem and provides online availability.

  • 85. Ballal, Tarig
    et al.
    Grbic, Nedelko
    Mohammed, Abbas
    A Simple and Computationally Efficient Algorithm for Real-Time Blind Source Separation of Speech Mixtures2006Conference paper (Refereed)
    Abstract [en]

    In this paper we exploit the amplitude diversity provided by two sensors to achieve blind separation of two speech sources. We propose a simple and highly computationally efficient method for separating sources that are W-disjoint orthogonal (W-DO), that is, sources whose time-frequency representations are disjoint sets. The Degenerate Unmixing and Estimation Technique (DUET), a powerful and efficient method that exploits the W-disjoint orthogonality property, requires extensive computations for maximum likelihood parameter learning. Our proposed method avoids all the computations required for parameter estimation by assuming that the sources are "cross high-low diverse (CH-LD)", an assumption that is explained later and that can be satisfied by exploiting the sensor settings/directions. With this assumption and the W-disjoint orthogonality property, two binary time-frequency masks that can extract the original sources from one of the two mixtures can be constructed directly from the amplitude ratios of the time-frequency points of the two mixtures. The method works very well when tested with both artificial and real mixtures. Its performance is comparable to DUET, while requiring only 2% of the computations of the DUET method. Moreover, it is free of the convergence problems that lead to poor SIR in the first parts of the signals. As with all binary masking approaches, the method suffers from artifacts that appear in the output signals.
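    A toy sketch of the binary time-frequency masking step (far simpler than the paper's method: two pure tones stand in for W-disjoint orthogonal speech, and the two sensor gains are chosen by hand so that the CH-LD amplitude-diversity assumption holds):

    ```python
    import numpy as np

    fs = 8000
    t = np.arange(fs) / fs
    s1 = np.sin(2 * np.pi * 440 * t)    # source 1
    s2 = np.sin(2 * np.pi * 1700 * t)   # source 2: disjoint spectra, so W-DO holds

    # Amplitude diversity across the two sensors (illustrative gains)
    x1 = 1.0 * s1 + 0.3 * s2
    x2 = 0.3 * s1 + 1.0 * s2

    def stft(x, win=256, hop=128):
        """Hanning-windowed short-time Fourier transform (frames x bins)."""
        w = np.hanning(win)
        return np.array([np.fft.rfft(w * x[i:i + win])
                         for i in range(0, len(x) - win, hop)])

    X1, X2 = stft(x1), stft(x2)
    mask = np.abs(X1) > np.abs(X2)      # source 1 dominates where mixture 1 is stronger
    Y1 = np.where(mask, X1, 0.0)        # binary-masked estimate of source 1
    Y2 = np.where(~mask, X2, 0.0)       # and of source 2
    ```

    The mask comes straight from the per-bin amplitude ratio, with no parameter learning, which is what makes this family of methods so much cheaper than DUET's maximum likelihood stage.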

  • 86. Ballal, Tarig
    et al.
    Grbic, Nedelko
    Mohammed, Abbas
    Sensor Array Blind Source Separation using Time-Frequency Masking2006Conference paper (Refereed)
  • 87.
    Bardici, Nick
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Skarin, Björn
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Röstigenkänning genom Hidden Markov Model: En implementering av teorin på DSP2006Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This master's degree project shows how to implement a speech recognition system on a DSK (ADSP-BF533 EZ-KIT LITE REV 1.5) based on the theory of the Hidden Markov Model (HMM). The implementation is based on the theory in the master's degree project Speech Recognition using Hidden Markov Model by Mikael Nilsson and Marcus Ejnarsson, MEE-01-27. The work accomplished in the project is, with reference to that theory, the implementation of an MFCC (Mel Frequency Cepstrum Coefficient) function, a training function that creates Hidden Markov Models of specific utterances, and a testing function that tests utterances against the models created by the training function. These functions were first created in MatLab, and the test function was then implemented on the DSK. An evaluation of the implementation is performed.
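    The testing function rests on scoring an utterance against each trained HMM and picking the best. A minimal sketch of the scaled forward algorithm that computes such a score is shown below (toy parameters, not models trained on speech):

    ```python
    import numpy as np

    # Toy discrete HMM (illustrative parameters, not the thesis's models)
    A  = np.array([[0.7, 0.3],
                   [0.4, 0.6]])             # state-transition probabilities
    B  = np.array([[0.5, 0.4, 0.1],
                   [0.1, 0.3, 0.6]])        # emission probabilities (3 symbols)
    pi = np.array([0.6, 0.4])               # initial state distribution

    def forward_loglik(obs):
        """Scaled forward algorithm: log P(obs | A, B, pi)."""
        alpha = pi * B[:, obs[0]]
        log_p = np.log(alpha.sum())
        alpha = alpha / alpha.sum()         # rescale to avoid underflow
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
            log_p += np.log(alpha.sum())
            alpha = alpha / alpha.sum()
        return log_p
    ```

    Recognition then amounts to `argmax` of `forward_loglik` over the per-utterance models; in practice the observation symbols would come from vector-quantised MFCC frames.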

  • 88.
    Barke, Daniel
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Are we ready for Agile Development?2009Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    In the rapidly changing market of today, companies need to be responsive and react quickly both to their competitors' behaviour and to changes in their own technical environment. In this thesis I have examined the agile characteristics of a number of companies in Stockholm, with focus on three agile concepts: Scrum, eXtreme Programming and Test Driven Development. The work started with a prestudy, in which I identified the criteria that a company needs to fulfil in order to be considered agile. This resulted in four main categories of characteristics: Quality, Flexibility, Communication and Competence. These characteristics were then investigated through a combination of a quantitative study and a case study. While the results mostly lean towards agile rather than non-agile behaviour, it was shown that a lot of work still remains, for instance regarding improvements in the communication area and in the way these companies apply the agile methodologies examined.

  • 89. Barney, Sebastian
    et al.
    Aurum, Aybüke
    Wohlin, Claes
    A Product Management Challenge: Creating Software Product Value through Requirements Selection2008In: Journal of systems architecture, ISSN 1383-7621, E-ISSN 1873-6165, Vol. 54, no 6, p. 576-593Article in journal (Refereed)
    Abstract [en]

    It is important for a software company to maximize value creation for a given investment. The purpose of requirements engineering activities is to add business value that is accounted for in terms of return on investment of a software product. This paper provides insight into the release planning processes used in the software industry to create software product value, by presenting three case studies. It examines how IT professionals perceive value creation through requirements engineering and how the release planning process is conducted to create software product value. It also presents to what degree the major stakeholders' perspectives are represented in the decision-making process. Our findings show that the client and market base of the software product represents the most influential group in the decision to implement specific requirements. This is reflected both in terms of deciding the processes followed and the decision-making criteria applied when selecting requirements for the product. Furthermore, the management of software product value is dependent on the context in which the product exists. Factors such as the maturity of the product, the marketplace in which it exists, and the development tools and methods available influence the criteria that decide whether a requirement is included in a specific project or release.

  • 90. Barney, Sebastian
    et al.
    Aurum, Aybüke
    Wohlin, Claes
    A product management challenge: Creating software product value through requirements selection2008Conference paper (Refereed)
    Abstract [en]

    It is important for a software company to maximize value creation for a given investment. The purpose of requirements engineering activities is to add business value that is accounted for in terms of return on investment of a software product. This paper provides insight into the release planning processes used in the software industry to create software product value, by presenting three case studies. It examines how IT professionals perceive value creation through requirements engineering and how the release planning process is conducted to create software product value. It also presents to what degree the major stakeholders' perspectives are represented in the decision-making process. Our findings show that the client and market base of the software product represents the most influential group in the decision to implement specific requirements. This is reflected both in terms of deciding the processes followed and the decision-making criteria applied when selecting requirements for the product. Furthermore, the management of software product value is dependent on the context in which the product exists. Factors such as the maturity of the product, the marketplace in which it exists, and the development tools and methods available influence the criteria that decide whether a requirement is included in a specific project or release.

  • 91. Barney, Sebastian
    et al.
    Aurum, Aybüke
    Wohlin, Claes
    Quest for a Silver Bullet: Creating Software Product Value through Requirements Selection2006Conference paper (Refereed)
  • 92. Barney, Sebastian
    et al.
    Wohlin, Claes
    Software Product Quality: Ensuring a Common Goal2009Conference paper (Refereed)
    Abstract [en]

    Software qualities are in many cases tacit and hard to measure. Thus, there is a potential risk that they get lower priority than deadlines, cost and functionality. Yet software qualities impact customers, profits and even developer efficiency. This paper presents a method to evaluate the priority of software qualities in an industrial context. The method is applied in an exploratory case study, in which the ISO 9126 model for software quality is combined with Theory-W to create a process for evaluating the alignment between success-critical stakeholder groups in the area of software product quality. The results of the case study are then presented and discussed. It is shown that the method provides valuable information about software qualities.

  • 93.
    Bartunek, Josef Ström
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Minutiae Extraction from Fingerprint with Neural Network and Minutiae based Fingerprint Verification2004Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Human fingerprints are rich in details called minutiae, which can be used as identification marks for fingerprint verification. The goal of this thesis is to develop a complete system for fingerprint verification through extracting and matching minutiae. A neural network is trained using the back-propagation algorithm and works as a classifier to locate various minutiae. To achieve good minutiae extraction in fingerprints of varying quality, preprocessing in the form of binarization and skeletonization is first applied to the fingerprints before they are evaluated by the neural network. The extracted minutiae are then treated as a 2D point pattern problem, and an algorithm is used to determine the number of matching points between two point patterns. The performance of the developed system is evaluated on a database with fingerprints from different people, and experimental results are presented.
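    The point-pattern matching step can be sketched as a greedy nearest-neighbour count (illustrative only: a real matcher, presumably including the thesis's algorithm, must first align the patterns for rotation and translation, and all coordinates below are made up):

    ```python
    import numpy as np

    def match_minutiae(set_a, set_b, tol=8.0):
        """Greedy count of minutiae pairs within `tol` pixels (alignment assumed done)."""
        unused = list(range(len(set_b)))
        matches = 0
        for p in set_a:
            if not unused:
                break
            d = [np.hypot(*(p - set_b[j])) for j in unused]  # distances to free points
            j_best = int(np.argmin(d))
            if d[j_best] <= tol:        # close enough -> pair them up, consume the point
                unused.pop(j_best)
                matches += 1
        return matches

    # Hypothetical minutiae coordinates (pixels)
    template = np.array([[10.0, 20.0], [55.0, 80.0], [120.0, 40.0]])
    probe = template + np.array([2.0, -1.0])     # same finger, slight displacement
    imposter = np.array([[200.0, 5.0], [90.0, 150.0], [14.0, 140.0]])

    print(match_minutiae(template, probe))     # 3
    print(match_minutiae(template, imposter))  # 0
    ```

    A verification decision would then threshold the match count (or a normalised score) against the sizes of the two point sets.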

  • 94. Bartunek, Josef Ström
    et al.
    Nilsson, Mikael
    Nordberg, Jörgen
    Claesson, Ingvar
    Adaptive Fingerprint Binarization by Frequency Domain Analysis2006Conference paper (Other academic)
    Abstract [en]

    This paper presents a new approach to fingerprint enhancement using directional filters and binarization. A straightforward method for automatically tuning the size of the local area is obtained by analyzing the entire fingerprint image in the frequency domain. Hence, the algorithm adjusts adaptively to the local area of the fingerprint image, independently of the characteristics of the fingerprint sensor or the physical appearance of the fingerprints. Frequency analysis is carried out in the local areas to design the directional filters. Experimental results are presented.

  • 95. Bartunek, Josef Ström
    et al.
    Nilsson, Mikael
    Nordberg, Jörgen
    Claesson, Ingvar
    Improved Adaptive Fingerprint Binarization2008Conference paper (Refereed)
    Abstract [en]

    In this paper, improvements to a previous work are presented. Removal of the redundant artifacts in the fingerprint mask is introduced, enhancing the final result. The proposed method is an entirely adaptive process that adjusts to each fingerprint without any further supervision by the user. Hence, the algorithm is insensitive to the characteristics of the fingerprint sensor and the various physical appearances of the fingerprints. Further, a detailed description of the fingerprint mask generation, not fully described in the previous work, is presented, together with the improved experimental results.

  • 96. Bartunek, Josef Ström
    et al.
    Nilsson, Mikael
    Nordberg, Jörgen
    Claesson, Ingvar
    Neural Network based Minutiae Extraction from Skeletonized Fingerprints2006Conference paper (Other academic)
    Abstract [en]

    Human fingerprints are rich in details denoted minutiae. In this paper a method of minutiae extraction from fingerprint skeletons is described. To identify the different shapes and types of minutiae a neural network is trained to work as a classifier. The proposed neural network is applied throughout the fingerprint skeleton to locate various minutiae. A scheme to speed up the process is also presented. Extracted minutiae can then be used as identification marks for automatic fingerprint matching.

  • 97.
    Basa, Srinivas
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Ganji, Naveen
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Enhanced NMS Tool Architecture for Discovery and Monitoring of Nodes2008Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This thesis work provides an architecture for the discovery and monitoring of nodes in a network with improved performance and security. The proposed work addresses limitations identified within the scope of this thesis. These limitations are identified by analyzing some of the better existing monitoring tools on the market and their use of different protocols. The proposed work uses different protocols depending on the situation of the problem that exists in a network. The existing network monitoring tools are analyzed by measuring performance metrics and examining their limitations. We propose a new architecture motivated by traditional network monitoring tools, with subtle changes, and the proposed architecture is also conceptually evaluated for its viability.

  • 98.
    Basit, Syed Abdul
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Malik, Omar
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Planning and Analysis of Knowledge Intensive Enterprise Resource Planning Systems, 2008. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    ERP software and applications have become a basic requirement of almost every organization in order to compete under time constraints. To develop an efficient application, project planning and analysis play a very important role in better understanding the problem domain and providing a low-risk solution. There are many different approaches that software developers use to develop systems; these traditional approaches have drawbacks and constraints, being either ad hoc or bound to fixed patterns and rules. We discuss these techniques and suggest that the planning and analysis of an ERP application during its development can be done by applying the more appropriate knowledge-engineering CommonKADS model. CommonKADS is a structured approach comprising different model suites. The thesis shows that by using the CommonKADS model for project planning and analysis, the real problem domain and an efficient solution can be identified. The domain processes are identified, the tasks related to each process in the domain are identified, and the knowledge assets related to each task are identified. These features help in defining a real knowledge specification. In this way, ERP applications can be made knowledge-based. ERP systems were introduced to solve different organizational problems and provide an integrated structure. Although ERP packages offer advantages to enterprises, they have not achieved many of their anticipated benefits: autonomous and heterogeneous applications co-exist in companies alongside ERP systems, and the integration problem has not been addressed. This thesis seeks to make some suggestions in this area by studying and analyzing ERP problems through mapping the CommonKADS methodology in a case study. The thesis begins with an overview of ERP applications, knowledge engineering and the CommonKADS methodology.
    In the end, the thesis presents our contribution, a case study of an Online Courses Registration Portal for BTH, which shows that planning and analysis of ERP applications using the CommonKADS methodology helps in reaching knowledge-based and more accurate solutions.
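
    The process-to-task-to-knowledge-asset mapping the abstract describes can be sketched as a small data model; the process and asset names below are hypothetical illustrations loosely echoing the course-registration case study, not taken from the thesis.

    ```python
    # Sketch of a CommonKADS-style knowledge specification: domain processes
    # own tasks, and each task depends on named knowledge assets.
    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str
        knowledge_assets: list = field(default_factory=list)

    @dataclass
    class DomainProcess:
        name: str
        tasks: list = field(default_factory=list)

    def knowledge_specification(processes):
        """Flatten process -> task -> asset into a task-to-assets mapping."""
        return {task.name: list(task.knowledge_assets)
                for proc in processes for task in proc.tasks}

    # Hypothetical example in the spirit of the BTH registration portal
    registration = DomainProcess("course registration", [
        Task("validate student", ["admission rules"]),
        Task("enrol in course", ["prerequisite catalogue", "seat quota policy"]),
    ])
    spec = knowledge_specification([registration])
    ```

    Making such dependencies explicit is what turns the plan into a knowledge specification: each task's required assets are visible before implementation begins.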

  • 99.
    Bassey, Isong
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Adedigba, Adetayo
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Structural Design of an RFID-Based System: a way of solving some election problems in Africa, 2008. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    In this thesis work, two major problems confronting election systems in Africa, multiple registrations and the diversion/shortage of election materials, are addressed, taking the Nigerian context into consideration. These problems have been described as so corrosive in nature that ICTs in the form of e-voting, if fully implemented, would only compound or exacerbate the current situation due to poor ICT awareness on the continent. However, in order to contain these problems with some form of ICT tools alongside the traditional election system, we propose an RFID-based framework in which voter identification and election materials are RFID-based. We believe this will enable effective and efficient identification and tracking. Operations similar to supply-chain and inventory management are utilized. Benefits resulting from the adoption of this framework, such as a national ID card and a national register, are also addressed.
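
    The multiple-registration problem reduces to keying the voter roll on a unique tag identifier; the sketch below illustrates that idea under our own assumed names (it is not the thesis's design).

    ```python
    # Sketch: rejecting multiple registrations by keying the voter roll
    # on unique RFID tag IDs. Names and IDs are illustrative.
    class VoterRegistry:
        def __init__(self):
            self._roll = {}  # tag_id -> voter name

        def register(self, tag_id, voter_name):
            """Accept a registration only if the tag has not been seen."""
            if tag_id in self._roll:
                return False  # duplicate tag: multiple registration rejected
            self._roll[tag_id] = voter_name
            return True

    registry = VoterRegistry()
    first = registry.register("TAG-001", "A. Person")    # accepted
    second = registry.register("TAG-001", "A. Person")   # rejected duplicate
    ```

    The same keyed-lookup pattern extends to tracking tagged election materials, which is where the supply-chain and inventory analogy in the abstract comes in.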

  • 100. Bayer, J.
    et al.
    Eisenbarth, M.
    Lehner, T.
    Petersen, Kai
    Service Engineering Methodology, 2008. In: Semantic Service Provisioning / [ed] Kuropka, D.; Tröger, P.; Staab, S.; Weske, M., Berlin: Springer Verlag, 2008, p. 185-202. Chapter in book (Refereed)