Search results: 1 - 50 of 1407
  • 1.
    Abari, Farzad Foroughi
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Optimization of Audio Processing algorithms (Reverb) on ARMv6 family of processors (2008). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Audio processing algorithms are increasingly used in cell phones, and today's customers are placing more demands on them. Phones nowadays do more than just provide the user with MP3 playback or advanced audio effects; these features have become an integral part of medium- as well as low-end phones. On the other hand, there is also an endeavor to build as much quality as possible into products in order to compete in the market and satisfy users' needs. Tackling the above requirements has been partly enabled by advances in hardware design and manufacturing technology. However, as new hardware emerges on the market, the need arises for the competence to write efficient software and exploit the new features thoroughly and effectively. Even though compilers are keeping up with the new tide, room for hand-optimized code still exists. With this goal in mind, an effort was made in this thesis to partly cover the competence requirement at the Multimedia Section (part of Ericsson Mobile Platforms) to develop optimized code for new processors. Forging persistently ahead with new products, EMP has always incorporated the latest technology into its products, among which the ARMv6 family of processors has the central processing role in a number of upcoming products. To fully exploit the latest features provided by ARMv6, it was required to probe its new instruction set, among which the new media processing instructions are of utmost importance. In order to execute DSP-intensive algorithms (e.g. audio processing algorithms) efficiently, the implementation should be done in low-level code applying the available instruction set. Meanwhile, ARMv6 comes with a number of new features in comparison with its predecessors; SIMD (Single Instruction Multiple Data) and VFP (Vector Floating Point) are the most prominent media processing improvements in ARMv6.
    Aligned with the thesis goals and guidelines, the Reverb algorithm, one of the most complicated audio features on hand-held devices, was probed. Its kernel parts were identified, and the implementation was done both in fixed point and floating point using the available resources on the hardware. In addition, the execution time and the amount of code memory for each part were measured and provided in tables and charts for comparison purposes. Conclusions were finally drawn based on the developed code's efficiency over the ARM compiler's, as well as over existing code already developed and tailored to ARMv5 processors. The main criterion for optimization was execution time. Moreover, the quantization effect due to limited-precision fixed-point arithmetic was formulated and its effect on quality elaborated. The outcomes clearly indicate that hand optimization of the kernel parts is superior to the compiler-optimized alternative, both in terms of code memory and execution time. The results also confirmed the presumption that hand-optimized code using the new instruction set can improve efficiency by on average 25%-50%, depending on the algorithm structure and its interaction with other parts of the audio effect. Despite its many drawbacks, fixed-point implementation remains the dominant implementation for the majority of DSP algorithms on low-power devices.
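    The fixed-point versus floating-point trade-off this thesis measures can be illustrated with a small sketch. The Q15 format below is a common 16-bit fixed-point convention on ARM cores; the snippet is illustrative only (the thesis's actual kernels target the ARMv6 instruction set, not Python) and shows the quantization error that limited precision introduces.

```python
# Minimal sketch of Q15 fixed-point arithmetic, the kind of limited-precision
# format the thesis compares against floating point. Illustrative only.

Q = 15                      # Q15: 1 sign bit, 15 fractional bits
SCALE = 1 << Q

def to_q15(x: float) -> int:
    """Quantize a float in [-1, 1) to a 16-bit Q15 integer."""
    return max(-SCALE, min(SCALE - 1, round(x * SCALE)))

def q15_mul(a: int, b: int) -> int:
    """Fixed-point multiply: 32-bit product shifted back down to Q15."""
    return (a * b) >> Q

def from_q15(x: int) -> float:
    return x / SCALE

# Quantization effect: the fixed-point product deviates slightly from the
# floating-point result, which is the error source the thesis formulates.
a, b = 0.70710678, 0.5
exact = a * b
approx = from_q15(q15_mul(to_q15(a), to_q15(b)))
print(exact, approx, abs(exact - approx))
```

    On real ARMv6 hardware the multiply-and-shift above maps onto the media instructions the thesis hand-optimizes, which is why fixed point remains dominant on low-power devices despite the quantization error.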

  • 2.
    Abelsson, Sara
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Propagation Measurements at 3.5 GHz for WiMAX (2007). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    Propagation measurements at the frequency 3.5 GHz for the WiMAX technology have been conducted. The purpose of these measurements is to enable a coverage analysis. The mathematical software package MATLAB has been used to analyze the data collected from the measurement campaign. Path loss models have also been used, and a comparison between these models and the collected data has been performed. A prediction tool from an application called WRAP has also been used in the comparison with the collected data. In this thesis, diff
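    As a hedged illustration of what a path loss model computes, the snippet below evaluates the standard free-space path loss formula at the measurement frequency of 3.5 GHz. The thesis compares its data against its own set of models, which are not named in this excerpt; free-space loss is only a baseline example, not the thesis's model.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss at 3.5 GHz over typical measurement distances
for d in (100, 500, 1000):
    print(f"{d:5d} m : {fspl_db(d, 3.5e9):6.1f} dB")
```

    Measured urban or suburban losses are normally well above this free-space floor, which is exactly the gap a coverage analysis quantifies.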

  • 3.
    Abrahamsson, Charlotte
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Wessman, Mattias
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    WLAN Security: IEEE 802.11b or Bluetooth - which standard provides best security methods for companies? (2004). Independent thesis, Basic level (degree of Bachelor), Student thesis.
    Abstract [en]

    Which security holes and security methods do IEEE 802.11b and Bluetooth offer? Which standard provides the best security methods for companies? These are the two questions that this thesis addresses. The purpose is to give companies more information about the security aspects that come with using WLANs. An introduction to the subject of WLANs is presented in order to give an overview before the description of the two WLAN standards, IEEE 802.11b and Bluetooth. The thesis gives an overview of how IEEE 802.11b and Bluetooth work; an in-depth description of the security issues of the two standards, the security methods available to companies, the security flaws, and what can be done in order to create a secure WLAN are all important aspects of this thesis. In order to give guidance on which WLAN standard to choose, a comparison of the two standards with the security issues in mind, from a company's point of view, is described. We present our conclusion, which entails a recommendation to companies to use Bluetooth over IEEE 802.11b, since it offers better security methods.

  • 4.
    Abu-Sheikh, Khalil
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Reviewing and Evaluating Techniques for Modeling and Analyzing Security Requirements (2007). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    The software engineering community recognized the importance of addressing security requirements along with other functional requirements from the beginning of the software development life cycle, and some techniques have been developed to achieve this goal. We therefore conducted a theoretical study that focuses on reviewing and evaluating some of the techniques that are used to model and analyze security requirements. The Abuse Cases, Misuse Cases, Data Sensitivity and Threat Analyses, Strategic Modeling, and Attack Trees techniques are investigated in detail to understand and highlight the similarities and differences between them. We found that using these techniques, in general, helps requirements engineers to specify more detailed security requirements. Also, all of these techniques cover the concepts of security, but at different levels. In addition, the existence of different techniques provides a variety of levels for modeling and analyzing security requirements. This helps requirements engineers to decide which technique to use in order to address security issues for the system under investigation. Finally, we found that using only one of these techniques is not sufficient to satisfy the security requirements of the system under investigation. Consequently, we consider that it would be beneficial to combine the Abuse Cases or Misuse Cases techniques with the Attack Trees technique, or to combine the Strategic Modeling and Attack Trees techniques, in order to model and analyze security requirements. The concentration on the Attack Trees technique is due to the reusability of the produced attack trees; this technique also helps in covering a wide range of attacks, thus covering security concepts as well as security requirements in a proper way.
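    To make the Attack Trees technique concrete, here is a minimal sketch of the data structure and one standard analysis over it (the cheapest way to reach the root goal). The node names and costs are invented for illustration and are not taken from the thesis.

```python
# A minimal attack-tree sketch: each node is either a leaf attack step with a
# cost, or an AND/OR combination of sub-attacks.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    gate: str = "LEAF"            # "LEAF", "AND" or "OR"
    cost: float = 0.0             # cost of a leaf attack step
    children: list = field(default_factory=list)

def min_cost(node: Node) -> float:
    """Cheapest cost for the attacker to realize this (sub)goal."""
    if node.gate == "LEAF":
        return node.cost
    child_costs = [min_cost(c) for c in node.children]
    # AND: all sub-attacks are needed; OR: the attacker picks the cheapest.
    return sum(child_costs) if node.gate == "AND" else min(child_costs)

tree = Node("open safe", "OR", children=[
    Node("pick lock", cost=30),
    Node("learn combo", "AND", children=[
        Node("bribe employee", cost=20),
        Node("find written combo", cost=5),
    ]),
])
print(min_cost(tree))  # cheapest path through the tree
```

    The reusability the abstract mentions follows directly from this shape: a subtree like "learn combo" can be grafted into the tree of any other goal.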

  • 5.
    Adeyinka, Oluwaseyi
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Service Oriented Architecture & Web Services: Guidelines for Migrating from Legacy Systems and Financial Consideration (2008). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    The purpose of this study is to present guidelines that can be followed when introducing service-oriented architecture through the use of Web services. These guidelines will be especially useful for organizations migrating from their existing legacy systems, where the need also arises to consider the financial implications of such an investment, whether it is worthwhile or not. The proposed implementation guide aims at increasing the chances of IT departments in organizations to ensure a successful integration of SOA into their systems and to secure strong financial commitment from executive management. Service-oriented architecture is a new concept, a new way of looking at a system, which has emerged in the IT world and can be implemented by several methods, of which Web services is one platform. Since it is a developing technology, organizations need to be cautious about how they implement it in order to obtain maximum benefits. Though a well-designed, service-oriented environment can simplify and streamline many aspects of information technology and business, achieving this state is not an easy task. Traditionally, management finds it very difficult to justify the considerable cost of modernization, let alone shoulder the risk, without achieving some benefits in terms of business value. The study identifies some common best practices for implementing SOA and using Web services, and steps to successfully migrate from legacy systems to componentized or service-enabled systems. The study also identifies how to present financial return on investment and business benefits to management in order to secure the necessary funds. This master's thesis is based on an academic literature study, professional research journals and publications, and interviews with business organizations currently working on service-oriented architecture. I present guidelines that can assist in migrating from legacy systems to service-oriented architecture, based on an analysis comparing the information sources mentioned above.

  • 6.
    Adolfsson, Henrik
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Svensson, Peter
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Design and implementation of the MMS portal (2006). Independent thesis, Basic level (degree of Bachelor), Student thesis.
    Abstract [en]

    MMS-enabled terminals on the market today are very complicated to use. It takes several steps to create a multi-slide MMS message with images and text, which discourages users. To increase the usage of MMS, several companies provide web-based or stand-alone programs that allow users to create and send MMS messages from a regular computer. However, these editors have many limitations and are not user-friendly. This thesis describes the design and implementation of a user-friendly web-based MMS portal where users can create, edit and send MMS messages. The portal is integrated into Densitet's system for development of mobile services. A conclusion that can be drawn from this work is that problems with MMS interoperability are mostly due to poor standardization. Different terminals support different types of image and sound formats, and to make the MMS portal user-friendly, format conversions of uploaded content had to be implemented. Also, the MMS portal only supports basic MMS functionality. If the MMS specification includes more audio and image formats, and if MMS terminals are upgraded to handle these formats, sending MMS messages will become easier and mobile messaging will continue to grow.

  • 7.
    Aftarczuk, Kamila
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Evaluation of selected data mining algorithms implemented in Medical Decision Support Systems (2007). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    The goal of this master's thesis is to identify and evaluate data mining algorithms which are commonly implemented in modern Medical Decision Support Systems (MDSS). These systems are used in various healthcare units all over the world, and the institutions running them store large amounts of medical data. This data may contain relevant medical information hidden in various patterns buried among the records. Within the research, several popular MDSSs are analyzed in order to determine the most common data mining algorithms utilized by them. Three algorithms have been identified: Naïve Bayes, Multilayer Perceptron and C4.5. Prior to the analyses the algorithms are calibrated: several configurations are tested in order to determine the best settings for each algorithm. Afterwards, a final comparison orders the algorithms with respect to their performance. The evaluation is based on a set of performance metrics. The analyses are conducted in WEKA on five UCI medical datasets: breast cancer, hepatitis, heart disease, dermatology disease and diabetes. The analyses have shown that it is very difficult to single out one data mining algorithm as the most suitable for medical data. The results obtained for the algorithms were very similar. However, the final evaluation of the outcomes allowed singling out Naïve Bayes as the best classifier for the given domain, followed by the Multilayer Perceptron and C4.5.
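    As a sketch of the classifier the thesis singles out, the snippet below implements a minimal Gaussian Naïve Bayes from scratch on invented toy data. The thesis itself uses WEKA's implementations on real UCI datasets; this is only meant to show the class-conditional independence assumption at work.

```python
# Gaussian Naïve Bayes: per-class feature means/variances plus class priors;
# prediction maximizes log prior + sum of per-feature log likelihoods.

import math

def fit(X, y):
    """Per-class mean/variance for each feature, plus class priors."""
    stats = {}
    for cls in set(y):
        rows = [x for x, c in zip(X, y) if c == cls]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        varis = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
                 for col, m in zip(zip(*rows), means)]
        stats[cls] = (means, varis, len(rows) / len(y))
    return stats

def predict(stats, x):
    """Pick the class maximizing the log posterior."""
    def log_post(cls):
        means, varis, prior = stats[cls]
        ll = sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                 for xi, m, v in zip(x, means, varis))
        return math.log(prior) + ll
    return max(stats, key=log_post)

# Toy "patients": [age, blood pressure] -> diagnosis 0/1 (invented data)
X = [[25, 110], [30, 115], [28, 112], [60, 150], [65, 160], [62, 155]]
y = [0, 0, 0, 1, 1, 1]
model = fit(X, y)
print(predict(model, [27, 111]), predict(model, [63, 152]))
```

    The naïve independence assumption rarely holds exactly in medical data, yet, as the thesis finds, the classifier is often competitive anyway.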

  • 8. Afzal, Wasif
    Lessons from applying experimentation in software engineering prediction systems (2008). Conference paper (Refereed).
    Abstract [en]

    Within software engineering prediction systems, experiments are undertaken primarily to investigate relationships and to measure or compare models' accuracy. This paper discusses our experience and presents useful lessons and guidelines for experimenting with software engineering prediction systems. For this purpose, we use a typical software engineering experimentation process as a baseline. We found that the typical experimentation process is supportive in developing prediction systems, and we have highlighted issues more central to the domain of software engineering prediction systems.

  • 9.
    Afzal, Wasif
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Metrics in Software Test Planning and Test Design Processes (2007). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    Software metrics play an important role in measuring attributes that are critical to the success of a software project. Measurement of these attributes helps to make the characteristics of, and relationships between, the attributes clearer, which in turn supports informed decision making. The field of software engineering is affected by infrequent, incomplete and inconsistent measurements. Software testing is an integral part of software development, providing opportunities for measurement of process attributes. The measurement of software testing process attributes enables management to have better insight into the software testing process. The aim of this thesis is to investigate the metric support for the software test planning and test design processes. The study comprises an extensive literature study and follows a methodical approach consisting of two steps. The first step analyzes key phases in the software testing life cycle, the inputs required for starting the software test planning and design processes, and the metrics indicating the end of the software test planning and test design processes. After establishing a basic understanding of the related concepts, the second step identifies the attributes of the software test planning and test design processes, including metric support for each of the identified attributes. The results of the literature survey showed that there are a number of different measurable attributes for the software test planning and test design processes. The study partitioned these attributes into multiple categories. For each attribute, different existing measurements are studied. A consolidation of these measurements is presented in this thesis, intended to provide an opportunity for management to consider improvements in these processes.

  • 10. Afzal, Wasif
    Search-based approaches to software fault prediction and software testing (2009). Licentiate thesis, comprehensive summary (Other academic).
    Abstract [en]

    Software verification and validation activities are essential for software quality but also constitute a large part of software development costs. Therefore, efficient and cost-effective software verification and validation activities are both a priority and a necessity, considering the pressure to decrease time-to-market and the intense competition faced by many, if not all, companies today. It is then perhaps not unexpected that decisions related to software quality (when to stop testing, the testing schedule, testing resource allocation) need to be as accurate as possible. This thesis investigates the application of search-based techniques within two activities of software verification and validation: software fault prediction and software testing for non-functional system properties. Software fault prediction modeling can provide support for making the important decisions outlined above. In this thesis we empirically evaluate symbolic regression using genetic programming (a search-based technique) as a potential method for software fault prediction. Using data sets from both industrial and open-source software, the strengths and weaknesses of applying symbolic regression in genetic programming are evaluated against competitive techniques. In addition to software fault prediction, this thesis also consolidates available research into predictive modeling of other attributes by applying symbolic regression in genetic programming, thus presenting a broader perspective. As an extension of the application of search-based techniques within software verification and validation, this thesis further investigates the extent to which search-based techniques are applied for testing non-functional system properties. Based on the research findings in this thesis, it can be concluded that applying symbolic regression in genetic programming may be a viable technique for software fault prediction. We additionally seek literature evidence of where other search-based techniques are applied for testing of non-functional system properties, thereby contributing towards the growing application of search-based techniques in diverse activities within software verification and validation.

  • 11. Afzal, Wasif
    Using faults-slip-through metric as a predictor of fault-proneness (2010). Conference paper (Refereed).
    Abstract [en]

    The majority of software faults are present in a small number of modules; therefore, accurate prediction of fault-prone modules helps improve software quality by focusing testing efforts on a subset of modules. This paper evaluates the use of the faults-slip-through (FST) metric as a potential predictor of fault-prone modules. Rather than predicting the fault-prone modules for the complete test phase, the prediction is done at the specific test levels of integration and system test. We applied eight classification techniques, representing a variety of approaches, to the task of identifying fault-prone modules: a standard statistical technique for classification (logistic regression), tree-structured classifiers (C4.5 and random forests), a Bayesian technique (Naïve Bayes), machine-learning techniques (support vector machines and back-propagation artificial neural networks) and search-based techniques (genetic programming and artificial immune recognition systems), on FST data collected from two large industrial projects in the telecommunication domain. Results: Using the area under the receiver operating characteristic (ROC) curve and the location of (PF, PD) pairs in the ROC space, GP showed impressive results in comparison with the other techniques for predicting fault-prone modules at both the integration and system test levels, and the use of the faults-slip-through metric in general provided good prediction results at the two test levels. (i) The accuracy of GP is statistically significant in comparison with the majority of the techniques for predicting fault-prone modules at the integration and system test levels; (ii) the faults-slip-through metric has the potential to be a generally useful predictor of fault-proneness at the integration and system test levels.
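    The area under the ROC curve used to rank the techniques can be computed directly from classifier scores via the rank-sum (Mann-Whitney) formulation: AUC is the probability that a randomly chosen fault-prone module receives a higher score than a randomly chosen fault-free one. The scores below are invented for illustration; they are not data from the paper.

```python
# AUC via the Mann-Whitney formulation, counting ties as half a win.

def auc(scores, labels):
    """Probability a positive outscores a negative (labels are 0/1)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Predicted fault-proneness scores and true labels (1 = fault-prone)
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0]
print(auc(scores, labels))
```

    An AUC of 0.5 corresponds to random guessing and 1.0 to perfect ranking, which is why it is a convenient single number for comparing eight different classifiers.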

  • 12. Afzal, Wasif
    et al.
    Torkar, Richard
    A Comparative Evaluation of Using Genetic Programming for Predicting Fault Count Data (2008). Conference paper (Refereed).
    Abstract [en]

    A number of software reliability growth models (SRGMs) have been proposed in the literature. For several reasons, such as violation of the models' assumptions and the complexity of the models, practitioners face difficulties in knowing which models to apply in practice. This paper presents a comparative evaluation of traditional models and the use of genetic programming (GP) for modeling software reliability growth, based on weekly fault count data from three different industrial projects. The motivation for using a GP approach is its ability to evolve a model based entirely on prior data, without the need to make underlying assumptions. The results show the strengths of using GP for predicting fault count data.

  • 13.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Torkar, Richard
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Incorporating Metrics in an Organizational Test Strategy (2008). Conference paper (Refereed).
    Abstract [en]

    An organizational-level test strategy needs to incorporate metrics to make the testing activities visible and available to process improvements. The majority of testing measurements are based on faults found in the test execution phase. In contrast, this paper investigates metrics to support the software test planning and test design processes. We have assembled metrics for these two process types to support management in carrying out evidence-based test process improvements and to incorporate suitable metrics as part of an organizational-level test strategy. The study is composed of two steps. The first step creates a relevant context by analyzing key phases in the software testing life cycle, while the second step identifies the attributes of the software test planning and test design processes, along with metric support for each of the identified attributes.

  • 14. Afzal, Wasif
    et al.
    Torkar, Richard
    Suitability of Genetic Programming for Software Reliability Growth Modeling (2008). Conference paper (Refereed).
    Abstract [en]

    Genetic programming (GP) has been found to be effective in finding a model that fits the given data points without making any assumptions about the model structure. This makes GP a reasonable choice for software reliability growth modeling. This paper discusses the suitability of using GP for software reliability growth modeling and highlights the mechanisms that enable GP to progressively search for fitter solutions.

  • 15. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    A Systematic Mapping Study on Non-Functional Search-Based Software Testing (2008). Conference paper (Refereed).
  • 16. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    Prediction of fault count data using genetic programming (2008). Conference paper (Refereed).
    Abstract [en]

    Software reliability growth modeling helps in deciding project release time and in managing project resources. A large number of such models have been presented in the past. Due to the existence of many models, the models' inherent complexity, and their accompanying assumptions, the selection of suitable models becomes a challenging task. This paper presents empirical results of using genetic programming (GP) for modeling software reliability growth based on weekly fault count data from three different industrial projects. The goodness of fit (adaptability) and predictive accuracy of the evolved model are measured using five different measures in an attempt to present a fair evaluation. The results show that the GP-evolved model has statistically significant goodness of fit and predictive accuracy.

  • 17.
    Agushi, Camrie
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Innovation inom Digital Rights Management (2005). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    The thesis deals with the topic of Digital Rights Management (DRM), more specifically the innovation trends within DRM. It focuses on three driving forces of DRM: firstly, DRM technologies; secondly, DRM standards; and thirdly, DRM interoperability. These driving forces are discussed and analyzed in order to explore innovation trends within DRM. In the end, a multi-faceted overview of today's DRM context is formed. One conclusion is that the aspect of Intellectual Property Rights is considered to be an important indicator of the direction in which DRM innovation is heading.

  • 18.
    Ahl, Viggo
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    An experimental comparison of five prioritization methods: Investigating ease of use, accuracy and scalability (2005). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    Requirements prioritization is an important part of developing the right product at the right time. There are different ideas about which method is the best to use when prioritizing requirements. This thesis takes a closer look at five different methods and puts them into a controlled experiment, in order to find out which of the methods would be the best to use. The experiment was designed to find out which method yields the most accurate result, each method's ability to scale up to many more requirements, the time it took to prioritize with the method, and finally how easy the method was to use. These four criteria combined indicate which method is the most suitable, i.e. the best method to use for prioritizing requirements. The chosen methods are the well-known analytic hierarchy process, the computer algorithm binary search tree, the planning game from the ideas of extreme programming, and an old but well-used method, the 100 points method. The last method is a new one, which combines the planning game with the analytic hierarchy process. Analysis of the data from the experiment indicates that the planning game combined with the analytic hierarchy process could be a good candidate. However, the results from the experiment clearly indicate that the binary search tree yields accurate results, is able to scale up, and was the easiest method to use. For these three reasons, the binary search tree is clearly the better method to use for prioritizing requirements.
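    A sketch of how the winning binary-search-tree method works: each new requirement is placed by pairwise comparisons against the nodes on one root-to-leaf path (roughly log2(n) comparisons), and an in-order traversal then reads out the full priority order. The requirement names and weights below are invented; in the experiment the comparisons were made by human subjects, not by a lookup table.

```python
# Binary-search-tree prioritization: less important goes left, more
# important goes right; in-order traversal yields ascending priority.

class BST:
    def __init__(self, req):
        self.req, self.left, self.right = req, None, None

    def insert(self, req, more_important):
        """Walk down, comparing the new requirement with each node."""
        side = "right" if more_important(req, self.req) else "left"
        child = getattr(self, side)
        if child is None:
            setattr(self, side, BST(req))
        else:
            child.insert(req, more_important)

    def in_order(self):
        """Lowest to highest priority."""
        yield from (self.left.in_order() if self.left else ())
        yield self.req
        yield from (self.right.in_order() if self.right else ())

# Hypothetical ground-truth weights stand in for a human's judgements.
weight = {"login": 5, "export": 2, "search": 4, "themes": 1, "backup": 3}

def prefer(a, b):
    """Stand-in for asking a human which requirement matters more."""
    return weight[a] > weight[b]

root = BST("search")
for r in ("export", "login", "themes", "backup"):
    root.insert(r, prefer)
print(list(root.in_order()))  # ascending priority
```

    The logarithmic number of comparisons per insertion is what makes the method scale to many requirements, one of the three criteria on which it won.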

  • 19. Ahlin, Kjell
    et al.
    Magnevall, Martin
    Josefsson, Andreas
    Simulation of forced response in linear and nonlinear mechanical systems using digital filters (2006). Conference paper (Refereed).
    Abstract [en]

    Many methods exist to calculate the forced response of mechanical systems. Some methods are slow, and the errors they introduce are unknown. This paper presents a method that uses digital filters and modal superposition, and it is shown how aliasing as well as phase errors can be avoided. The parameters describing the mechanical system are residues and poles, taken from FEA models, from lumped MCK systems, from analytic solutions or from experimental modal analysis. Modal damping may be used. The error in the calculation is derived and is shown to be a function of the sampling frequency only. When the method is applied to linear mechanical systems in MATLAB, it is very fast. The method is extended to incorporate nonlinear components. The nonlinear components can be simple, like hardening or stiffening springs, but may also contain memory, like dampers with hysteresis. The simulations are used to generate test data for the development and evaluation of methods for identification of nonlinear systems.
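    The core idea, that a mode given by a residue/pole pair maps to a one-pole digital filter and the forced response is the superposition of the filter outputs, can be sketched as below. This simplified impulse-invariance version omits the aliasing and phase-error corrections the paper contributes, and the mode data are invented for illustration.

```python
# Each mode (residue R, pole s) becomes a one-pole recursion with digital
# pole z = exp(s*dt); summing the modal outputs gives the forced response.

import cmath, math

def forced_response(modes, force, dt):
    """modes: list of (residue, pole); force: sampled input; returns y[n]."""
    out = [0.0] * len(force)
    for R, s in modes:
        z = cmath.exp(s * dt)          # digital pole from analog pole
        state = 0 + 0j
        for n, x in enumerate(force):
            state = z * state + x      # one-pole recursion
            out[n] += 2 * (R * dt * state).real  # mode + conjugate pair
    return out

# One lightly damped mode at ~10 Hz, driven by an impulse
wn, zeta = 2 * math.pi * 10, 0.02
s = complex(-zeta * wn, wn * math.sqrt(1 - zeta ** 2))
modes = [(1 / (2j * s.imag), s)]       # unit-mass SDOF residue
impulse = [1.0] + [0.0] * 999
y = forced_response(modes, impulse, dt=1e-3)
```

    The output is a decaying 10 Hz oscillation; because each mode is an independent recursion, adding modes from an FEA model or experimental modal analysis is just extending the list.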

  • 20.
    Ahmad, Arshad
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Khan, Hashim
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    The Importance of Knowledge Management Practices in Overcoming the Global Software Engineering Challenges in Requirements Understanding (2008). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [en]

    Going offshore has become the norm in current software organizations due to several benefits like the availability of competent people, cost, proximity to market and customers, time, and so on. Although Global Software Engineering (GSE) offers many benefits to software organizations, it has also created several challenges for practitioners and researchers, such as culture, communication, coordination and collaboration, team building, and so on. Requirements Engineering (RE) is a human-intensive activity and one of the most challenging and important phases in software development; it becomes even more challenging in a GSE context because of culture, communication, coordination, collaboration, and so on. Due to the aforementioned GSE factors, requirements understanding has become a challenge for software organizations involved in GSE. Furthermore, Knowledge Management (KM) is considered to be one of the most important assets of an organization, because it not only enables organizations to efficiently share and create knowledge but also helps in resolving culture, communication and coordination issues, especially in GSE. The aim of this study is to present how KM practices help globally dispersed software organizations in requirements understanding. For this purpose, a thorough literature study is performed, along with interviews at two companies, with the intent to identify useful KM practices and the challenges of requirements understanding in GSE. Based on the analysis of the identified challenges, both from the literature review and the industrial interviews, useful KM practices are presented and discussed to reduce the requirements understanding issues faced in GSE.

  • 21.
    Ahmad, Ehsan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Raza, Bilal
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Towards Optimization of Software V&V Activities in the Space Industry [Two Industrial Case Studies]2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Developing software for highly dependable space applications and systems is a formidable task. With new political and market pressures on the space industry to deliver more software at a lower cost, optimization of its methods and standards needs to be investigated. The industry has to follow standards that strictly set quality goals and prescribe engineering processes and methods to fulfill them. The overall goal of this study is to evaluate whether the current use of the ECSS standards is cost efficient, whether there are ways to make the process leaner while still maintaining quality, and whether V&V activities can be optimized. This paper presents results from two industrial case studies of companies in the European space industry that follow the ECSS standards and perform various V&V activities. The case studies reported here focus on how the ECSS standards were used by the companies, how that affected their processes, and how their V&V activities can be optimized.

  • 22.
    Ahmad, Naseer
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Security Issues in Wireless Systems2009Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Wireless communication is one of the fields of telecommunications that is growing at a tremendous speed. With the passage of time, wireless communication devices are becoming more and more common. It is no longer only a technology for business; people now use it to perform their daily tasks, be it calling, shopping, checking their emails or transferring money. Wireless communication devices include cellular phones, cordless phones, satellite phones, smart phones such as Personal Digital Assistants (PDAs), and two-way pagers, and many more devices are on their way to improve this wireless world. In order to establish two-way communication, a wireless link may use radio waves or infrared light. Wireless communication technologies have become increasingly popular in our everyday life. Hand-held devices like PDAs allow users to access calendars, mail, address and phone-number lists, and the Internet. PDAs and smart phones can store large amounts of data and connect to a broad spectrum of networks, making them as important and sensitive computing platforms as laptop PCs when it comes to an organization's security plan. Today's mobile devices offer many benefits to enterprises, but mobile phones, hand-held computers and other wireless systems are also becoming a tempting target for virus writers. Mobile devices are the new frontier for viruses, spam and other potential security threats; viruses, Trojans and worms that exploit vulnerabilities have already been created. With an increasing amount of information being sent through wireless channels, new threats are opening up. Viruses have been spreading fast as handsets increasingly resemble small computers that connect with each other and the Internet. Hackers have also discovered that many corporate wireless local area networks (WLANs) in major cities were not properly secured. Mobile phone operators say that it is only a matter of time before the wireless world is hit by the same sorts of viruses and worms that attack computer software.

  • 23.
    Ahmad, Saleem Zubair
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Analyzing Suitability of SysML for System Engineering Applications2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    During the last decade UML has had to face several tricky challenges. For instance, as a single unified, general-purpose modeling language it should offer simple and explicit semantics applicable to a wide range of domains. The significant shift of focus from software to systems has exposed the software-centric attitude of UML. Hence there is a need for a domain-specific language that can address the problems of systems rather than software only; this is the motivation for SysML. In this thesis SysML is evaluated to analyze its suitability for systems engineering applications. Evaluation criteria are established through which the appropriateness of SysML is observed over the system development life cycle. The study is conducted using a real-life case example, an automobile product. The results of the research not only provide an opportunity to look inside the SysML architecture but also offer an idea of the appropriateness of SysML for multidisciplinary product development.

  • 24.
    Ahmed, Adnan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Hussain, Syed Shahram
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Meta-Model of Resilient information System2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The role of information systems has become very important in today's world. It is not only business organizations that use information systems; governments also possess very critical information systems. The need is to make information systems available at all times and under any situation. Information systems must have the capability to resist dangers to their services, performance and existence, and to recover to their normal working state with the available resources in catastrophic situations. Information systems with such a capability can be called resilient information systems. This thesis is written to define resilient information systems, suggest a meta-model for them, and explain how existing technologies can be utilized for the development of resilient information systems.

  • 25.
    ahmed, amar
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Performance and Modeling of SIP Session Setup2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    In recent years, the transport of multimedia sessions, such as audio streams and video conferences, over IP has attracted a lot of attention, since most communication technologies are migrating to work over IP. However, sending media streams over IP networks has encountered some problems related to signaling. Ongoing research in this area has produced some solutions to this subject. The Internet Engineering Task Force (IETF) has introduced the Session Initiation Protocol (SIP), which has proved to be an efficient protocol for controlling sessions over IP. While a great deal of research has been performed on evaluating the performance of SIP and comparing it with competing protocols such as H.323, the delay caused by initiating the session has received less attention. In this document, we address the SIP session setup delay problem. In the lab, we built a test bed for running several SIP session scenarios. Using different models for those scenarios, we measured session setup delays for all of the models used. The analysis performed for each model showed that we can propose models to be applied to the SIP session setup delay components.

  • 26.
    Ahmed, Ishtiaque
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Study of the Local Backprojection Algorithm for Image Formation in Ultra Wideband Synthetic Aperture Radar2008Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The purpose of this thesis project is to study and evaluate a UWB Synthetic Aperture Radar (SAR) image formation algorithm that was previously less familiar but has recently received much attention in this field; certain of its properties have earned it a status in radar signal processing. It is a fast time-domain algorithm named Local Backprojection (LBP). The LBP algorithm has been implemented for SAR image formation and simulated in MATLAB using standard values of the pertinent parameters. An evaluation of the LBP algorithm was then performed, with all comments, estimates and judgments based on the resulting images. LBP has also been compared with the basic time-domain algorithm, Global Backprojection (GBP), with respect to the SAR images. The specialty of the LBP algorithm is its reduced computational load compared with GBP. LBP is a two-stage algorithm: it first forms the beams for a particular subimage and, in a later stage, forms the image of that subimage area. The signal data collected from the target is processed and backprojected locally for every subimage individually, which is the reason for naming it Local Backprojection. After the formation of all subimages, these are arranged and combined coherently to form the full SAR image.
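    The delay-and-sum core that GBP and LBP share can be sketched in a few lines. This is an illustrative toy, not the thesis' implementation: the sampling rate, grid and point-target setup below are assumptions. LBP differs in that this projection would be run per subimage on locally formed beams rather than over the full grid.

    ```python
    import numpy as np

    def backproject(echoes, positions, grid, c=3e8, fs=1e8):
        """Delay-and-sum backprojection of range-compressed echoes onto a pixel grid.

        echoes:    (n_pos, n_samples) range-compressed data, one row per aperture position
        positions: (n_pos, 2) platform positions
        grid:      (H, W, 2) pixel coordinates
        """
        img = np.zeros(grid.shape[:2], dtype=complex)
        for echo, pos in zip(echoes, positions):
            d = np.linalg.norm(grid - pos, axis=-1)     # range to every pixel
            idx = np.round(2 * d / c * fs).astype(int)  # two-way delay -> sample index
            valid = idx < echo.size
            img[valid] += echo[idx[valid]]              # coherent accumulation
        return img
    ```

    The computational saving of LBP comes from restricting this loop: beams are formed once per subimage, and each pixel sum then touches only the local beam data instead of the full aperture record.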

  • 27.
    Ahmed, Sabbir
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Performance of Multi-Channel Medium Access Control Protocol incorporating Opportunistic Cooperative Diversity over Rayleigh Fading Channel2006Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This thesis proposes a Medium Access Control (MAC) protocol for wireless networks, termed CD-MMAC, that utilizes multiple channels and incorporates opportunistic cooperative diversity dynamically to improve performance. The IEEE 802.11b standard allows the use of the multiple channels available at the physical layer, but its MAC protocol is designed for a single channel only. The proposed protocol utilizes multiple channels using a single interface and incorporates opportunistic cooperative diversity through a cross-layer MAC design. The new protocol leverages the multi-rate capability of IEEE 802.11b and allows wireless nodes far away from the destination node to transmit at a higher rate by using intermediate nodes as relays. The protocol improves network throughput and packet delivery ratio significantly and reduces packet delay. The performance improvement is further evaluated by simulation and analysis.

  • 28. Ahmed, Sabbir
    et al.
    Casas, Christian Ibar
    Coso, Aitor del
    Mohammed, Abbas
    Performance of Multi-Channel MAC incorporating Opportunistic Cooperative Diversity2007Conference paper (Refereed)
  • 29.
    Ahmed, Syed Rizwan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Secure Software Development: Identification of Security Activities and Their Integration in Software Development Lifecycle2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Today’s software is more vulnerable to attacks due to increases in complexity, connectivity and extensibility. Securing software is usually considered a post-development activity, and not much importance is given to it during the development of the software. However, the losses that organizations have incurred over the years due to security flaws in software have led researchers to find better ways of securing software. In light of the research done by many researchers, this thesis presents how software can be secured by considering security in the different phases of the software development life cycle. A number of security activities that are needed to build secure software have been identified, and it is shown how these security activities are related to the software development activities of the software development life cycle.

  • 30. Aibinu, A.M.
    et al.
    Iqbal, Muhammad Imran
    Nilsson, M.
    Salami, M.J.E.
    A New Method of Correcting Uneven Illumination Problem in Fundus Images2007Conference paper (Refereed)
    Abstract [en]

    Recent advancements in signal and image processing have reduced the time of diagnosis and the effort and pressure on screeners by providing automatic diagnostic tools for different diseases. The success rate of these tools depends greatly on the quality of the acquired images. Bad image quality can significantly reduce the specificity and the sensitivity, which in turn forces screeners back to their tedious job of manual diagnosis. In acquired fundus images, some areas appear brighter than others: areas close to the center of the image are always well illuminated and hence appear very bright, while areas far from the center are poorly illuminated and hence appear very dark. Several techniques, including simple thresholding, the Naka-Rushton (NR) filtering technique and the histogram equalization (HE) method, have been suggested by various researchers to overcome this problem. However, each of these methods has limitations of its own, hence the need to develop a more robust technique that provides better performance with greater flexibility. A new method of compensating uneven (irregular) illumination in fundus images, termed global-local adaptive histogram equalization using partially overlapped windows (GLAPOW), is proposed in this paper. The developed algorithm has been tested, and the results obtained show superior performance when compared to other known techniques for uneven illumination correction.
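    The local half of such a scheme can be illustrated with a minimal sketch: histogram-equalize partially overlapping windows and average the results where windows overlap. The window size, step, and the omission of the global term are assumptions made here for illustration; the actual GLAPOW method combines global and local equalization.

    ```python
    import numpy as np

    def local_hist_eq(img, win=32, step=16):
        """Histogram equalization over partially overlapped windows (illustrative sketch).

        img: 2-D uint8 image whose dimensions are multiples of `step`.
        Each window is equalized via its empirical CDF; overlapping results are averaged.
        """
        out = np.zeros(img.shape, dtype=float)
        weight = np.zeros(img.shape, dtype=float)
        for r in range(0, img.shape[0] - win + 1, step):
            for c in range(0, img.shape[1] - win + 1, step):
                block = img[r:r + win, c:c + win]
                hist = np.bincount(block.ravel(), minlength=256)
                cdf = np.cumsum(hist) / block.size     # P(X <= v) within the window
                out[r:r + win, c:c + win] += 255 * cdf[block]
                weight[r:r + win, c:c + win] += 1
        return (out / np.maximum(weight, 1)).astype(np.uint8)
    ```

    Averaging over the overlaps is what smooths the blocking artifacts that plain tile-wise equalization would produce at window borders.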

  • 31. Aibinu, A.M.
    et al.
    Iqbal, Muhammad Imran
    Nilsson, M.
    Salami, M.J.E.
    Automatic Diagnosis of Diabetic Retinopathy from Fundus Images Using Digital Signal and Image Processing Techniques2007Conference paper (Refereed)
    Abstract [en]

    Automatic diagnosis and display of diabetic retinopathy from images of the retina using digital signal and image processing techniques is presented in this paper. The acquired images undergo pre-processing to equalize the uneven illumination associated with acquired fundus images; this stage also removes noise present in the image. The segmentation stage clusters the image into two distinct classes, while the abnormality detection stage is used to distinguish between candidate lesions and other information. Methods for the diagnosis of red spots and bleeding and for the detection of vein-artery crossover points have also been developed in this work, using the color information, shape, size and object length-to-breadth ratio contained in the acquired digital fundus image. Furthermore, two graphical user interfaces (GUIs) have been developed during this work: the first is for the collection of lesion data and was used by the ophthalmologist to mark images for the database, while the second is for automatically diagnosing and displaying the result in a user-friendly manner. The algorithm was tested with a separate set of 25 fundus images. The results obtained for microaneurysm and haemorrhage diagnosis show the appropriateness of the method.

  • 32.
    Aida, Horaniet
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Isabel, Llorente
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Formation of High Resolution Images in SAR using GNSS2009Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    The aim of this thesis is to investigate the possibility of forming high-resolution Synthetic Aperture Radar (SAR) images using the Global Navigation Satellite Systems (GNSS) Galileo, GPS and Glonass. In particular, the thesis studies the GPS signal and evaluates its properties for the bistatic case. The report builds on the fact that Galileo and GPS are both positioning systems with similar characteristics; the difference is mainly that the Galileo system uses a larger number of satellites and a different modulation scheme to improve the efficiency of the system, resulting in better accuracy. On the topic of GNSS SAR, the report describes modes, resolution, geometry and algorithms. Space-Surface Bistatic Radar is also explained, with two particular cases: parallel and non-parallel paths.

  • 33.
    Ajayi, Taiwo Seun
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Mobile Satellite Communications: Channel Characterization and Simulation2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The channel characterization of mobile satellite communication, an important and fast-growing arm of wireless communication, plays an important role in transmitting information through a propagation medium from the transmitter to the receiver with a minimum bit error rate, taking into consideration the channel impairments of different geographical environments such as urban, suburban, rural and hilly. The information transmitted from satellite to mobile terminals suffers amplitude attenuation and phase variation, caused by multipath fading and signal shadowing effects of the environment. These channel impairments are commonly described by three fading phenomena, Rayleigh fading, Rician fading and log-normal fading, which characterize signal propagation in different environments. They are mixed in different proportions by different researchers to form models describing particular channels. In the thesis, a general overview of mobile satellite communication is given, including the classification of satellites by orbit, the channel impairments, and the advantages of mobile satellite communication over terrestrial systems. Some of the major existing statistical models used to describe different types of channels are examined, and the best of them, the Lutz model [6], is implemented. Simulation of the Lutz model, which describes all possible types of environments with two states representing non-shadowed (LOS) and shadowed (NLOS) conditions, shows that the BER is predominantly affected by the shadowing factor.
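    The two-state structure described above can be simulated with a short sketch: a Markov chain switches between a good (LOS, Rician) state and a bad (shadowed, Rayleigh times lognormal) state. The transition probabilities, Rice factor and shadowing parameters below are illustrative assumptions, not the values used in the thesis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def lutz_series(n, p_gb=0.01, p_bg=0.05, k_rice=10.0, sh_mu_db=-10.0, sh_sigma_db=3.0):
        """Amplitude series from a two-state (good/bad) land-mobile satellite channel.

        Good state: Rician fading (LOS component plus scatter, unit mean power).
        Bad state:  Rayleigh scatter weighted by lognormal shadowing (dB-normal).
        """
        # first-order Markov chain over the good/bad state
        state = np.empty(n, dtype=bool)   # True = bad (shadowed)
        s = False                         # start in the good (LOS) state
        for i in range(n):
            if rng.random() < (p_gb if not s else p_bg):
                s = not s
            state[i] = s
        scatter = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2 * (k_rice + 1))
        good = np.abs(np.sqrt(k_rice / (k_rice + 1)) + scatter)
        shadow = 10 ** (rng.normal(sh_mu_db, sh_sigma_db, n) / 20)
        bad = np.abs(scatter) * shadow    # shadowed: no LOS component
        return np.where(state, bad, good), state
    ```

    Feeding such an amplitude series into a modulated link is the usual way to show, as the thesis reports, that the BER is dominated by time spent in the shadowed state.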

  • 34.
    Akber, Raza
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Raza, Syed Aqeel
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Shafique, Usman
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Performance Evaluation of WiMAX2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Advancements in broadband and mobile communication have given many privileges to subscribers, for instance high-speed data connectivity and voice and video applications at economical rates with good quality of service. WiMAX is an eminent technology that provides broadband and IP connectivity in the "last mile" scenario. It offers both line-of-sight and non-line-of-sight wireless communication. WiMAX uses orthogonal frequency division multiple access with adaptive modulation at its physical layer, together with the concept of a cyclic prefix, which adds additional samples at the transmitter end. The signal is transmitted through the channel and received at the receiver end; the receiver then removes these additional samples in order to minimize inter-symbol interference, improve the bit error rate and reduce the power spectrum. In our research work, we investigated the physical-layer performance on the basis of bit error rate, signal-to-noise ratio, power spectral density and error probability. These parameters are discussed in two different models: the first is a simple OFDM communication model without the cyclic prefix, while the second model includes the cyclic prefix.
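    The cyclic-prefix mechanism described above can be sketched in a few lines; the FFT size and prefix length below are illustrative assumptions rather than the parameters used in the thesis.

    ```python
    import numpy as np

    def ofdm_tx(symbols, n_fft=64, cp_len=16):
        """IFFT each block of subcarrier symbols, then prepend its last cp_len samples."""
        blocks = np.fft.ifft(symbols.reshape(-1, n_fft), axis=1)
        return np.hstack([blocks[:, -cp_len:], blocks]).ravel()

    def ofdm_rx(signal, n_fft=64, cp_len=16):
        """Discard the cyclic prefix of each block and FFT back to subcarrier symbols."""
        blocks = signal.reshape(-1, n_fft + cp_len)[:, cp_len:]
        return np.fft.fft(blocks, axis=1).ravel()
    ```

    Because the prefix is a copy of the block's own tail, a channel whose delay spread is shorter than the prefix turns linear convolution into circular convolution, so each subcarrier can be equalized with a single complex division; this is how the cyclic prefix suppresses inter-symbol interference.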

  • 35.
    Akhtar, Jawad
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Virtual reality: Effective surroundings, Enormous demonstration and mediator system in the games, industrial design and manufacturing2008Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    In this thesis, the concept of virtual reality is elaborated in the context of games, industrial design and manufacturing. The main purpose of this master's thesis is to create a virtual environment for games that is close to reality and suited to human nature, through aspects like a better interface, simulation, lights, and shadow effects and their types. The importance of these aspects for a realistic virtual environment is demonstrated through a comparison between two environments, desktop and CAVE, using a flight simulation program.

  • 36.
    Alam, Md. Khorshed
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Linear Unequal Error Protection for Region of Interest Coded Images over Wireless Channels2005Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    In this thesis, an unequal error protection scheme for transmitting JPEG2000 images over a wireless channel is investigated. The rapid growth of wireless communication has resulted in a demand for robust transmission of compressed images over wireless networks. The challenge of robust transmission is to protect the compressed image data against the impairments of the radio channel in such a way as to maximize the received image quality. For highly compressed images, it is beneficial to prioritize regions of interest (ROI) for interpretability. The thesis addresses this problem, investigating unequal error protection for transmitting JPEG2000 compressed images. More particularly, the results reported in this thesis provide guidance concerning the implementation of a stronger error correction coding scheme (a Golay code) for the ROI and comparatively weaker coding (a Hamming code) for non-ROI image regions. Such unequal error protection can be utilized by the base station for transmitting JPEG2000-encoded images over next-generation wireless networks.
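    As an illustration of the "weaker" half of such a scheme: a (7,4) Hamming code corrects a single bit error per codeword, while the (23,12) Golay code typically used for the stronger tier corrects three. The sketch below uses the standard systematic (7,4) matrices and is not the thesis' implementation.

    ```python
    import numpy as np

    # Systematic generator G = [I | P] and parity-check H = [P^T | I] for Hamming(7,4).
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(nibble):
        """4 data bits -> 7-bit codeword."""
        return (nibble @ G) % 2

    def decode(word):
        """Correct up to one flipped bit, then return the 4 data bits."""
        syndrome = (H @ word) % 2
        if syndrome.any():
            # the syndrome equals the column of H at the error position
            err = int(np.where((H.T == syndrome).all(axis=1))[0][0])
            word = word.copy()
            word[err] ^= 1
        return word[:4]
    ```

    In an unequal protection scheme, ROI packets would pass through the stronger code and background packets through this one, trading redundancy for protection where the image content matters most.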

  • 37.
    Allblom, Viktor
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Evaluating Agent Strategies for the TAC Supply Chain Management Competition2004Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The TAC Supply Chain Management game was designed to capture many of the challenges involved in dynamic supply chain practices. To evaluate the game, I created four different agents that operate according to simple but very different strategies. In addition, an advanced agent was created to see whether the game was advanced enough not to be dominated by simple strategies. While the game is advanced enough to resist simple strategies, it is so simplified that it will never help solve any real-world problems unless it is expanded to include more factors and problems of supply chain management.

  • 38.
    Andersson, Adam
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Watti, Alan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Cupsystemet: En kvalitativ fallstudie av en mobil webbtjänst2004Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [sv]

    The thesis addresses the following main question: will the administrative labour during football tournaments be made easier with the help of a mobile web service, or will it lead to unnecessary extra work? To be able to give a good and qualified answer to this question, the thesis begins by describing what the technology for such a system might look like and the existing administrative tasks during a football tournament, to give the reader a deeper understanding for further reading. The thesis then describes a concrete system which was tested at three football tournaments. On the basis of the tests and interviews, and through analysis of these, the thesis answers our questions. The results presented in the thesis emerged through testing of the system at three chosen football tournaments. The differences in working methods before and after the system were analyzed, and through interviews the value of these changes was assessed. The results achieved in the thesis are as follows: the presented mobile web service not only decreases the total work effort made by the tournament officials, it also speeds up their work. From this we can conclude that the administrative work which occurs during a football tournament is made easier and does not lead to any unnecessary extra work.

  • 39.
    Andersson, Björn
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Persson, Marie
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Software Reliability Prediction – An Evaluation of a Novel Technique2004Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Along with continuously increasing computerization, our expectations of software and hardware reliability increase considerably. Software reliability has therefore become one of the most important software quality attributes. Software reliability modeling based on test data is done to estimate whether the current reliability level meets the requirements for the product. Software reliability modeling also provides possibilities to predict reliability. The costs of software development and testing, together with profit issues in relation to software reliability, are among the main motivations for software reliability prediction. Software reliability prediction currently uses different models for this purpose; parameters have to be set in order to tune a model to fit the test data. A slightly different prediction model, Time Invariance Estimation (TIE), has been developed to challenge the models used today. An experiment was set up to investigate whether TIE could be found useful in a software reliability prediction context. The experiment is based on a comparison between the ordinary reliability prediction models and TIE.
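    TIE itself is not specified in the abstract. As a hedged illustration of the "ordinary" parametric approach it is compared against, the classic Goel-Okumoto model fits a mean-value function mu(t) = a(1 - exp(-b t)) to cumulative failure counts; the grid-search fit and synthetic data below are assumptions for illustration only.

    ```python
    import numpy as np

    def fit_goel_okumoto(t, y, b_grid=None):
        """Least-squares fit of mu(t) = a * (1 - exp(-b t)) to cumulative failures y.

        For each candidate detection rate b, the scale a has a closed-form optimum,
        so a one-dimensional grid search over b suffices for this sketch.
        """
        if b_grid is None:
            b_grid = np.linspace(1e-3, 1.0, 1000)
        best = (np.inf, 0.0, 0.0)
        for b in b_grid:
            f = 1.0 - np.exp(-b * t)
            a = (y @ f) / (f @ f)          # closed-form optimal a for this b
            sse = np.sum((y - a * f) ** 2)
            if sse < best[0]:
                best = (sse, a, b)
        return best[1], best[2]            # a: total expected failures, b: detection rate
    ```

    The fitted model predicts the remaining failures after test time T as a - a(1 - exp(-b T)); comparing such extrapolations against later test data is the usual way prediction models of this kind are evaluated.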

  • 40.
    Andersson, Fredrik
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Hagström, Stefan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Dynamic identities for flexible access control2005Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This thesis analyses the pros and cons of a module-based approach versus the currently existing certificate schemes, together with the proposed requirements for a module-based certificate scheme to serve as a plausible identity verification system. We present a possible model and evaluate it with respect to the existing solutions and our set of identified requirements.

  • 41. Andrén, Linus
    Active suppression of vibration and noise in industrial applications2004Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Today, active control technology is about to emerge from the research labs into products in various areas. It has become an attractive method where passive techniques have little effect, namely at low frequencies, and adding active control to that part of the spectrum is often an attractive solution. The active control technique has been enabled by the rapid development of digital signal processors over the last decades. The focal point of this thesis is active vibration and noise suppression. Two different industrial applications have been subjected to active control to reduce unwanted disturbances. In cutting operations, active vibration suppression has been applied to both external turning and boring operations with successful results. Turning operations, and in particular boring operations, are typical examples of chatter-prone machining. In order to implement active vibration control in boring operations, a thorough investigation of the boring process is made in the first two parts of this thesis. The following two parts of the thesis treat active vibration suppression in external turning operations and in boring operations. The second industrial application concerns the noise in a fork-lift truck; in the final part of the thesis, active noise suppression has been implemented in the cabin of a fork-lift truck.

  • 42. Andrén, Linus
    et al.
    Håkansson, Lars
    Active Vibration Control of Boring Bar Vibrations2004Report (Other academic)
    Abstract [en]

    The boring operation is a cumbersome manufacturing process plagued by noise- and vibration-related problems. A deep internal boring operation in a workpiece is a classic example of chatter-prone machining. The manufacturing industry today faces tougher tolerances on product surfaces and a desire to process hard-to-cut materials; vibrations must thus be kept to a minimum. An increase in productivity is also interesting from a manufacturing point of view. Penetrating deep and narrow cavities requires a long and slender boring bar. As a result, the boring bar is inclined to vibrate due to its limited dynamic stiffness. Vibration affects the surface finish, leads to severe noise in the workshop and may also reduce tool life. This report presents an active control solution based on a standard boring bar with an embedded piezoceramic actuator, placed in the area of peak modal strain energy of the boring bar bending mode to be controlled. An accelerometer is also included in the design, mounted as close as possible to the cutting tool. Embedding the electronic parts not only protects them from the harsh environment in a lathe but also enables the design to be used on a general lathe as long as the mounting arrangements are relatively similar. Three different algorithms have been tested in the control system. Since the excitation source of the original vibrations, i.e. the chip formation process, cannot be observed directly, the algorithms must be constructed on the basis of a feedback approach. Experimental results from boring operations show that the vibration level can be reduced by 40 dB at the resonance frequency of a fundamental boring bar bending mode; several of its harmonics can also be reduced significantly.
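    The report's feedback algorithms are not detailed in the abstract. As a hedged sketch of the adaptive-filtering machinery such controllers build on, the LMS update below adapts filter weights to cancel a measured signal; the tap count, step size and test signals are illustrative assumptions, and a real vibration controller would use a feedback or filtered-x variant that accounts for the actuator path.

    ```python
    import numpy as np

    def lms(reference, desired, n_taps=8, mu=0.01):
        """Basic LMS adaptive filter: drive the error desired[i] - w @ x[i] toward zero."""
        w = np.zeros(n_taps)
        buf = np.zeros(n_taps)                  # most recent reference samples
        error = np.empty(desired.size)
        for i in range(desired.size):
            buf = np.roll(buf, 1)
            buf[0] = reference[i]
            error[i] = desired[i] - w @ buf     # cancellation residual
            w += mu * error[i] * buf            # stochastic-gradient weight update
        return error, w
    ```

    In an active vibration controller the "desired" signal is the accelerometer measurement near the tool tip and the filter output drives the embedded actuator, so minimizing the error directly minimizes the vibration at the sensor.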

  • 43. Andrén, Linus
    et al.
    Håkansson, Lars
    Brandt, Anders
    Claesson, Ingvar
    Identification of Dynamic Properties of Boring Bar Vibrations in a Continuous Boring Operation2004In: Mechanical systems and signal processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 18, no 4, 869-901 p.Article in journal (Refereed)
    Abstract [en]

    Vibrations in internal turning operations are usually a cumbersome part of the manufacturing process. This article focuses on boring bar vibrations. Boring bar vibrations in alloyed steel, stainless steel and cast iron have been measured in both the cutting speed direction and the cutting depth direction with the aid of accelerometers. The dynamic response of a boring bar seems to be a time-varying process that exhibits non-linear behaviour. The process is influenced by non-stationary parameters that are not under the control of the operator or experimenter. The vibrations are clearly dominated by the first resonance frequency in one of the two directions of the boring bar. The problem of force modulation in rotating machinery, which appears as sideband terms in the spectrum, is also addressed. Furthermore, the resonance frequencies of the boring bar are correlated to an Euler-Bernoulli beam model.
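    For reference, the clamped-free Euler-Bernoulli model used for such a correlation gives resonance frequencies in closed form, f_n = (lambda_n^2 / (2 pi L^2)) * sqrt(EI / (rho A)). The sketch below evaluates this; the material and geometry values in the usage note are illustrative, not the article's.

    ```python
    import numpy as np

    def cantilever_frequencies(E, rho, d, L, n_modes=2):
        """Resonance frequencies (Hz) of a clamped-free Euler-Bernoulli beam
        with a solid circular cross-section of diameter d and free length L."""
        lam = np.array([1.8751, 4.6941, 7.8548])[:n_modes]  # roots of cos(l)*cosh(l) = -1
        I = np.pi * d**4 / 64    # second moment of area
        A = np.pi * d**2 / 4     # cross-sectional area
        return lam**2 / (2 * np.pi * L**2) * np.sqrt(E * I / (rho * A))
    ```

    For a steel bar (E of roughly 205 GPa, density 7850 kg/m^3) with a 40 mm diameter and a 200 mm overhang, the first bending resonance lands in the few-hundred-hertz range, consistent with the low-frequency dominance reported above.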

  • 44. Andrén, Linus
    et al.
    Håkansson, Lars
    Brandt, Anders
    Claesson, Ingvar
    Identification of Motion of Cutting Tool Vibration in a Continuous Boring Operation: Correlation to Structural Properties, 2004. In: Mechanical Systems and Signal Processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 18, no. 4, pp. 903-927. Article in journal (Refereed)
    Abstract [en]

    The internal turning operation has a history of being a cumbersome metal working process, as vibration in boring operations is usually inevitable. In this article, the deflection shapes and/or mode shapes as well as the resonance frequencies of a boring bar are put under scrutiny. Three methods have been used to investigate the dynamic properties of a clamped boring bar: a theoretical Euler-Bernoulli beam model, an experimental modal analysis and an operating deflection shape analysis. The results indicate a correlation between the deflection shapes and/or mode shapes produced by the three different analysis methods. On the other hand, the orientation of the forced deflection shapes and/or mode shapes and the resonance frequencies demonstrate differences between the three methods. It is demonstrated that, during continuous cutting, the bending motion at the first two resonance frequencies is to a large extent in the cutting speed direction.

  • 45. Andrén, Linus
    et al.
    Håkansson, Lars
    Claesson, Ingvar
    Actuator placements and Variations in the Control Path estimates in the Active Control of Boring Bar Vibrations, 2004. Conference paper (Refereed)
    Abstract [en]

    A classical example of chatter-prone machining is the boring operation. Turning under conditions of high vibration in the cutting tool deteriorates the surface finish and may cause tool breakage. Severe noise is also a consequence of high vibration levels in the boring bar. Active control is one possible solution to the noise and vibration problem in boring operations. In boring operations the boring bar usually has vibration components in both the cutting speed and the cutting depth direction. The introduction of the control force at different angles between the cutting speed and cutting depth directions has been investigated. Furthermore, control path estimates produced when the active boring bar was not in contact with the workpiece are compared with those produced during a continuous cutting operation. Experimental results indicate that the control force should be introduced in the cutting speed direction. Although the vibrations are controlled only in the cutting speed direction, the vibrations in the cutting depth direction are also reduced significantly.

  • 46. Andrén, Linus
    et al.
    Johansson, Sven
    Winberg, Mathias
    Claesson, Ingvar
    Active Noise Control Experiments in a Fork-lift Truck Cabin, 2004. Conference paper (Refereed)
    Abstract [en]

    High driver comfort in working vehicles is both an important feature and a demand from drivers. A low noise level is an essential factor for the manufacturer in maintaining a high standard of vehicle comfort. In many cases the noise inside the cabin can be related to the engine orders. Hydraulic pumps and fans are also related to the engine, but not necessarily at integer multiples of the engine order. Passive absorbers are not suitable for the lowest frequencies, and one approach is to use an active noise control system to solve the noise problem at low frequencies. In the present experiment, loudspeakers were mounted inside the cabin of a fork-lift truck to produce the secondary noise field. To sense the residual noise, microphones were installed close to the driver's head, the aim being to create a zone of reduced noise around the head. Since a large portion of the noise inside the cabin can be related to the engine, an active control system based on a feedforward solution is possible. Experimental results from a feedforward active noise control solution in a fork-lift truck cabin show that the noise level in the low-frequency region can be reduced significantly.
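    Feedforward controllers of the kind described above are commonly implemented with the filtered-x LMS (FxLMS) algorithm. The sketch below is a generic single-channel FxLMS on synthetic signals, not the paper's system; the secondary path, its estimate, the engine-order tone and all parameters are invented for illustration:

```python
import math

def fxlms(x, d, s, s_hat, taps=8, mu=0.005):
    """Single-channel FxLMS: adapts an FIR controller w so that the
    reference x, filtered through w and the secondary path s,
    cancels the disturbance d at the error microphone."""
    w = [0.0] * taps
    xbuf = [0.0] * taps          # reference history for the controller
    ybuf = [0.0] * len(s)        # controller-output history for the secondary path
    fbuf = [0.0] * len(s_hat)    # reference history for the path estimate
    xfbuf = [0.0] * taps         # filtered-reference history for the update
    errors = []
    for n in range(len(x)):
        xbuf = [x[n]] + xbuf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, xbuf))
        ybuf = [y] + ybuf[:-1]
        anti = sum(si * yi for si, yi in zip(s, ybuf))    # loudspeaker via cabin
        e = d[n] + anti                                   # residual at microphone
        fbuf = [x[n]] + fbuf[:-1]
        xf = sum(si * xi for si, xi in zip(s_hat, fbuf))  # filtered reference
        xfbuf = [xf] + xfbuf[:-1]
        w = [wi - mu * e * xfi for wi, xfi in zip(w, xfbuf)]
        errors.append(e)
    return errors

# hypothetical engine-order tone and primary noise at the driver's ear
omega = 2 * math.pi * 0.1
N = 8000
x = [math.cos(omega * n) for n in range(N)]
d = [0.9 * math.cos(omega * n - 0.3) for n in range(N)]
e = fxlms(x, d, s=[0.0, 0.5], s_hat=[0.0, 0.5])
```

    Here the secondary-path estimate is taken as exact; in practice it is identified beforehand, and its accuracy limits the achievable attenuation.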

  • 47.
    Angulo, Julio
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    The Emotional Driver: A Study of the Driving Experience and the Road Context, 2007. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    In modern societies, driving has become an almost essential routine. Vehicles are considered by many to be indispensable tools for accomplishing daily tasks, and they are the main form of transportation for millions of people. The average driver voluntarily spends considerable amounts of time on the road, using the vehicle even for short distances in the knowledge that it offers some form of comfort and convenience. Drivers frequently describe their road experience as tiring and fastidious, yet their persistence in using their vehicles at every opportunity suggests a pleasurable experience. So far, car manufacturers, traffic authorities and designers of technology have been mainly concerned with aspects of the road that ensure driver safety, increase engine power, provide more comfort and maintain better streets; the actual feelings of the driver while travelling through the streets have not yet been taken into great account by the developers of the road environment. For this reason, this thesis tries to create awareness of the existence and constant presence of people's emotions as they drive, which have the power to influence their actions on the road and their driving patterns. In order to capture a driver's emotional experience, this study uses three main methods. The first is Cultural Probes, consisting of common objects, specifically postcards, pictures and web-logs, used to probe unknown factors about the users. The second is ethnographic study of the driving activity through observations, the popular talk-aloud protocol and the shadowing method. Finally, the Experience Sampling Method is used, which captures the experience of an individual as it unfolds in its natural context.
    With the combined use of these three methods, some of the main factors of the road environment that commonly influence the driver's emotions in negative or positive ways were identified. These include the intensity and type of light, the different types and sources of sound, the perceivable landscapes and surrounding architecture, and the different kinds of continuously occurring interactions. These are just some of the many factors that can influence emotions on the road, and hopefully this study will spark curiosity for a deeper study of these and other aspects of the emotional driving experience.

  • 48.
    Antonsson, Roger
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Petterson, Lena
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Being at one with the tool: applying flow to usability, 2005. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Communication between people has become more and more important in today's society, and so has the way we communicate. Our work, on which this master's thesis is based, has been to evaluate and redesign an existing web application that serves as a communication tool. To carry out this work we formulated two questions: how to facilitate interaction in an application that is used as a tool, focusing on interface design, usability and flow; and how usability can be improved in a system with the help of flow theory. To address these two questions we have used a number of methods, each influencing our work in different ways. The one with the greatest impact on the evaluation has been the cognitive walkthrough. For the design we have used literature studies along with the results of the evaluation. A difficulty during our work has been that no specific user group was defined; the design should work for a generic group of users. The problem has not been to define the target group, but rather to suit the interface to everybody. This has been the challenge of the semester, and we found designing an interface that infuses usability with the help of flow interesting.

  • 49.
    Antonsson, Roger
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Petterson, Lena
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Think big: for small - infusing confidence, security and trustworthiness for mobile services, 2004. Independent thesis, Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    The use of mobile telephony has increased over the past years, and consequently so has the development of services for the mobile phone. This semester we have taken part in a large system-development project; our contribution has been designing the graphical user interfaces. In doing so, we found the problem of how to mediate trust to a user through a graphical user interface interesting. In this thesis we focus on how to develop graphical user interfaces for a mobile phone service that radiate and infuse confidence, security and trustworthiness. To attain this purpose, we have combined literature studies with, to some extent, user involvement through mock-ups and a think-aloud technique. We also describe the importance of taking both usability and usability goals as well as the needs of the end users into consideration. We have found that more research is needed on how to radiate and infuse trust through a graphical user interface. This thesis concludes with some aspects of that subject that we consider important to keep in mind. It is of great importance never to leave the user in a state of uncertainty, and therefore clear, sincere and informative feedback is necessary throughout the service. Also central in designing graphical user interfaces is to make sure that there is no mismatch between the security of the system and the security it radiates.

  • 50.
    Appana, Dileep Kumar
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Kumar, Chinni Anil
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Nagappan, Nagappan Palaniappan
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    Channel Estimation in GPRS based Communication System using Bayesian Demodulation, 2009. Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    With the increased use of portable devices such as personal digital assistants (PDAs), laptops, voice- and data-integrated cell phones and many more, there is a need for wireless communication methods that use air as the medium to transmit and receive information between terminals. Radio waves propagate from the transmitting antenna and travel through free space, undergoing reflection, diffraction and scattering. They are greatly affected by the ground terrain, the atmosphere and objects in their path such as buildings, bridges and hills. Nowadays, the existence of a direct line-of-sight path between the transmitter and the receiver is unlikely, and propagation is mainly due to reflection and scattering from buildings and to diffraction. In practice, the transmitted signal therefore arrives at the receiver via several paths with different time delays, creating a multipath situation at the receiver. These multipath waves, with randomly distributed amplitudes and phases, combine to give a resultant signal that fluctuates in time and space; this phenomenon of random fluctuation in the received signal level is termed fading. Existing demodulation techniques such as FM and AM determine the signal from the received signal based on the mean-distance method, which cannot provide the desired bit error rate (BER) and fails to estimate properly under heavy fading and a large Doppler shift. This project implements an enhancement to the demodulation technique using a Bayesian approach for a physical-layer simulation of a General Packet Radio Service (GPRS) system, considering variable Rician fading and a variable Doppler shift for an AWGN channel. System performance is evaluated in terms of bit error rate (BER) and signal-to-noise ratio (SNR) for the realized GPRS system.
    Matlab was used for the implementation and analysis of the proposed system, with functional verification in terms of BER and SNR. We have shown the comparative difference between the theoretical BER of a QPSK signal and the values obtained by our program; the values differ by up to 0.4 dB for a 1000-bit random vector. Moreover, we also compared with a QAM demodulation technique in MATLAB code, which shows a difference of up to 1.4 dB for a 1000-bit vector. These results signify better performance of the system, as it saves bandwidth.
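    The theoretical QPSK reference against which such simulations are compared is the standard Gray-coded QPSK result for an AWGN channel, Pb = Q(sqrt(2·Eb/N0)) = ½·erfc(sqrt(Eb/N0)). The sketch below computes this textbook curve; it is not the thesis's Matlab code:

```python
import math

def qpsk_ber_awgn(ebn0_db):
    """Theoretical bit-error rate of Gray-coded QPSK over AWGN:
    Pb = Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)        # dB -> linear Eb/N0
    return 0.5 * math.erfc(math.sqrt(ebn0))

for snr in (0, 4, 8):
    print(f"Eb/N0 = {snr} dB -> BER = {qpsk_ber_awgn(snr):.2e}")
```

    Under Rician fading and Doppler shift the actual BER is higher than this AWGN bound, which is the degradation the Bayesian demodulator is intended to mitigate.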
