  • 451.
    Hagelbäck, Johan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Johansson, Stefan J.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    The Rise of Potential Fields in Real Time Strategy Bots, 2008. Conference paper (Refereed)
    Abstract [en]

    Bots for Real Time Strategy (RTS) games are challenging to implement. A bot controls a number of units that may have to navigate in a partially unknown environment, while at the same time searching for enemies and coordinating attacks to fight them down. Potential fields is a technique originating from the area of robotics, where it is used to control the navigation of robots in dynamic environments. We show that the use of potential fields for implementing a bot for a real time strategy game gives us a very competitive, configurable, and non-conventional solution.
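The navigation idea described in the abstract can be sketched as a minimal attractive/repulsive potential field with greedy descent on a grid. This is a generic illustration, not the paper's bot: the gain constants (k_att, k_rep), the repulsion radius d0, the grid step, and the obstacle layout are all made-up assumptions.

```python
import math

def potential(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0):
    """Total potential at pos: attractive toward the goal, repulsive near obstacles."""
    u = 0.5 * k_att * math.dist(pos, goal) ** 2              # attractive term
    for ob in obstacles:
        d = math.dist(pos, ob)
        if d < d0:                                           # repulsion only inside radius d0
            u += 0.5 * k_rep * (1.0 / max(d, 1e-9) - 1.0 / d0) ** 2
    return u

def step(pos, goal, obstacles, h=0.5):
    """Greedy descent: move to the neighbouring grid point with the lowest potential."""
    candidates = [(pos[0] + dx, pos[1] + dy)
                  for dx in (-h, 0.0, h) for dy in (-h, 0.0, h)]
    return min(candidates, key=lambda c: potential(c, goal, obstacles))

pos, goal = (0.0, 0.0), (10.0, 0.0)
obstacles = [(5.0, 2.0)]              # one static obstacle off the direct path
for _ in range(80):
    pos = step(pos, goal, obstacles)
print(round(math.dist(pos, goal), 1))
```

The unit slides around the obstacle rather than through it because the repulsive term dominates near it; the well-known weakness of the plain scheme, getting stuck in local minima, is exactly what multi-agent variants of the technique try to mitigate.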

  • 452.
    Hagelbäck, Johan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Johansson, Stefan J.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Using Multi-agent Potential Fields in Real-time Strategy, 2008. Conference paper (Refereed)
    Abstract [en]

    Bots for Real Time Strategy (RTS) games provide a rich implementation challenge. A bot controls a number of units that may have to navigate in a partially unknown environment, while at the same time searching for enemies and coordinating attacks to fight them down. Potential fields is a technique originating from the area of robotics, where it is used to control the navigation of robots in dynamic environments. Although attempts have been made to transfer the technology to the gaming sector, assumed problems with efficiency and high implementation costs have made the industry reluctant to adopt it. We present a Multi-agent Potential Field based bot architecture that is evaluated in a real time strategy game setting and compared with other state-of-the-art solutions, both in terms of performance and in terms of softer attributes such as configurability. Although our solution did not reach the performance standards of traditional RTS bots in the test, we see great unexploited benefits in using multi-agent potential field based solutions in RTS games.

  • 453.
    HAIDER, KAMRAN
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Peak to Average Ratio Reduction in Wireless OFDM Communication Systems, 2006. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Future mobile communication systems reaching for ever increasing data rates require higher bandwidths than those typically used in today's cellular systems. At higher bandwidths, the radio channel, which is flat fading at low bandwidths, becomes frequency selective and time dispersive. Due to its inherent robustness against time dispersion, Orthogonal Frequency Division Multiplex (OFDM) is an attractive candidate for such future mobile communication systems. OFDM partitions the available bandwidth into many subchannels with much lower bandwidth. Due to its low bandwidth, such a narrowband subchannel experiences almost flat fading, which, in addition to the above-mentioned robustness, also leads to simple implementations. However, one potential drawback with OFDM modulation is the high Peak to Average Ratio (PAR) of the transmitted signal: the signal transmitted by the OFDM system is the superposition of all signals transmitted in the narrowband subchannels. By the central limit theorem, the transmit signal then has a Gaussian distribution, leading to high peak values compared to the average power. A system design not taking this into account will have a high clip rate: each signal sample that is beyond the saturation limit of the power amplifier suffers either clipping to this limit value or other non-linear distortion, both creating additional bit errors in the receiver. One possibility to avoid clipping is to design the system for very high signal peaks. However, this approach leads to very high power consumption (since the power amplifier must have high supply rails) and also complex power amplifiers. The preferred solution is therefore to apply digital signal processing that reduces such high peak values in the transmitted signal, thus avoiding clipping. These methods are commonly referred to as PAR reduction.
    PAR reduction methods can be categorized into transparent methods, where the receiver is not aware of the reduction scheme applied by the transmitter, and non-transparent methods, where the receiver needs to know the PAR algorithm applied by the transmitter. This master's thesis focuses on transparent PAR reduction algorithms. The performance of the PAR reduction methods is analysed both with and without the PSD constraint. The effect of error power on data tones due to clipping is also investigated in this report.
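The PAR statistic that the thesis targets falls straight out of the OFDM symbol definition: an inverse DFT of the subcarrier symbols, then the ratio of peak to mean sample power. This standard-library toy (N=64 random QPSK subcarriers, values chosen for illustration only) computes the definition; it performs no PAR reduction.

```python
import cmath
import math
import random

random.seed(1)
N = 64                                    # number of OFDM subcarriers
# random QPSK symbols, one per subcarrier
X = [complex(random.choice((-1, 1)), random.choice((-1, 1))) for _ in range(N)]

# inverse DFT: the time-domain OFDM symbol is the superposition of all subcarriers
x = [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
     for n in range(N)]

powers = [abs(s) ** 2 for s in x]
par = max(powers) / (sum(powers) / N)     # peak-to-average power ratio
print(round(10 * math.log10(par), 2), "dB")
```

Because the time-domain samples are sums of many independent subcarrier contributions, the PAR for a random symbol typically lands several dB above 0 dB, which is the clipping problem the abstract describes.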

  • 454. Haider, Kamran
    et al.
    Mohammed, Abbas
    Baldmair, Robert
    Performance Evaluation of the Active Set Algorithm for Peak to Average Ratio Reduction in Wireless OFDM Communication Systems, 2006. Conference paper (Refereed)
    Abstract [en]

    For high speed data transmission over multipath fading channels, Orthogonal Frequency Division Multiplexing (OFDM) can be a spectrally efficient multicarrier modulation technique. All OFDM systems suffer from a large peak-to-average power ratio (PAR), which can lead to power inefficiency and nonlinear distortion in the RF portion of the transmitter. The tone reservation method uses reserved tones to generate a peak-canceling signal to lower the PAR of a transmitted OFDM block. This paper investigates the recently proposed active-set tone reservation method, for complex-baseband signals, to reduce the PAR in wireless and broadcast systems. When PAR reduction is applied to an OFDM symbol and its PAR still exceeds the target after reduction, the symbol suffers clipping. In the frequency domain, clipping energy spreads across all tones and acts as error power. Simulations are performed to investigate the error power on data tones after the active-set method.

  • 455.
    Haider, Muhammad Abdur Rahman
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Bhatti, Abu Bakar
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Kirmani, Ammar Ahmad
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Radio Resource Management In 3G UMTS Networks, 2007. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Universal Mobile Telecommunication System (UMTS) is a third generation mobile communication system, designed to support a wide range of applications with different quality of service (QoS) profiles. This 3G system is capable of transporting wideband and high bit rate multimedia services along with traditional cellular telephony services, e.g. voice and messaging. To provide these services with better quality of service and to enhance the performance of the wireless network, management of radio resources is necessary. To do this, UMTS offers many radio resource management (RRM) strategies. These RRM techniques play an important role in providing different services with better quality, keeping the end user happy and making the network stable. The main objective of our thesis is to explore some RRM strategies and understand their practical importance by simulating some RRM algorithms. We start with a UMTS overview and some important concepts of the UMTS architecture. We then go deep into the physical layer of UMTS. After building a solid understanding of the UMTS radio architecture and procedures, we work through different RRM techniques, and in the end we analyse two power control algorithms to gain practical experience of actual RRM strategies, because power control is the most important and critical part of RRM due to the interference limited nature of CDMA systems.
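One classic instance of the power control the abstract emphasizes is the distributed SIR-balancing (Foschini-Miljanic) iteration: each link repeatedly scales its transmit power by the ratio of its target SIR to its achieved SIR. The sketch below is generic, not necessarily one of the two algorithms the thesis analyses, and the path gains, noise level and target are invented numbers.

```python
# Two links; g[i][j] is the path gain from transmitter j to receiver i.
g = [[1.0, 0.1],
     [0.2, 1.0]]
noise = 0.01
target = 2.0          # common target SIR for both links
p = [1.0, 1.0]        # initial transmit powers

def sir(i, p):
    """Signal-to-interference ratio at receiver i."""
    interference = sum(g[i][j] * p[j] for j in range(len(p)) if j != i)
    return g[i][i] * p[i] / (interference + noise)

# Foschini-Miljanic update: p_i <- p_i * target / SIR_i
for _ in range(50):
    p = [p[i] * target / sir(i, p) for i in range(len(p))]

print([round(sir(i, p), 3) for i in range(len(p))])
```

When the target is feasible, as here, the iteration converges to the minimal power vector meeting both SIR targets, which is why this update (and its variants) is a standard building block of CDMA power control.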

  • 456.
    Hameed, Faysal
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Ejaz, Mohammad
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Model for conflict resolution in aspects within Aspect Oriented Requirement engineering, 2008. Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Requirement engineering is the most important phase within the software development phases, since it extracts from the customers the requirements that the subsequent phases use for designing and implementing the system. Because of its importance, this thesis focuses on aspect oriented requirement engineering, the first phase in aspect oriented software development, used for the identification and representation of requirements gathered in the form of concerns. Besides an overall explanation of the aspect oriented requirement engineering phase, detailed attention is given to a specific activity within the AORE phase called conflict resolution. Several techniques proposed for conflict resolution between aspects are discussed, along with an attempt to give a new idea in the form of an extension of an already proposed model for conflict resolution. The need for the extension is justified by a case study applied to both models, i.e. the original model and the extended model, to compare the results.

  • 457.
    Hameed, Muhammad Muzaffar
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Haq, Muhammad Zeeshan ul
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    DefectoFix: An interactive defect fix logging tool, 2008. Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Despite the large efforts made during the development phase to produce a fault-free system, most software implementations still require testing of the entire system. The main problem in software testing is automation that could verify the system without manual intervention. Recent work in software testing is related to automated fault injection using fault models from a repository. This requires a lot of effort, which adds to the complexity of the system. To solve this issue, this thesis suggests the DefectoFix framework. DefectoFix is an interactive defect fix logging tool that contains five components, namely Version Control System (VCS), source code files, a differencing algorithm, Defect Fix Model (DFM) creation, and additional information (project name, class name, file name, revision number, diff model). The proposed differencing algorithm extracts detailed information by detecting differences in source code files. This algorithm performs comparison at sub-tree levels of source code files. The extracted differences, with additional information, are stored as a DFM in a repository. DFMs can later be used for the automated fault injection process. The validation of the DefectoFix framework is performed by a tool developed in the Ruby programming language. Our case study confirms that the proposed framework generates correct DFMs and is useful in automated fault injection and software validation activities.

  • 458.
    Hamid, Mohamed
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    DYNAMIC SPECTRUM ACCESS IN COGNITIVE RADIO NETWORKS: ASPECTS OF MAC LAYER SENSING, 2008. Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Over the past two decades wireless communication systems have shown great revolution and rapid growth. Therefore, the standardization agencies, together with wireless researchers and industry, have been working on new specifications and standards to meet the high demand for wireless communication systems. One of the most critical issues that regulation agencies and researchers are considering is how to manage the available electromagnetic radio spectrum in a way that satisfies the needs of these growing wireless systems both economically and technically, especially with the recent crowding of the available spectrum. Hence, building cognitive radio systems that support dynamic access to the available spectrum has recently appeared as a novel solution to the huge expansion of wireless systems. In this thesis we investigate MAC layer sensing schemes in cognitive radio networks, where both reactive and proactive sensing are considered. Within proactive sensing, adapted and non-adapted sensing period schemes are also assessed. The assessment of the aforementioned sensing schemes has been carried out via two performance metrics: the achieved spectrum utilization factor and the idle channel search delay. The simulation results show that proactive sensing with adapted periods achieves the best performance, but with observable overhead computational tasks for the network nodes, which reflects the complexity required of them. On the other hand, reactive sensing is the simplest sensing scheme, with the worst achieved performance.
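The idle channel search delay metric can be made concrete with a toy Monte Carlo model of the reactive/proactive trade-off. Everything here is an illustrative assumption rather than the thesis's simulator: the idle probability, the sensing time, and the "list of recently pre-sensed channels" abstraction of proactive sensing.

```python
import random

random.seed(7)
P_IDLE = 0.3        # probability that a sensed channel turns out to be idle
T_SENSE = 1.0       # time needed to sense one channel

def reactive_delay():
    """Reactive: sensing starts only when a channel is needed, one channel at a time."""
    d = 0.0
    while True:
        d += T_SENSE
        if random.random() < P_IDLE:
            return d

def proactive_delay(n_known=4):
    """Proactive: n_known channels were pre-sensed in the background; if one of
    them is idle, access is immediate, otherwise fall back to on-demand sensing."""
    if any(random.random() < P_IDLE for _ in range(n_known)):
        return 0.0
    return reactive_delay()

trials = 20000
r = sum(reactive_delay() for _ in range(trials)) / trials
p = sum(proactive_delay() for _ in range(trials)) / trials
print(round(r, 2), round(p, 2))
```

Even this crude model reproduces the thesis's qualitative finding: the proactive scheme cuts the search delay sharply, at the cost of the background sensing work hidden inside `proactive_delay`.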

  • 459. Hamid, Mohamed
    et al.
    Mohammed, Abbas
    MAC Layer Sensing Schemes in Cognitive Radio Networks, 2009. Conference paper (Refereed)
    Abstract [en]

    In this paper we investigate MAC layer sensing schemes in cognitive radio. Reactive and proactive sensing are both considered and assessed via the idle channel search delay obtained in each case. The simulation results show that with proactive sensing we achieve a lower idle channel search delay than with reactive sensing, but on the other hand we spend more resources on sensing than in the reactive scheme. Thus, in proactive sensing there are observable computational tasks to be done by the network nodes.

  • 460. Hansson, Christina
    et al.
    Dittrich, Yvonne
    Randall, Dave
    Agile Processes Enhancing User Participation for Small Providers of Off-the-Shelf Software, 2004. Conference paper (Refereed)
    Abstract [en]

    To survive in today's competitive software market, software developers must maintain contact with their customers and users and adopt a flexible organization which allows response to feedback and the changing requirements from the use-context. This also requires a software development process that enables change proposals and error reports to be acted upon quickly. The present article uses a case study of a flexible development practice, which so far has proved to be sustainable and successful, to reconsider user involvement and software development practices of small software providers from an agile perspective. Implementing an agile process may allow for competitive flexibility without necessarily jeopardizing quality.

  • 461. Hansson, Christina
    et al.
    Dittrich, Yvonne
    Randall, Dave
    How to include users in the development of off-the-shelf software: A case for complementing participatory design with agile development, 2006. Conference paper (Refereed)
    Abstract [en]

    This paper describes and discusses a non-traditional approach to participatory design, one which is combined with an agile-like software development process. In this case, the size of the company combined with a distributed population of users has a serious impact on the software development process. The small software company in our study resolves this problem with an unconventional amalgam of participatory design and agile processes which seems to suit their situation. By using different kinds of user participation the small software provider is able to keep in contact with users on a daily basis. Users convey requirements for new functionalities, give feedback and report errors. Users' feedback and proposals form the basis for further development. The paper relates our observations to other research on participatory design in unconventional settings and discusses the conditions under which agile software development can complement participatory design.

  • 462. Hansson, H
    et al.
    Blomstrand, F
    Khatibi, Siamak
    Olsson, T
    Glutamate induced astroglial swelling, 1997. In: On astrocytes and glutamate neurotransmission: New waves in brain information processing, Neuroscience Intelligence Unit, Springer, R.G. Landes Company, 1997, p. 106-120. Chapter in book (Refereed)
  • 463.
    Haque, S.M. Rafizul
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Singular Value Decomposition and Discrete Cosine Transform based Image Watermarking, 2008. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Rapid evolution of digital technology has improved the ease of access to digital information, enabling reliable, faster and more efficient storage, transfer and processing of digital data. It also has the consequence of making the illegal production and redistribution of digital media easy and undetectable. Hence, the risk of copyright violation of multimedia data has increased due to the enormous growth of computer networks that provide fast and error free transmission of any unauthorized, duplicate and possibly manipulated copy of multimedia information. One possible solution is to embed a secondary signal or pattern into the image that is not perceivable and is mixed so well with the original digital data that it is inseparable and remains unaffected by any kind of multimedia signal processing. This embedded secondary information is a digital watermark: in general, a visible or invisible identification code that may contain some information about the intended recipient, the lawful owner or author of the original data, its copyright etc., in the form of textual data or an image. In order to be effective for copyright protection, digital watermarks must be robust, i.e. difficult to remove from the object in which they are embedded, despite a variety of possible attacks. Several types of watermarking algorithms have been developed so far, each of which has its own advantages and limitations. Among these, Singular Value Decomposition (SVD) based watermarking algorithms have recently attracted researchers due to their simplicity and some attractive mathematical properties of the SVD. Here a number of pure and hybrid SVD based watermarking schemes have been investigated, and finally an RST invariant modified SVD and Discrete Cosine Transform (DCT) based algorithm has been developed. A preprocessing step before the watermark extraction has been proposed which makes the algorithm resilient to geometric, i.e. RST, attacks.
    The performance of this watermarking scheme has been analysed by evaluating the robustness of the algorithm against geometric attacks, including rotation, scaling and translation (RST), and some other attacks. Experimental results have been compared with an existing algorithm and seem promising.
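The core SVD embedding idea behind such schemes can be sketched in a few lines: perturb the singular values of the host image with those of the watermark, then recover them non-blindly. This is the classic "embed in the singular values" pattern, not the thesis's RST-invariant SVD-DCT algorithm; numpy, the 8x8 block size, and the strength alpha are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(8, 8))      # stand-in for a grayscale image block
wm = rng.uniform(0, 1, size=(8, 8))         # watermark pattern
alpha = 0.1                                  # embedding strength

# Embed: perturb the host's singular values with those of the watermark.
U, S, Vt = np.linalg.svd(img)
Uw, Sw, Vwt = np.linalg.svd(wm)
S_marked = S + alpha * Sw
img_marked = U @ np.diag(S_marked) @ Vt

# Extract (non-blind: the original S and alpha are known to the extractor).
_, S_rec, _ = np.linalg.svd(img_marked)
Sw_rec = (S_rec - S) / alpha
```

Since singular values are stable under many common signal-processing operations, the recovered `Sw_rec` stays close to `Sw` even after moderate distortion of `img_marked`, which is what makes the SVD attractive for robust watermarking.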

  • 464. Hardemo, Isa
    On the practice of queuing and new forms of interaction, 2006. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The practice of queuing is a daily experience for most of us, and it is usually difficult to combine with other activities. This indicates that people involved in the act of queuing become a bit too occupied with maintaining their position in the queue. Although queuing is a common phenomenon, queuing situations are now often equipped with aids based on numbers that help regulate the queuing order. Still, the practice of queuing includes several nuances of social interaction that demand careful attention from its participants for it to work. Based on cases and concepts with varying levels of viability, this thesis investigates the practice of queuing as a design space. The thesis further suggests how a more flexible queue could be designed. An overall aim is to examine how to provide greater action space for participants in a queue and enable new forms of interaction. In order to queue from a distance, much of what traditionally constructs the queue is redesigned. To address these issues from a usability point of view, the challenge is to create an interaction design that allows different ways of queuing without deviating too much from features that are evaluated as decisive to maintain.

  • 465.
    Hariharan, Mahesh
    Blekinge Institute of Technology, School of Engineering, Department of Telecommunication Systems.
    P2P Networking and Technology Enablers in Business Applications, 2006. Independent thesis, Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    The usage of Peer to Peer (P2P) networks over the Internet has been growing exponentially. Apart from the hype surrounding P2P, it has remarkable ramifications for the way the Internet could be used. This is an area which is not explored as well as we would like. This thesis examines the architectural differences in P2P networks and generic application domains where the principles of P2P are exploited. The usage of P2P in different business verticals, and the technology enablers that go along with it, are presented. The focus is on several case studies, each addressing a different use case. The common thread running through all of these use cases is the ability to resolve a business issue. Finally, the focus is on how P2P networks might change the way the Internet behaves in the near future.

  • 466. Heath, Christian
    et al.
    Luff, Paul
    Svensson, Marcus Sanchez
    Video and qualitative research: analysing medical practice and interaction, 2007. In: Medical education, ISSN 1365-2929, Vol. 41, no 1, p. 109-116. Article in journal (Refereed)
    Abstract [en]

    There has been a longstanding recognition that video provides an important resource within medical education, particularly, perhaps, for training in primary health care. As a resource for research, and more specifically within qualitative social science studies of medical practice, video has proved less pervasive despite its obvious advantages. In this paper, we sketch an approach for using video to inform the analysis of medical practice and the ways in which health care is accomplished through social interaction and collaboration. Drawing on our own research we discuss two brief examples: first, the use of computing technology in primary health care and, second, informal instruction during surgical operations. The examples illustrate the multimodal character of medical work, how activities are accomplished through the interplay of talk, the visual and the use of material artefacts. They also illustrate the ways in which video provides access to the complex forms of social interaction and collaboration that underpin health care. We reflect upon the research opportunities afforded by video and the ways in which video based studies of interaction can contribute to the practice and practicalities of medicine.

  • 467. Heath, Christian
    et al.
    Svensson, Marcus Sanchez
    Hindmarsh, Jon
    Luff, Paul
    Lehn, Dirk vom
    Configuring awareness, 2002. In: Computer Supported Cooperative Work, ISSN 0925-9724, E-ISSN 1573-7551, Vol. 11, no 3-4, p. 317-347. Article in journal (Refereed)
    Abstract [en]

    The concept of awareness has become of increasing importance to both social and technical research in CSCW. The concept remains, however, relatively unexplored, and we still have little understanding of the ways in which people produce and sustain ‘awareness’ in and through social interaction with others. In this paper, we focus on a particular aspect of awareness: the ways in which participants design activities to have others unobtrusively notice and discover actions and events which might otherwise pass unnoticed. We consider, for example, how participants render visible selective aspects of their activities, how they encourage others to notice features of the local milieu, and how they encourage others to become sensitive to particular events. We draw examples from different workplaces, primarily centres of coordination: organisational environments which rest upon the participants’ abilities to delicately interweave a complex array of highly contingent, yet interdependent activities.

  • 468. Heath, Christian
    et al.
    Svensson, Marcus Sanchez
    Luff, Paul
    Technology and Medical Practice, 2003. In: Sociology of Health and Illness, ISSN 0141-9889, E-ISSN 1467-9566, Vol. 25, no 3, p. 75-96. Article in journal (Refereed)
    Abstract [en]

    One of the most significant developments in healthcare over the past 25 years has been the widespread deployment of information and communication technologies. These technologies have had a wide-ranging impact on the organisation of healthcare, on professional practice and on patients’ experience of illness and its management. In this paper we discuss the ways in which Sociology of Health and Illness has provided a forum for the analysis of these new technologies in healthcare. We review a range of relevant research published in the Journal; papers that address such issues as dehumanisation and emotional labour, professional practice and identity, and the social and institutional shaping of technology. Despite these important initiatives, we suggest that information and communication technologies in healthcare remain relatively under-explored within the Journal and, more generally, by the sociology of health and illness and point to developments in cognate areas which may have some bearing upon the analysis of technology in action.

  • 469.
    Hedberg, Claes
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Science.
    Martinet, Philippe
    Non-linear dynamics in granular materials, 2007. In: The universality of nonclassical nonlinearity, with applications to / [ed] Paolo, Pier, New York: Springer, 2007, p. 327-336. Chapter in book (Other academic)
    Abstract [en]

    This book is the result of research work developed in the framework of two large international projects: the European Science Foundation (ESF) supported programme NATEMIS (Nonlinear Acoustic Techniques for Micro-Scale Damage Diagnostics), of which Professor Delsanto was the European coordinator (2000-2004), and a Los Alamos-based network headed by Dr. P.A. Johnson. The main topic of both programmes, and of this book, is the description of the phenomenology, theory and applications of nonclassical Nonlinearity (NCNL). NCNL techniques have in recent years been found to be extremely powerful (up to more than 1000 times more so than the corresponding linear techniques) in a wide range of applications, from Elasticity, Material Characterization, Ultrasonics and Geophysics to the Maintenance and Restoration of artifacts (paintings, stone buildings, etc.). The book is divided into three parts: Part I defines and describes the concept of NCNL and its universality and reviews several fields to which it may apply; Part II describes the phenomenology, theory, modelling and virtual experiments (simulations); Part III discusses some of the most relevant experimental techniques and applications.

  • 470.
    Heiskanen, Jari
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    WLAN prestanda i IEEE 802.11n [WLAN performance in IEEE 802.11n], 2008. Independent thesis, Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    WLAN/n is an attractive technology since it increases data throughput and range with increased efficiency, yielding more bits per second. Developments within 802.11n have raised data rates and performance to higher levels. One technique is to use multiple antennas at the transmitter and receiver to send multiple data streams through MIMO systems for the n standard. Mobile devices in a cell may not have fixed positions within the cell, so more advanced algorithms such as OFDM have also been developed and are presented in this report. Signal interference is a dilemma, since the data rates that manufacturers of WLAN products state in their specifications hardly match reality when data throughput is measured. Results from the experiments show that interference can in some cases be considered a problem. The goal of this report is to answer questions about different techniques within WLAN technology and what types of interference and developments exist for more stable and higher data throughput. The experiments include WLAN networks in a natural environment with an access point and a laptop with a network card.

  • 471. Hellman, Mats
    et al.
    Rönkkö, Kari
    Controlling User Experience through Policing in the Software Development Process, 2008. Conference paper (Refereed)
  • 472. Hellman, Mats
    et al.
    Rönkkö, Kari
    Is User Experience supported effectively in existing software development processes? 2008. Conference paper (Refereed)
  • 473. Henesey, Lawrence
    A Multi Agent Based Simulator for Managing a Container Terminal, 2004. Conference paper (Refereed)
  • 474. Henesey, Lawrence
    Application of Transaction Costs in Analyzing Transport Corridors Using Multi-Agent Based Simulation, 2006. In: Promet Traffic & Transportation: Scientific Journal on Traffic and Transportation Research, ISSN 0353-5320, Vol. 18, no 2, p. 59-65. Article in journal (Refereed)
  • 475. Henesey, Lawrence
    Enhancing Container Terminal Performance: A Multi Agent Systems Approach, 2004. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis seeks to understand and attempts to solve some of the problems facing many decision makers at container terminals. The growth of containerization, transporting goods in a container, has created many problems for ports, i.e. higher requirements on terminals and infrastructure. Many container terminals are reaching their capacity limits, increasingly leading to traffic and port congestion. The research focuses on performance from the container terminal manager's perspective, on improving the understanding of the factors of productivity, and on how they are related to each other.

  • 476. Henesey, Lawrence
    Multi-Agent Container Terminal Management2006Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis describes research concerning the application of multi-agent based simulation for evaluating container terminal management operations. The growth of containerization, i.e., transporting goods in a container, has created problems for ports and container terminals. For instance, many container terminals are reaching their capacity limits, increasingly leading to traffic and port congestion. Container terminal managers have several, often conflicting, goals, such as serving a container ship as fast as possible while minimizing terminal equipment costs. The focus of the research involves the performance from the container terminal manager's perspective and how to improve the understanding of the factors of productivity and how they are related to each other. The need to manage complex systems such as container terminals requires new ways of finding solutions, e.g., by applying novel methods and technologies. The approach taken in this thesis is to model the decision makers involved in the container terminal operations and various types of terminal equipment, e.g., cranes, transporters, etc., as software agents. The general question addressed in this work is: can the performance of a container terminal be improved by using agent-based technologies? In order to evaluate the multi-agent based systems approach, a simulation tool, called SimPort, was developed for evaluating container terminal management policies. The methods for modelling the entities in a container terminal are presented along with the simulation experiments conducted. The results indicate that certain policies can yield faster ship turn-around times and that certain stacking policies can lead to improved productivity. Moreover, a multi-agent based simulation approach is used to evaluate a new type of Automated Guided Vehicles (AGVs) using a cassette system, and compare it to a traditional AGV system.
The results suggest that the cassette-based system is more cost efficient than a traditional AGV system in certain configurations. Finally, an agent-based approach is investigated for evaluating the governance structure of the stakeholders involved in a transport corridor. The results of the research indicate that the performance of a container terminal can be improved by using agent-based technologies. This conclusion is based upon several studies, both conceptual and concrete simulation experiments. In particular, multi-agent based simulation seems to offer container terminal management a suitable tool to control, coordinate, design, evaluate and improve productivity.

  • 477. Henesey, Lawrence
    et al.
    Aslam, Khurum
    Khurum, Mahvish
    Task Coordination of Automated Guided Vehicles in a Container Terminal2006Conference paper (Refereed)
  • 478. Henesey, Lawrence
    et al.
    Davidsson, Paul
    Persson, Jan A.
    Agent Based Simulation Architecture for Evaluating Operational Policies in Transshipping Containers2006Conference paper (Refereed)
  • 479. Henesey, Lawrence
    et al.
    Davidsson, Paul
    Persson, Jan A.
    Agent Based Simulation Architecture for Evaluating Operational Policies in Transshipping Containers2006In: Multiagent System Technologies: 4th German Conference (MATES 2006), Springer , 2006, p. 73-85Chapter in book (Other academic)
  • 480. Henesey, Lawrence
    et al.
    Davidsson, Paul
    Persson, Jan A.
    Evaluating Container Terminal Transhipment Operational Policies: An Agent-Based Simulation Approach2006In: WSEAS Transactions on Computers, ISSN 1109-2750, E-ISSN 2224-2872, Vol. 5, no 9, p. 2090-2098Article in journal (Refereed)
  • 481. Henesey, Lawrence
    et al.
    Davidsson, Paul
    Persson, Jan A.
    Evaluation of Automated Guided Vehicle Systems for Container Terminals Using Multi Agent Based Simulation2009Conference paper (Refereed)
    Abstract [en]

    Due to globalization and the growth of international trade, many container terminals are trying to improve performance in order to keep up with demand. One technology that has been proposed is the use of Automated Guided Vehicles (AGVs) in the handling of containers within terminals. Recently, a new generation of AGVs has been developed which makes use of cassettes that can be detached from the AGV. We have developed an agent-based simulator for evaluating the cassette-based system and comparing it to a more traditional AGV system. In addition, a number of different configurations of container terminal equipment, e.g., number of AGVs and cassettes, have been studied in order to find the most efficient configuration. The simulation results suggest that there are configurations in which the cassette-based system is more cost efficient than a traditional AGV system, as well as confirming that multi-agent based simulation is a promising approach to this type of application.

  • 482. Henesey, Lawrence
    et al.
    Davidsson, Paul
    Persson, Jan A.
    Simulation of Operational Policies for Transhipment in a Container Terminal2006Conference paper (Refereed)
  • 483.
    Henesey, Lawrence
    et al.
    Blekinge Institute of Technology, School of Computing.
    Davidsson, Paul
    Blekinge Institute of Technology, School of Computing.
    Persson, Jan A.
    Blekinge Institute of Technology, School of Computing.
    Using Simulation in Evaluating Berth Allocation at a Container Terminal2004Conference paper (Refereed)
    Abstract [en]

    The operations and decision making at a container terminal have been simulated. A Berth Allocation Management System – (BAMS) has been built which consists of two parts: a container terminal simulator modelling the operations and a management simulator modelling the various actors involved in the allocation of container ships to berths. Together these two parts generate berth schedules for arriving container ships. Two berth assignment policies are evaluated in different scenarios, with various quay lengths, berth spacing lengths, and ship arrival sequences. The decisions in assigning ships with different loading and discharging demands to a limited amount of resources, such as berth space and cranes are analysed with the BAMS. The berths at the container terminal are modelled by the BAMS to be dynamic in the sense that berth segmentation is based on the current situation rather than being static. The policies are evaluated in terms of turn-around time and distance travelled by the straddle carriers. The simulation results indicate that an informed choice of berth assignment policy can provide better use of the available resources, e.g., by reducing turnaround time and/or distance travelled by the straddle carriers.

  • 484. Henesey, Lawrence
    et al.
    Kerckaert, Koen
    Prospects for Short Sea Shipping2004Conference paper (Refereed)
  • 485. Henesey, Lawrence
    et al.
    Young, M.
    Short Sea Shipping in the United States: Identifying the Prospects and Opportunities2006Conference paper (Refereed)
  • 486. Henningsson, Kennet
    A Fault Classification Approach to Software Process Improvement2005Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    The research presented is motivated by the demand for process improvement at companies active in software development. High demands on software quality are a reality, while at the same time short development time and low effort consumption are required. This stresses the necessity for process improvement. The research challenge was addressed with empirical research methods and close cooperation with the industry partner. The research presented in this thesis shows how the analysis of faults through fault classification can be used to determine suitable and required process improvements. Two alternatives are investigated: first, a lightweight approach, and second, a fault classification approach targeting all faults. The suitability of the fault classification is stressed, as well as the importance of assigning the correct fault class. The latter is determined by classifier agreement calculations. Additionally, the research proposes that the appropriate occasion for a correct fault classification is when the fault is corrected. The research also introduces an approach to tailor the verification and validation process. The tailoring process suggested considers the functionality characteristics and the software entity complexity in terms of couplings. This is used to select the appropriate and efficient process for verification and validation.

  • 487. Henningsson, Kennet
    et al.
    Wohlin, Claes
    Monitoring Fault Classification Agreement in an Industrial Context2005Conference paper (Refereed)
    Abstract [en]

    Based on prior investigations and a request from a collaborative research partner, UIQ Technology, an investigation was launched to develop an improved and more informative fault classification scheme. The study investigates the level of agreement between classifiers in an industrial setting, a prerequisite for using a fault classification. The method used is an experiment performed in an industrial setting, using for example Kappa statistics to determine the agreement among classifiers. From the study it is concluded that the agreement within the industrial setting is higher than that obtained in a previous study within an academic setting, but it is still in need of improvement. This leads to the conclusion that the experience within industry, as well as the improved information structure in relation to the previous study, aids agreement, but to reach a higher level of agreement, additional education is believed to be needed at the company.

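    The chance-corrected agreement measure the study above relies on can be computed with Cohen's kappa. A minimal sketch, assuming two classifiers label the same set of faults (the function name and example labels are illustrative, not taken from the paper):

    ```python
    from collections import Counter

    def cohens_kappa(labels_a, labels_b):
        # Kappa = (observed agreement - chance agreement) / (1 - chance agreement).
        n = len(labels_a)
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        count_a, count_b = Counter(labels_a), Counter(labels_b)
        # Chance agreement: probability both classifiers pick the same class
        # if they labelled independently with their observed class frequencies.
        chance = sum(count_a[c] * count_b.get(c, 0) for c in count_a) / n ** 2
        return (observed - chance) / (1 - chance)
    ```

    A kappa of 1.0 means perfect agreement, while values near 0 mean the classifiers agree no more often than chance would predict.
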
  • 488. Henningsson, Kennet
    et al.
    Wohlin, Claes
    Risk-based Trade-off between Verification and Validation: An Industry-motivated Study2005Conference paper (Refereed)

  • 489.
    Hermansson, Anders
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Nilsson, Kenneth
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Interaktivitet i webbapplikationer2005Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    As everyone knows and has read, the use of the Internet has grown explosively in the last ten years, which means that the amount of information now available only a few seconds away is gigantic. As more and more companies want to reach the broad audience that the Internet offers, they must build websites in such a way that they capture visitors' interest quickly and make them return. This naturally makes the competition for customers extremely hard. How, then, can visitors' interest be captured quickly, in order to have a better chance of influencing them positively? This report is aimed primarily at developers of web applications who want to see what is decisive for attracting more visitors and retaining those they already have. The study focuses on how different degrees of interactive elements affect visitors' satisfaction and trust. The degrees of interactivity were control of pace and sequence, control of variables, control of transactions, and control of objects. To be able to study the different degrees of interactivity, we produced three versions of a website. These had the same colours and dimensions, and all other parts were exactly the same in the three versions; only the degree of interaction differed. The results showed that an increased degree of interactivity, in the form of a game, a guest book, and a search function on the company's website, can create higher satisfaction for visitors. The users' trust in the website, however, turned out to be very similar across versions. One of the answers is somewhat higher than the others, but the difference is so marginal that no conclusion can be drawn from it. In summary, our view is that the more elements a visitor can influence, the better and more positively the site is perceived, at least for a website whose main visitors belong to a younger generation.

  • 490. Hillgren, Per-Anders
    Ready-made-media-actions: Lokal produktion och användning av audiovisuella medier inom hälso- och sjukvården2006Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    A growing global perspective and new technical infrastructure such as the internet give rise to expectations that knowledge and experiences could be shared and mediated between different contexts around the world. In line with this follows an increasing interest in standardization and context-independent ‘learning objects’ that allow content reusability across sites. This dissertation will focus on and argue for knowledge sharing with the opposite qualities, where the specific context and the personal and local perspective are instead central aspects. It is a form of knowledge sharing where “sender” and “receiver” are closely related, and it is based on a socio-cultural perspective where knowledge, context, technology and mediation are deeply interconnected. The arguments are based on two practice-based research projects, where interaction designers together with staff members at an intensive care unit and a hand surgery clinic collaboratively designed procedures in which locally produced videos are used to enhance and develop the work practice in both these settings. The procedure differs from most ordinary movie production. It is not based on manuscripts or advanced planning, and it is without the more “objective” character common in instruction movies. Digital video technology is rather used to capture a situated and always changing practice, in which staff members film each other in their everyday practice. Making the movies where the work usually gets done helps practitioners elicit what should be told in the movies; what needs to be shown, named and foregrounded. The movies could be about “how to handle medical equipment”, “how to treat a severe wound” or “an articulation of a patient's specific situation and future rehabilitation”. The videos are based on “ready-made” actions already taking place in the everyday environment. Their character is informal and personal and they are later used as support for staff or patients with a close relation to the context. 
The local production makes it easy to adapt the content to changing circumstances, but it also allows staff members to get a view of how other colleagues perform their everyday work. This creates good opportunities for them to reflect on what they are doing and how their daily work could be improved. In addition to the reflections regarding video production, the PhD thesis will also focus on Participatory Design (PD) and the implications of close collaboration with users. PD is often considered not to lead to more innovative results and only to benefit incremental design processes. In the thesis, arguments will be presented that close PD could instead be based on an approach where designers challenge the users and conduct fruitful “collisions” with them and their environment. These could be “collisions” between values and perspectives, but also between design ideas and the real working context. This is achieved through experiments in the daily practice, where ideas encounter as much resistance as possible from the conflicting artifacts, people and ideas residing in the context.

  • 491.
    Hodzic, Armin
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Research in SAR reduction by changing composition of phone casing2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This thesis is a research project conducted to find out whether the specific absorption rate (SAR) imposed on the human head by cellular phones can be reduced without having a significant impact on the overall performance and reception of the phone. The SAR values are influenced by various variables, such as the type and size of the antenna, the position of the antenna relative to the human head, the radiated power from the antenna, the distance and angle from the human head, and finally the material covering the phone. The methodology is to investigate three different approaches to SAR reduction. The first approach is to investigate whether the size and type of the material shielding the human head have a significant impact on the SAR radiation, the second approach is to investigate whether the angle of the phone in relation to the human head has a significant impact, and the third approach is to investigate whether it is possible to change the composition of the material, which would ultimately lead to SAR reduction. The electric properties of the material are described by two variables, conductivity and permittivity, and in the third part of the thesis I will change these two variables and then investigate how the SAR values change. Results will be presented as a 3D graph that directly shows how the SAR values relate to the material used as the phone casing. The distance and antenna variations will not be investigated; the distance will be fixed at 2.5 cm from the human head and the antenna length will be a quarter of the 900 MHz wavelength. The investigations will be performed in a simulation program called FEMLAB and the results will be shown in text and figures.

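    The quantity varied in the thesis above has a standard point definition, SAR = σ|E|²/ρ, where σ is the tissue conductivity and ρ the tissue density. A minimal sketch of this textbook formula (the function name and example values are illustrative, not taken from the thesis):

    ```python
    def point_sar(conductivity_s_per_m, e_field_rms_v_per_m, density_kg_per_m3):
        # Point SAR in W/kg: SAR = sigma * |E|^2 / rho, with conductivity sigma
        # in S/m, RMS electric field |E| in V/m, and tissue density rho in kg/m^3.
        return conductivity_s_per_m * e_field_rms_v_per_m ** 2 / density_kg_per_m3
    ```

    This makes explicit why changing the casing material matters: lowering the field strength |E| reaching the tissue reduces SAR quadratically.
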
  • 492. Holland, Ian
    et al.
    Zepernick, Hans-Jürgen
    Caldera, Manora
    Soft Combining for Hybrid ARQ2005In: Electronics Letters, ISSN 0013-5194 , Vol. 41, no 22, p. 1230-1231Article in journal (Refereed)
    Abstract [en]

    A soft combining approach utilising symbol-by-symbol maximum a posteriori probability decoding is proposed for hybrid automatic repeat request schemes. In comparison to an existing soft combining approach, significant reductions in post-decoding bit error rate can be obtained without sacrificing the throughput efficiency. This is achieved with the proposed method by accumulating the signal-to-noise ratio at the channel output on each additional retransmission, for use in calculating extrinsic log-likelihood ratios on subsequent decoding attempts.

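    The SNR accumulation described above can be illustrated for the simple case of BPSK over AWGN, where the per-retransmission LLRs of a soft-combining receiver simply add. A minimal sketch under those assumptions (the names and the 4·SNR·y LLR form for unit-energy BPSK are illustrative, not the paper's exact scheme):

    ```python
    def bpsk_llr(y, snr_linear):
        # Per-observation LLR for unit-energy BPSK over AWGN: L = 4 * snr * y.
        return 4.0 * snr_linear * y

    def combined_llr(observations):
        # Soft combining across HARQ retransmissions: LLRs from independent
        # noisy copies of the same symbol add, which is equivalent to
        # accumulating SNR-weighted channel observations.
        return sum(bpsk_llr(y, snr) for y, snr in observations)
    ```

    Each retransmission thus sharpens the reliability of the decision fed to the decoder, which is why the post-decoding bit error rate drops without extra throughput cost.
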
  • 493.
    Holm, Kristian
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Hallgren, Morgan
    Blekinge Institute of Technology, School of Engineering, Department of Interaction and System Design.
    Hur hanterar institutioner en miljö med blandade Operativsystem?2007Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [sv]

    The purpose of this thesis is to investigate whether there are environments that use some kind of central authentication system within institutions/schools. The hypothesis is that an institution not using a form of central authentication service has a greater need for maintenance and as such incurs a higher cost for the organization. The gathering of data has been done through interviews with technical personnel at Blekinge Tekniska Högskola and Linköpings Universitet. Based on the technical background of the authors, and with the scope limited to the Windows and UNIX operating systems, a discussion and analysis of the systems in use today has been done, with emphasis on the hypothesis.

  • 494.
    Holmberg, Per
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Valet av grafiskt dokumentgränssnitt2005Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    Graphical document interfaces are used by most computer users. There are essentially three types of document interfaces: MDI (multiple document interface), SDI (single document interface) and TDI (tabbed document interface). In the first half of this work, these three were analysed against a number of factors, with the goal of seeing whether any model's advantages outweigh the others' and whether guidelines can be created for when each model is best applied when developing new software. The analysis is based on literature as well as a qualitative survey and program tests. The second part of the work was devoted to testing whether it is possible to program so that the end user can choose the model he or she prefers. A code framework was developed to prove that this is possible. Broadly, the following results emerged: different programs require different solutions. SDI is the model that works in the most situations. TDI is not only trendy but also very well liked. MDI is usually the worst choice of the three, but is not without advantages. Developing a code framework that enables the developer to efficiently create programs in which the user can choose between MDI, SDI and TDI is entirely feasible.

  • 495. Holmgren, Johan
    Multi-Agent-Based Simulation and Optimization of Production and Transportation2008Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis addresses the integration of software agent technology, simulation and mathematical optimization within the domain of production and transportation. It has been argued that agent-based approaches and mathematical optimization can complement each other in the studied domain. These technologies have often been used separately, but the existing amount of literature concerning how to combine them is rather limited, especially in the domain of production and transportation. This domain is considered complex since, for instance, the decision making is characterized by many decision makers who influence each other. Also, problems in the domain are typically large and combinatorial. The transportation of goods has both positive and negative effects on society. A positive effect is the possibility for people to consume products that have been produced at distant locations. Examples of negative effects are: emissions, congestion, accidents, and large costs for infrastructure investments. Increasing competition, experienced by manufacturers and haulers, acts as a motivation for improving the utilization of often limited and expensive production and transportation resources. It is important to maximize the positive effects of transportation while the negative effects are minimized. We present a rather general agent-based simulator (TAPAS) for simulation of production and transportation. By using agent technology, we have been able to simulate the decision making and interaction between decision makers, which is difficult using traditional simulation techniques. We provide a technical description of how TAPAS was modeled, and examples of how it can be used. An optimization model for a real-world "Integrated Production, Inventory, and Distribution Routing Problem" (IPIDRP) has been developed. The identified IPIDRP is in the domain of production and transportation problems. 
For solving and analyzing the problem, we developed a solution method based on the principles of Dantzig-Wolfe decomposition, which was implemented as a multi-agent system inside TAPAS. The purpose is to improve resource utilization and to analyze the potential effects of introducing VMI (Vendor Managed Inventory). Experiments are performed for quantifying the benefits of VMI and for estimating the effects of an agentification of the decomposition approach. Some advantages and disadvantages of an agentification are discussed in this thesis. The work indicates high potential for integrating agent technology and mathematical optimization. One direction for future work is to use TAPAS as a tool for evaluating the results that are produced by the optimization algorithm. For real-world systems, evaluation of optimization results can be expensive and difficult to carry out, and we believe that simulation can be useful for evaluation purposes.

  • 496. Holmgren, Johan
    et al.
    Davidsson, Paul
    Persson, Jan A.
    Ramstedt, Linda
    An Agent Based Simulator for Production and Transportation of Products2007Conference paper (Refereed)
  • 497.
    Holmgren, Johan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Persson, Jan A.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Davidsson, Paul
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Agent Based Decomposition of Optimization Problems2008Conference paper (Refereed)
    Abstract [en]

    In this paper, we present an agent-based approach for solving an optimization problem using a Dantzig-Wolfe column generation scheme, i.e., a decomposition approach. It has been implemented and tested on an integrated production, inventory, and distribution routing problem. We developed a decomposition model for this optimization problem, which was implemented in the Java programming language, using the Java Agent DEvelopment Framework (JADE) and the ILOG CPLEX mixed integer linear problem solver. The model was validated on a realistic scenario and, based on the results, we present estimates of the potential performance gain of using a completely distributed implementation. We also analyze the overhead, in terms of communication costs, imposed by an agent-based approach. Furthermore, we discuss the advantages and the disadvantages that come with an agent-based decomposition approach.

  • 498.
    Holmåker, Markus
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Woxblom, Magnus
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Performance evaluation of the fixed function pipeline and the programmable pipeline2004Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    When developing applications in Direct3D today, developers can choose between using the fixed function pipeline and the programmable pipeline. The programmable pipeline is more flexible than the fixed function pipeline, but what is the price for high flexibility? Is high flexibility desired at any cost? How is the choice of pipeline affecting performance? The purpose of this master thesis is to evaluate the performance of the two pipelines. This will be achieved by developing a benchmark program, which measures performance when various graphical effects are tested. The results of the evaluation will hopefully help developers to decide which pipeline to use, in terms of performance. In the end we will see that the fixed function pipeline is faster than the programmable pipeline in all our tests.

  • 499. Hong, Seung-Sun
    et al.
    Wong, Fiona
    Wu, Felix
    Lilja, Bjorn
    Jansson, Tony Y.
    Johnson, Henric
    Nilsson, Arne A.
    TCPtransform: Property-Oriented TCP Traffic Transformation2005Conference paper (Refereed)
    Abstract [en]

    A TCPdump file captures not only packets but also various "properties" related to the live TCP sessions on the Internet. It is still an open problem to identify all the possible properties, if that is ever possible, and more importantly, which properties really matter for the consumers of a particular TCPdump file and how they are related to each other. However, it is quite clear that existing traffic replay tools for the purpose of system evaluation, such as TCPreplay, destroy at least some critical properties, such as the "ghost acknowledgment" (where the original packet was never delivered), which is a critical issue in conducting experimental evaluations of intrusion detection systems. In this paper, we present a software tool to transform an existing TCPdump file into another traffic file with different "properties". For instance, if the original traffic was captured in a laboratory environment, the new file might "appear" to have been captured between the US and Sweden. The transformation we have done here is "heuristically consistent", as there might be some hidden properties still being destroyed in the transformation process. One interesting application of our tool is to build long-term profiles to detect anomalous TCP attacks without actually running the target application over the Internet. While, in this paper, we only focus on property-oriented traffic transformation, we have also built and evaluated an interactive version of this tool, called TCPopera, to evaluate commercial intrusion prevention systems.

  • 500.
    Hossain, Firoz
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Sohab, Abu-Shadat-Mohammad
    Blekinge Institute of Technology, School of Engineering, Department of Signal Processing.
    Mathematical Modelling of Call Admission Control in WCDMA Network2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    WCDMA is an interference-limited multiple access technique. It is widely used in 3rd generation mobile networks such as UMTS. When a new call arrives in the system, the system decides whether the call is admitted or not based on parameters such as the signal to interference ratio (SIR), the transmission power of the Node B, and the air interface load. If the call is accepted, it will add some interference to the ongoing calls. This new interference would degrade the ongoing calls and also add extra load, which may lead to exceeding the capacity. The system therefore has to apply this admission policy in a systematic way so that all users can maintain their communication with guaranteed quality of service. This decision-making algorithm belongs to the radio resource management functionalities of the Radio Network Controller (RNC) in a WCDMA-based UMTS network. This thesis focuses on the mathematical representation of call admission control in an interference-based environment. There is also a comparative study of different methods.

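    A threshold-style admission check of the kind the abstract above describes can be sketched as follows (the function name, parameters, and the additive load model are illustrative assumptions, not the thesis's actual model):

    ```python
    def admit_call(current_load, load_increment, max_load, est_sir_db, sir_target_db):
        # Admit a new call only if the estimated SIR stays at or above the
        # target, and the predicted air-interface load (current load plus the
        # increment the new call would cause) stays within the planned limit.
        return est_sir_db >= sir_target_db and current_load + load_increment <= max_load
    ```

    In practice the RNC would also check the remaining Node B transmission power, but the structure is the same: each admission criterion must hold simultaneously, or the call is blocked to protect ongoing users.
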