2451 - 2460 of 2460
  • 2451.
    Álvarez, Carlos García
    Blekinge Institute of Technology, School of Computing.
    Overcoming the Limitations of Agile Software Development and Software Architecture, 2013, Independent thesis Advanced level (degree of Master (Two Years)), Student thesis
    Abstract [en]

    Context. Agile Software Development has introduced a new concept of software development based on adaptation to change, quick decisions, little high-level design and frequent deliveries. However, this approach ignores the long-term value that Software Architecture provides for increasing the speed of delivering working software, which may have catastrophic consequences in the long term. Objectives. In this study, the combination of these two philosophies of software development is investigated: firstly, the concept of Software Architecture in agile projects; then the major challenges faced concerning Software Architecture in agile projects, the practices and approaches that can be used to overcome these challenges, and the effects that these practices may have on the project. Methods. The research methodologies used in this study are a Systematic Literature Review, to gather as many as possible of the contributions available in the literature on this subject, and Semi-Structured Interviews with agile practitioners, to obtain empirical knowledge of the problem and to support or refute the SLR findings. Results. The results of the thesis are a unified description of the concept of Software Architecture in agile projects, together with a collection of challenges found in agile projects, practices that overcome them, and the effects that were observed. Considering the most frequently followed practices and approaches and the empirical support, a discussion is given of how to combine Software Architecture and agile projects. Conclusions. The main conclusion is that there is no definite solution to this question; the context (team, project, customer, etc.) is so decisive that each situation should be evaluated before deciding how best to proceed. However, there are common trends in the practices recommended for integrating these two concepts. Finally, more empirical work on the issue is required: controlled experiments that quantify the success or failure of the practices implemented would be most helpful for building a body of knowledge that enables the application of certain practices under certain conditions.

  • 2452. Åberg, Hampus
    Subimage matching in historical documents using SIFT keypoints and clustering, 2015, Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Context: In this thesis, subimage matching in historical handwritten documents using SIFT (Scale-Invariant Feature Transform) keypoints was tested. SIFT features are invariant to scale and rotation and have gained a lot of interest in the research community. The historical documents used in this thesis originate from the 16th century and onwards. The following steps were executed: binarization, word segmentation, feature identification and clustering. The binarization step converts the images into binary images. The word segmentation separates the different words into individual subimages. In the feature identification step, SIFT keypoints were found and descriptors were computed. The last step was to cluster the images based on the distances between the sets of image features identified. Objectives: The main objectives are to find a good configuration for the binarization step, implement a good word segmentation, identify image features and, lastly, cluster the images based on their similarity. The contents of subimages are matched against each other rather than trying to predict what the content of a subimage is, simply because the data that was used is unlabeled. Methods: Implementation was the main methodology used, combined with experimentation. Measurements were taken throughout the development, and the accuracy of the word segmentation and of the clustering was measured. Results: The word segmentation reached an average accuracy of 89% correct segmentations, which is comparable to other word segmentation results. The clustering, however, matched 0% correctly. Conclusions: The conclusion drawn from this study is that SIFT keypoints are not well suited for this type of problem, which involves a lot of handwritten text. The descriptors were not discriminative enough, and different keypoints were found in different images of the same handwritten text, which led to the bad clustering results.
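The first two pipeline steps the abstract names, binarization and word segmentation, can be sketched as follows. This is a minimal sketch under stated assumptions: the abstract does not specify the binarization configuration or the segmentation algorithm, so a fixed global threshold and a vertical projection profile are used here purely for illustration, and `binarize`/`segment_words` are hypothetical names.

```python
import numpy as np

def binarize(img, threshold=128):
    """Global-threshold binarization: ink pixels become 1, background 0.
    (A fixed threshold is the simplest stand-in for the configurable
    binarization step the thesis evaluates.)"""
    return (img < threshold).astype(np.uint8)

def segment_words(binary_line, min_gap=3):
    """Split one line of text into word subimages by finding runs of
    empty columns (the vertical projection profile) wider than min_gap.
    Returns (start_col, end_col) pairs, end exclusive."""
    ink_per_col = binary_line.sum(axis=0)
    words, start, gap = [], None, 0
    for x, ink in enumerate(ink_per_col):
        if ink > 0:
            if start is None:
                start = x
            gap = 0
        elif start is not None:
            gap += 1
            if gap >= min_gap:          # gap wide enough: close the word
                words.append((start, x - gap + 1))
                start, gap = None, 0
    if start is not None:                # word running to the line's edge
        words.append((start, len(ink_per_col)))
    return words
```

The SIFT descriptors would then be computed per subimage and clustered by descriptor distance; that stage is omitted here since the abstract reports it performed poorly on handwritten text.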

  • 2453. Ådahl, Kerstin
    et al.
    Gustavsson, Rune
    Innovative Health Care Channels: Towards Declarative Electronic Decision Support Systems Focusing on Patient Security, 2009, Conference paper (Refereed)
    Abstract [en]

    Models supporting empowerment of health care teams and patients are introduced and exemplified.

  • 2454.
    Åkesson, Gustav
    et al.
    Blekinge Institute of Technology, School of Computing.
    Rantzow, Pontus
    Blekinge Institute of Technology, School of Computing.
    Performance evaluation of multithreading in a Diameter Credit Control Application, 2010, Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    Moore's law states that the amount of computational power available at a given cost doubles every 18 months, and indeed, for the past 20 years there has been tremendous development in microprocessors. However, for the last few years, Moore's law has been subject to debate, since, to manage heat issues, processor manufacturers have begun favoring multicore processors, which means parallel computation has become necessary to fully utilize the hardware. This also means that software has to be written with multiprocessing in mind to take full advantage of the hardware, and writing parallel software introduces a whole new set of problems. For the last couple of years, the demands on telecommunication systems have increased, and to manage the increasing demands, multiprocessor servers have become a necessity. Applications must fully utilize the hardware, and one such application is the Diameter Credit Control Application (DCCA). The DCCA uses the Diameter networking protocol, and its purpose is to provide a framework for real-time charging. This could, for instance, be to grant or deny a user's request for a specific network activity and to account for any use of that network resource. This thesis investigates whether it is possible to develop a Diameter Credit Control Application that achieves linear scaling, and the potential pitfalls in developing a scalable DCCA server. The assumption is based on the observation that the DCCA server's connections have little to nothing in common (i.e. little or no synchronization), so introducing more processors should give linear scaling. To investigate whether a DCCA server's performance scales linearly, a prototype has been developed. Along with the development of the prototype, constant performance analysis was conducted to see what affected performance and server scalability in a multiprocessor DCCA environment. As the results show, quite a few factors besides synchronization and independent connections affected the scalability of the DCCA prototype. The results show that the DCCA prototype did not always achieve linear scaling. However, even when scaling was not linear, certain design decisions gave a considerable performance increase when more processors were introduced.
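The linear-scaling assumption in the abstract, that connections sharing little or no state should scale with the processor count, is commonly framed with Amdahl's law. The formula below is the standard one, not taken from the thesis, and the numbers are illustrative only:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: the upper bound on speedup when a fraction p of the
    work parallelizes perfectly and the remainder stays serial."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_processors)

# Fully independent connections (p = 1.0) scale linearly with cores;
# even a small serial portion, e.g. shared-state synchronization,
# caps the achievable gain well below linear.
ideal = amdahl_speedup(1.0, 8)      # 8.0, ideal linear scaling
capped = amdahl_speedup(0.95, 8)    # roughly 5.9, with 5% serial work
```

This matches the abstract's observation: factors beyond synchronization (any effectively serial portion of the work) were enough to push the prototype off the linear curve.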

  • 2455.
    Årsköld, Martin
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Processoperatörens mobilitet - teknikstöd för mobil larmhantering, 2002, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [sv]

    The process industry has the opportunity to enter a new phase of development in which technology will play a major role. As technology increasingly supports mobility, it becomes possible for process operators to monitor and control the manufacturing process from any location. In this report, the author presents his empirical study at a high-technology factory. The focus of the study is the significance of the process operators' mobility and how it manifests itself in their work. The study shows that mobility is part of the operators' professional practice at the factory and essential for controlling the manufacturing process. Based on the study, suggestions are given for technology that can support this mobility by enabling mobile alarm handling.

  • 2456.
    Åström, Fredrik
    Blekinge Institute of Technology, School of Computing.
    Neural Network on Compute Shader: Running and Training a Neural Network using GPGPU, 2011, Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    In this thesis I look into how one can train and run an artificial neural network using Compute Shader, and what kind of performance can be expected. An artificial neural network is a computational model that is inspired by biological neural networks, e.g. a brain. Finding what kind of performance can be expected was done by creating an implementation that uses Compute Shader and then comparing it to the FANN library, a fast artificial neural network library written in C. The conclusion is that you can improve performance by training an artificial neural network on the compute shader, as long as you are using non-trivial datasets and neural network configurations.
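The forward pass of the kind of feed-forward network the abstract describes can be sketched on the CPU with NumPy. This is a sketch for orientation only: the thesis runs the computation on the GPU via Compute Shader, the layer sizes and fixed random weights here are illustrative, and sigmoid activations are assumed (FANN's default) rather than taken from the thesis.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, w1, b1, w2, b2):
    """One forward pass of a two-layer fully connected network.
    On a compute shader, each layer maps naturally to one dispatch,
    with one thread computing one output neuron's dot product."""
    hidden = sigmoid(x @ w1 + b1)
    return sigmoid(hidden @ w2 + b2)

# Tiny fixed example: 2 inputs -> 3 hidden neurons -> 1 output.
rng = np.random.default_rng(0)
w1 = rng.standard_normal((2, 3)); b1 = np.zeros(3)
w2 = rng.standard_normal((3, 1)); b2 = np.zeros(1)
y = forward(np.array([[0.5, -0.5]]), w1, b1, w2, b2)
```

Training adds a backward pass per layer; the thesis's finding is that the GPU dispatch overhead only pays off once the network and dataset are large enough.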

  • 2457.
    Örtegren, Kevin
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Clustered Shading: Assigning arbitrarily shaped convex light volumes using conservative rasterization, 2015, Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
    Abstract [en]

    Context. In this thesis, a GPU-based light culling technique performed with conservative rasterization is presented. Accurate lighting calculations are expensive in real-time applications and the number of lights used in a typical virtual scene increases as real-time applications become more advanced. Performing light culling prior to shading a scene has in recent years become a vital part of any high-end rendering pipeline. Existing light culling techniques suffer from a variety of problems which clustered shading tries to address.

    Objectives. The main goal of this thesis is to explore the use of the rasterizer to efficiently assign convex light shapes to clusters. Being able to accurately represent and assign light volumes to clusters is a key objective in this thesis.

    Methods. This method is designed for real-time applications that use large numbers of dynamic and arbitrarily shaped convex lights. By using conservative rasterization to assign convex light volumes to a 3D cluster structure, a more suitable light volume approximation can be used. This thesis implements a novel light culling technique in DirectX 12 by taking advantage of the hardware conservative rasterization provided by the latest consumer-grade Nvidia GPUs. Experiments are conducted to prove the efficiency of the implementation, and comparisons with AMD's Forward+ tiled light culling are provided to relate the implementation to existing techniques.

    Results. The results from analyzing the algorithm show that most problems with existing light culling techniques are addressed, and that the light assignment is of high quality and allows for easy integration of new convex light types. Assigning the lights and shading the CryTek Sponza scene with 2000 point lights and 2000 spot lights takes 2.92 ms on a GTX 970.

    Conclusions. The main goal of the thesis has been reached to the extent that all existing problems with current light culling techniques have been solved, at the cost of using more memory. The technique is novel, and a lot of future work is outlined; further research would strengthen the validity of the implementation.
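The core assignment problem, deciding which lights touch which clusters, can be illustrated on the CPU with the simplest case: point lights as spheres tested against axis-aligned cluster boxes. This is only an analogy for orientation; the thesis instead rasterizes arbitrary convex light volumes conservatively on the GPU, which is precisely what avoids the loose sphere/AABB approximations sketched here.

```python
def sphere_overlaps_aabb(center, radius, box_min, box_max):
    """True if a sphere intersects an axis-aligned box: clamp the
    sphere centre to the box, then compare squared distance to radius."""
    d2 = 0.0
    for c, lo, hi in zip(center, box_min, box_max):
        nearest = min(max(c, lo), hi)   # closest point on the box, per axis
        d2 += (c - nearest) ** 2
    return d2 <= radius * radius

def assign_lights(lights, clusters):
    """For each cluster (box_min, box_max), list the indices of the
    lights (center, radius) whose volumes touch it."""
    return [
        [i for i, (c, r) in enumerate(lights)
         if sphere_overlaps_aabb(c, r, lo, hi)]
        for (lo, hi) in clusters
    ]
```

During shading, each pixel then looks up its cluster's light list and evaluates only those lights, which is what makes culling pay off with thousands of lights.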

  • 2458. Östlund, Louise
    Information in use: In- and outsourcing aspects of digital services, 2007, Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis is founded on the global growth of the service sector and its significance for society as a whole and for the individual human being. In the last decade, technology has changed the way services are created, developed and delivered in remarkable ways. The focus of the thesis is technology in interplay with humans and organisations, and the socio-economic-technical systems in which digital services play a central role. Challenges addressed by the thesis include requirements analysis, trustworthy systems, in- and outsourcing aspects, and the proper understanding of information and its use in real-world applications. With this in mind, the thesis presents a configurable methodology with the purpose of quality assuring service oriented workflows found in socio-economic-technical systems. Important building blocks for this are information types and service supported workflows. Our case study is of a call centre-based business called AKC (Apotekets kundcentrum). AKC constitutes a part of the Corporation of Swedish Pharmacies (Apoteket AB). One of its main services offered to Swedish citizens is the handling of incoming questions concerning pharmaceutical issues. We analysed the interactive voice response system at AKC as a starting point for our investigations, and we suggest a more flexible solution. We regard a socio-economic-technical system as an information ecology, which puts the focus on human activities supported by technology. Within these information ecologies, we have found that a Service Oriented Architecture (SOA) can provide the flexible support needed in an environment with a focal point on services. Input from information ecologies and SOA also enables a structured way of managing in- and outsourcing issues. We have also found that if we apply SOA together with our way of modelling a Service Level Agreement (SLA), we can coordinate high-level requirements and support-system requirements.
A central insight in this work is the importance of regarding a socio-economic-technical system as an information ecology in combination with in- and outsourcing issues. This view will prevent a company from being drained of its core competences and core services in an outsourcing situation, which is further discussed in the thesis. By using our combination of SOA and SLA, we can also divide service bundles into separate services and apply economic aspects to them. This enables us to analyse which services are profitable while at the same time meeting important requirements on information quality. As a result, we propose a set of guidelines which represent our approach towards developing quality-assured systems. We also present two main types of validation for service oriented workflows: validation of requirements engineering and validation of business processes.

  • 2459.
    Özcan, Mehmet Batuhan
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    Iro, Gabriel
    Blekinge Institute of Technology, Faculty of Computing, Department of Communication Systems.
    PARAVIRTUALIZATION IMPLEMENTATION IN UBUNTU WITH XEN HYPERVISOR, 2014, Independent thesis Basic level (degree of Bachelor), Student thesis
    Abstract [en]

    Recent trends in technology have seen companies manufacture their products with a number of growing needs in mind: efficiency, cost reduction, less disposal of outdated electronic components, scalable electronic components, and reduced health effects of our daily use of electronics. Virtualization is one important aspect of this: the need to share resources, the need to use less workspace, and the need to reduce the cost of purchase and manufacturing are all addressed by virtualization techniques. For some people, setting up a computer to run different virtual machines at the same time can be difficult, especially if they have no prior basic knowledge of working in a terminal environment, and hiring skilled personnel to do the job can be expensive. The motivation for this thesis is to help people with little or no basic knowledge on how to set up a virtual machine with the Ubuntu operating system on the Xen hypervisor.
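For orientation, a paravirtualized guest on Xen is typically described by a domain configuration file along these lines. This is a hedged sketch, not the thesis's setup: the file name, paths, sizes, and bridge name are placeholders.

```
# /etc/xen/ubuntu-pv.cfg : minimal Xen PV domU configuration (illustrative)
name    = "ubuntu-pv"
memory  = 1024                      # MiB of RAM for the guest
vcpus   = 2
kernel  = "/boot/vmlinuz-guest"     # PV guests boot a kernel supplied by dom0
ramdisk = "/boot/initrd-guest"
disk    = ["phy:/dev/vg0/ubuntu-pv,xvda,w"]
vif     = ["bridge=xenbr0"]         # attach the guest NIC to a dom0 bridge
root    = "/dev/xvda ro"
```

The guest is then started from dom0 with `xl create /etc/xen/ubuntu-pv.cfg`.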

  • 2460.
    Özgür, Turhan
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Comparison of Microsoft DSL Tools and Eclipse Modeling Frameworks for Domain-Specific Modeling in the context of Model-Driven Development, 2007, Independent thesis Advanced level (degree of Master (One Year)), Student thesis
    Abstract [en]

    Today it is recognized by industry that automation of software development leads to increased productivity, maintainability and higher quality. Model-Driven Development (MDD) aims to replace manual software development methods with automated methods, using Domain-Specific Languages (DSLs) to express domain concepts effectively. Major actors in the software industry, Microsoft and IBM, have recognized the need to provide technologies and tools for building DSLs to support MDD. On the one hand, Microsoft is building DSL Tools integrated in Visual Studio 2005; on the other hand, IBM is contributing to the development of the Eclipse Modeling Frameworks (EMF/GEF/GMF); both aim to make the development and deployment of DSLs easier. Software practitioners seek guidelines regarding how to adopt these tools. In this thesis, the author presents the current state of the art in MDD standards and Domain-Specific Modeling (DSM). Furthermore, the author presents the current state of the tools for DSM and performs a comparison of Microsoft DSL Tools and the Eclipse EMF/GEF/GMF frameworks based on a set of evaluation criteria. For the purpose of comparison, the author developed two DSL designers (one using each DSM tool). Based on the experiences gained in developing these DSL designers, the author prepared guidelines regarding how to adopt these tools in existing development environments, as well as their advantages and drawbacks.
