151 - 200 of 271
  • 151. Kao-Walter, Sharon
    Mechanical and Fracture Properties of Thin Al-foil, 2001. Report (Other academic)
    Abstract [en]

    The mechanical and fracture behaviour of thin aluminium foil (with a thickness of 6-9 mm) was studied. Tensile tests and fracture toughness tests with different material thicknesses and different specimen sizes have been performed, and the influence of specimen size on the results is discussed.

  • 152.
    Karim, Omair
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Electronic frequency controller, 2015. Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    The project was performed with Universal Electronics, based in Karachi, Pakistan, at the PRD Lab. Due to the energy crisis, industry is switching to alternative power generators. The main maintenance issue with these generators is the frequent failure of the controller, which has to be imported. The purpose of this project is therefore to make a cheap and efficient Electronic Frequency Controller (EFC) that can be used in almost all kinds of generators. The design was tested in Multisim and then soldered together. To make it functional, the generated frequency was converted to a voltage which was connected to the controller. The controller was joined with the actuator, and the actuator opens to supply the amount of fuel needed to run the generator so that a smooth, constant voltage is maintained. If the input frequency is high, for example when heavy machinery is operating, the actuator widens its opening, and when the frequency is low the actuator opens accordingly. The presentation was successful, but due to the time deadline only a few minor adjustments could be made to make it more effective, as discussed later.

  • 153. Ketola, Katja
    Araby-projektet. En utvärdering av ett lokalt utvecklingsarbete i bostadsområdet Araby i Växjö, 1998. Report (Other academic)
    Abstract [en]

    This report presents an evaluation of local development work in the Araby housing area in Växjö. Araby is a large-scale housing area with approximately 4,900 inhabitants (1995). In 1996 Växjö municipality decided to start a local development programme in Araby. The evaluation examines the local development work with respect to: 1. How the interplay between local and municipal actors has looked; 2. How local development initiatives have been supported by municipal planning; 3. How cooperation within the municipality, and between the municipality and other actors, has worked; 4. How citizen influence has been expressed; and 5. How new forms of planning have been used.

  • 154. Khan, Muhammad Gufran
    et al.
    Nordberg, Jörgen
    Performance Evaluation of RAKE Receiver for Low Data Rate UWB Systems using Multipath Channels for Industrial Environments, 2008. Report (Other academic)
    Abstract [en]

    Since the US FCC passed a resolution in 2002 allowing ultra wideband (UWB) transmissions within a specified unlicensed spectral mask, the interest in UWB technology has grown tremendously. The large bandwidth, low power spectral density (PSD), high multiple access capability and high resolution are some qualities of UWB technology. For UWB communication systems, industrial environments are an important scenario. However, due to the large number of metallic scatterers in the environment, the multipath offered by UWB channels is dense and many multipath components carry significant energy. In this report, the performance evaluation of RAKE receivers for a single-user system operating in non-line-of-sight (NLOS) scenarios in industrial environments is presented. The channels used for the evaluation were measured in a medium-sized industrial environment. In addition, the standard IEEE 802.15.4a channel model for the NLOS industrial environment is used for comparison with the results of the measured channels. The performance is compared for the partial RAKE (PRake) and selective RAKE (SRake) in terms of uncoded bit-error-rate (BER) under the assumption that the channel is known. The effect of different numbers of fingers on the BER of the PRake and SRake is studied. Moreover, the performance of maximal ratio combining (MRC) and equal gain combining (EGC) is compared for the PRake and SRake receivers. The results also provide a performance comparison between different Tx-Rx separations. Finally, based on the simulation results, conclusions are presented considering the performance and complexity issues.
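
    A minimal illustration of the difference between the two combining schemes compared above: in MRC each RAKE finger is weighted by the conjugate of its channel gain, while in EGC the fingers are only co-phased. The sketch below is not the authors' simulation setup; the number of fingers, channel gains and noise level are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative L-finger RAKE combining for a single BPSK symbol
        # (hypothetical values; the report uses measured and IEEE 802.15.4a channels).
        L = 4                                              # number of RAKE fingers
        h = rng.normal(size=L) + 1j * rng.normal(size=L)   # complex gains of the selected paths
        s = 1.0                                            # transmitted BPSK symbol (+1 or -1)
        noise = 0.1 * (rng.normal(size=L) + 1j * rng.normal(size=L))
        r = h * s + noise                                  # per-finger received samples

        # Maximal ratio combining: weight each finger by the conjugate channel gain.
        z_mrc = np.sum(np.conj(h) * r)

        # Equal gain combining: co-phase the fingers but give them equal magnitude.
        z_egc = np.sum(np.exp(-1j * np.angle(h)) * r)

        print(np.sign(z_mrc.real), np.sign(z_egc.real))    # hard BPSK decisions; both should recover s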

  • 155. Khan, Muhammad Gufran
    et al.
    Nordberg, Jörgen
    Recursive Transmitted Reference Receivers for Impulse Radio UWB Systems, 2008. Report (Other academic)
    Abstract [en]

    For the detection of impulse radio (IR) UWB signals, RAKE receivers or transmitted reference (TR) autocorrelation receivers can be used. The complexity of the RAKE receiver increases significantly when the number of received multipath components is large. The TR scheme is a low-complexity alternative as it does not require channel estimation; however, there is a performance loss associated with it. Recursive structures of the conventional TR and averaged TR schemes are presented to improve the detection performance of IR-UWB signals. The performance of the proposed schemes is evaluated over the standard IEEE 802.15.4a multipath channels and compared with conventional TR receivers in terms of uncoded bit-error-rate (BER), assuming that the channel is quasi-static. For the averaged and recursive averaged TR schemes, the TR sequence is also slightly modified. The simulation results show that the proposed schemes perform better than the conventional TR and averaged TR receivers by about 2 dB.
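
    As a rough sketch of the transmitted reference principle mentioned above (not the recursive or averaged schemes proposed in the report), the receiver simply correlates the received reference-pulse window with the received data-pulse window, so no channel estimate is needed. All signal values below are made up.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy transmitted-reference (TR) correlation decision for one bit.
        N = 200                                  # samples per pulse window
        h = rng.normal(size=N)                   # effective channel response seen by both pulses
        b = -1                                   # data bit in {-1, +1}
        noise_ref = 0.3 * rng.normal(size=N)
        noise_dat = 0.3 * rng.normal(size=N)

        r_ref = h + noise_ref                    # received reference pulse window
        r_dat = b * h + noise_dat                # received data pulse window (same channel)

        z = np.dot(r_ref, r_dat)                 # correlate the two windows, no channel estimate
        bit_hat = int(np.sign(z))
        print(bit_hat)                           # should recover b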

  • 156.
    Koppula, Thejendar Reddy
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Regression Testing Goals and Measures: An industrial approach, 2018. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context: When software is modified, regression testing is performed to ensure that the behaviour of the software is not affected by those modifications. Due to frequent modifications, regression testing has become challenging. Although many regression testing techniques have been developed in research, they are not being adopted in industry. This is because of the differences between the regression testing goals and measures used in research and in industry. The context of this study is to identify the regression testing goals and measures from the research and industry perspectives and to find the differences and similarities between the two.

    Objectives: The primary objective of this study is to identify the similarities and differences in regression testing goals and measures from the research and industry perspectives. Additionally, a general list of adapted goals is presented.

    Methods: A mixed-method approach is used for this study. A literature review has been used to identify the regression testing goals and measures in research. A survey is used to identify the regression testing goals and measures in industry. Semi-structured interviews and an online questionnaire are used as data collection methods in the survey. Thematic analysis and descriptive statistics are used as data analysis methods for the qualitative and quantitative data.

    Results: A literature review was conducted using 33 research articles. In the survey, data was collected from 11 semi-structured interviews and validated with 45 responses to an online questionnaire. A total of 6 regression testing goals were identified from the literature review and 8 goals from the survey. The measures used to evaluate these goals are identified and tabulated.

    Conclusions: From the results, we observed the similarities and differences in the regression testing goals and measures in industry and research. There are a few similarities in goals, but the major difference is the priority order of these goals. There are various measures used in research, but far fewer measures are used in industry. The respondents of the survey indicated that there is a need for generic adaptive goals. Further, a general list of goals is presented.

    Keywords: Regression, Regression testing, Goals, Objectives, Measures, Metrics.

  • 157. Kuzniarz, Ludwik
    Proceedings of the 2nd Educators' Symposium, 2007. Report (Other academic)
    Abstract [en]

    Preface. Putting the vision of model-driven development (MDD) of software-based systems, in which development is centred round the manipulation of models, into practice requires not only sophisticated modeling approaches and tools, but also considerable training and education efforts. To make people ready for MDD, its principles and applications need to be taught to practitioners in industry, incorporated in university curricula, and probably even introduced in schools. Industry is striving to improve its practice of software development by adopting MDD. The adoption, however, is determined by the availability of skilled software engineers who are educated and trained in modeling and model-driven development. On the other hand, teaching model-driven development skills slowly influences the practices in industry, with an increasing number of graduates capable of realizing the vision of MDD. The Educators' Symposium at the MoDELS conference, the premier conference devoted to the topic of model-driven engineering of software-based systems, is intended as a forum in which educators and trainers can meet to discuss pedagogy and the use of technology in the classroom, and to share their experience relevant to teaching modeling techniques and model-driven development. The first Educators' Symposium was organized at MoDELS 2005. The leading topic of this symposium is the synergy between industrial needs and education, and vice versa. A special emphasis will be put on the synergy between industrial needs and university education. The papers accepted for presentation address issues of the industrial relevance of education, which was one of the main topics of the symposium, such as:

    • experiences with teaching modeling throughout the software engineering curriculum
    • using project-based learning as a vehicle for teaching modeling
    • teaching modeling through student projects where parts of tools are implemented
    • teaching modeling in the context of J2EE applications
    • using an artificially created software development laboratory as a means of enhancing the motivation for learning modeling

    All papers identify model-driven software development as a necessary skill for future software developers. The diversity of authors from various countries on two continents provides an opportunity to compare industrial views on modeling, from modeling being a desired skill in industry to modeling being only a surplus (while the foreseen competence was in the tools and technologies). Ludwik Kuzniarz, Symposium Chair

  • 158. Kuzniarz, Ludwik
    et al.
    Huzar, Zbigniew
    Reggio, Gianna
    Sourrouille, Jean Louis
    Staron, Miroslaw
    Workshop on Consistency Problems in UML-based Software Development II, 2003. Report (Other academic)
    Abstract [en]

    Workshop materials of the Second Workshop on Consistency Problems in UML-based Software Development. The workshop is part of the Sixth International Conference on The Unified Modeling Language <<UML>> 2003

  • 159. Kuzniarz, Ludwik
    et al.
    Sourrouille, Jean Louis
    Staron, Miroslaw
    Proceedings of the 2nd Workshop on Quality in Modeling, 2007. Report (Other academic)
    Abstract [en]

    Preface. Quality constitutes an important topic in software engineering and becomes an essential issue in studies on using models in software engineering. Software quality management is already widely researched and approached from multiple perspectives and viewpoints. However, the introduction of a new paradigm in software development, namely Model Driven Development (MDD), raises new challenges in software quality management, and as such should be given special attention. The goal of this workshop was to gather researchers and practitioners interested in the emerging issues of quality in the context of MDD. The workshop is intended to provide a forum for the presentation and discussion of emerging issues related to software quality in MDD. The workshop builds upon the experience and discussions of the previous workshop on Quality in Modeling and a series of workshops on model consistency held annually at the UML and EC-MDA conferences. The intention of this year's workshop was to extend the scope of the previous activities and to contribute with possible post-workshop joint activities. During last year's workshop at MoDELS 2006 there was a consensus that the issues of quality of models need to be researched from various perspectives, both industry practice and academic research; hence the industry perspective is intended to receive particular attention in this edition. The workshop is divided into two parts: a presentation part, in which the accepted paper contributions were presented and discussed, and a working part, in which a guided discussion was conducted aimed at the elaboration of a common quality model. The presentation part is dedicated mainly to the presentation of the accepted paper contributions, with ample time allocated for questions and discussion, and is structured into two sessions: Quality frameworks and models, and Quality in practice. The rationale behind the working part was to carry out a prearranged discussion on a structured approach to quality in modeling, aimed at arriving at a common quality model. A short introduction is to be presented, followed by concise position statements from all participants addressing three questions related to model quality: What qualities of models and modeling matter? How do they relate to each other? How can they be measured? The statements are to be based on an existing quality framework, sent to the participants, aimed at unifying and structuring the answers to the questions. Ample time for discussion based on the statements was allocated. The contributions will be combined into a common quality model which will be published in the workshop results report.

  • 160. Kuzniarz, Ludwik
    et al.
    Sourrouille, Jean Louis
    Staron, Miroslaw
    Chaudron, Michel
    Straeten, Ragnhild van der
    Proceedings of the 1st Workshop on Quality in Modeling, 2007. Report (Other academic)
    Abstract [en]

    Preface. Quality assessment and assurance constitute an important part of software engineering. The issues of software quality management are widely researched and approached from multiple perspectives and viewpoints. The introduction of a new paradigm in software development, namely Model Driven Development (MDD) and its variations (e.g., MDA [Model Driven Architecture], MDE [Model Driven Engineering], MBD [Model Based Development], MIC [Model Integrated Computing]), raises new challenges in software quality management, and as such should be given special attention. In particular, the issues of early quality assessment, based on models at a high abstraction level, and of building (or customizing existing) prediction models for software quality based on model metrics are of central importance for the software engineering community. The workshop is a continuation of a series of workshops on consistency that have taken place during the subsequent annual UML conferences and recently MDA-FA. The idea behind this workshop is to extend the scope of interests and address a wide spectrum of problems related to MDD. It is also in line with the overall initiative of the shift from UML to MoDELS. The goal of this workshop is to gather researchers and practitioners interested in the emerging issues of quality in the context of MDD. The workshop is intended to provide a premier forum for discussions related to software quality and MDD. The aims of the workshop are: presenting ongoing research related to quality in modeling in the context of MDD, and defining and organizing issues related to quality in MDD. The format of the workshop consists of two parts: presentation and discussion. The presentation part is aimed at reporting research results related to quality aspects in modeling. Seven papers were selected for presentation out of 16 submissions; the selected papers are included in these proceedings. The discussion part is intended to be a forum for the exchange of ideas related to understanding quality and approaching it in a systematic way. Ludwik Kuzniarz, Workshop chair

  • 161. Kuzniarz, Ludwik
    et al.
    Staron, Miroslaw
    Hellman, Erik
    Extracting information about domain structure from DAML+OIL encoded ontologies into UML, 2002. Report (Other academic)
    Abstract [en]

    The report presents and elaborates on the details of the knowledge acquisition process from ontologies into domain models. It identifies the knowledge about domain structure that already exists in the form of ontologies, and gives the justification for why this knowledge is important from the domain-model perspective. The general idea, along with a detailed description and implementation of the process, is presented. As the process is based on various XML-based technologies, these are shown and described. A small example is introduced to illustrate the practical usage of the method.

  • 162. Kågström, Simon
    et al.
    Grahn, Håkan
    Lundberg, Lars
    The Design and Implementation of Multiprocessor Support for an Industrial Operating System Kernel, 2005. Report (Other academic)
    Abstract [en]

    The ongoing transition from uniprocessor to multiprocessor computers requires support from the operating system kernel. Although many general-purpose multiprocessor operating systems exist, there is a large number of specialized operating systems which require porting in order to work on multiprocessors. In this paper we describe the multiprocessor port of a cluster operating system kernel from a producer of industrial systems. Our initial implementation uses a giant locking scheme that serializes kernel execution. We also employed a method in which CPU-local variables are placed in a special section mapped to per-CPU physical memory pages. The giant lock and CPU-local section allowed us to implement an initial working version with only minor changes to the original code, although the giant lock and kernel-bound applications limit the performance of our multiprocessor port. Finally, we also discuss experiences from the implementation.

  • 163. Lagö, Thomas L
    Frequency Analysis of Helicopter Sound in the AS332 Super Puma, 1996. Report (Other academic)
    Abstract [en]

    This technical report describes a series of measurements performed on an AS332 Super Puma, MKII (HKP10) helicopter. The measurements are part of a research project, A New Generation Active Headsets and its Psychological Effects, financed by the KKS board (Board of Knowledge and Competence). The project participants are: Lindholmen Development, Hellberg Safety, Active Control and the University of Karlskrona/Ronneby. CelsiusTech has recently joined the project as an industrial partner. The Air Force base at F17 in Kallinge and the AMI group in Ronneby are involved as evaluation groups. There are substantial noise levels in helicopters, especially at low frequency. These noise levels are normally not harmful to the ear. However, the low frequency content masks the speech. For this reason, pilots tend to set the intercom system at maximum sound level, producing potentially damaging sound levels for the human ear. Dr. P-A Hellström at Lindholmen Development has measured almost 100 dBA inside the ear canal when the intercom system is in use. This high sound level exposes the ear to fatigue and hearing loss. The background noise is the key reason for the problem, although it is not the key source of sound damage to the ear. The frequency content in the masking background sound is of great importance. It was thus important to investigate if the sound consisted of pure tones or if it was more broadband in nature. The dominant sound sources needed to be identified and the number of harmonics for each source established. It was also important to investigate if there was a strong connection between the structure borne and the air borne sound.

  • 164. Lagö, Thomas L
    et al.
    Olsson, Sven
    Various Signal Processing Techniques used on non-Stationary Acoustic Doppler Current Data. Volume I: XI, 1999. Report (Other academic)
    Abstract [en]

    Chapter 1 - Background deals with the process of analyzing the backscattering signal transmitted from an ultrasonic transducer [5][28]. The narrowband sinusoidal burst signal is Doppler-shifted due to the current, and this information is converted into current [14]. The traditional mathematical model for this Doppler process is based on the assumption that the backscattering time signal is Gaussian, due to the Rayleigh backscattering amplitude assumption with random phase [23][27]. This in turn rests on the assumption that the backscattering is due to many randomly distributed bubbles of about equal size. It is reasonable to question whether this assumption holds for real-life signals [1][7][8]. Therefore, this work has concentrated on real-life data and has investigated whether the Gaussian assumption holds for the background noise and the received Doppler signal. It has been found that this is not generally the case.

    Chapter 2 - Spectral Analysis of Data provides analysis of the spectral content of the data using tools with different properties. The reason is the difficulty of distinguishing real spectral peaks in the data from peaks coming from variance in the estimate [2][3]. Therefore, 3D plots have been generated of current data from four locations around the world with very different environments. In addition, a non-linear filtering method named Multiple Peak Count Analysis (MPCA) has been developed. This analysis is most important in understanding whether more than one Doppler signal component (current) is active in the measurement cell analyzed. Using these two methods, which rest on different foundations, it is possible to determine whether, and often how many, Doppler signals are active in one cell. This corresponds to how many spectral peaks the data contains for each observation interval.

    Chapter 3 - Statistical Measures provides an analysis of the data using classical statistical tools such as histograms, normal probability plots, chi-square tests and variance analysis such as ANOVA (ANalysis Of Variance) [21][26]. These tools help in understanding whether it is possible to use a Gaussian approach for the signal model, or whether some other distribution could be better suited. Data from all four locations are used in the analysis and key results are presented. In this chapter, the background noise is also analyzed and presented using the above statistical measures.

    Chapter 4 - Higher Order Moments provides a description of higher order moments using skewness (γ1) and kurtosis (γ2). These are important tools for the statistical behavior of the data analysis. The investigation of the higher order moments for the time series of the three ADCMs does not contradict the proposed signal model. Furthermore, the real-world signals converge very much to what can be expected if this new model is adequate for this kind of signal. The conclusion is, then, that the model holds for this test. The data is found not to obey the Gaussian signal model in general. This is particularly true when the water is troubled. A comparison with real data from the four different locations presented above has been performed, and all data show the same trends: the data cannot be modeled using Gaussian statistical properties. The 3D plots presented earlier show that there are often several current vectors active in a cell at the same time, and this has a strong effect on the statistics of the time signal, which is quantified in this chapter.

    Chapter 5 - Comparison of Estimators provides an extensive comparison between the covariance method and the Symmiktos Method™. Simulated and real data from all four locations have been used in the comparison. The comparison is presented in several formats to make conclusions easier. It is clear that the Symmiktos Method generates quite different results from those of the covariance method. On simulated data, the Symmiktos Method is much closer to the simulated truth. However, in real life we do not know the answer, so it is impossible to be sure which estimator is more accurate. Based on the results from the simulated signals, noticing that the variance is lower when using the Symmiktos Method, and adding the results from the signal model, it is fairly safe to argue that the Symmiktos Method is a more robust and accurate method for Doppler frequency estimation on this type of data.

    Chapter 6 - Estimator Programs gives a brief background on the implementation of the main Matlab programs used in the calculations, and the programs most important for understanding the work are listed. The programs listed are not only the statistical programs but also the programs used for testing a new signal model and for comparing the covariance method with the Symmiktos Method.

    Chapter 7 - Description of Used Data Sets gives a brief background on the data sets used in this research. Key data from the four locations are given, as well as the background parameters for when and how the data was collected and the main observations made at the time of data collection.

    Chapter 8 - Summary and Conclusions provides a summary of the key results from the four different locations. Each method is commented on individually and the main effects are discussed.

    Chapter 9 - References lists all the references used.

    Chapter 10 - Listing of Measurement Plots lists all the plots. The plots consist of about 2000 pages divided over 11 volumes.
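
    The covariance method compared in Chapter 5 is, in its simplest pulse-pair form, an estimate of the Doppler shift from the phase of the lag-one autocovariance of the complex backscatter signal. The sketch below uses an invented sampling rate and Doppler frequency and does not reproduce the Symmiktos Method, which is not described in the abstract.

        import numpy as np

        rng = np.random.default_rng(2)

        # Pulse-pair ("covariance") Doppler frequency estimate on a toy complex signal.
        fs = 1000.0                       # sampling rate in Hz (made-up value)
        f_doppler = 37.0                  # true Doppler shift in Hz (made-up value)
        n = np.arange(512)
        x = np.exp(2j * np.pi * f_doppler * n / fs) + 0.2 * (
            rng.normal(size=n.size) + 1j * rng.normal(size=n.size)
        )

        # Lag-one autocovariance; its phase angle encodes the mean Doppler frequency.
        r1 = np.mean(x[1:] * np.conj(x[:-1]))
        f_hat = np.angle(r1) * fs / (2 * np.pi)
        print(f_hat)                      # close to the true 37 Hz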

  • 165. Landqvist, Ronnie
    et al.
    Mohammed, Abbas
    The Projection Approximation Subspace Tracking Algorithm Applied to Whitening and Independent Component Analysis in Wireless Communications, 2005. Report (Other academic)
    Abstract [en]

    In Blind Source Separation (BSS) the objective is to extract source signals from their linear mixtures. Algorithms developed for Independent Component Analysis (ICA) have proven useful in the field of BSS. The Projection Approximation Subspace Tracking with Deflation (PASTD) algorithm, originally developed for subspace tracking, has been extended by using a nonlinear cost function so that it may be used for ICA/BSS. Such algorithms most often require the input signals to be white. In this report we extend the PASTD algorithm so that it can be used to whiten signals as a pre-processing step before ICA. The performance of the ICA-algorithm is then evaluated for different choices of whitening algorithms. The algorithms are also evaluated for Binary Phase Shift Keying (BPSK) modulated data over Rayleigh fading channels usually encountered in wireless communications.
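
    For readers unfamiliar with the whitening step discussed above, the sketch below shows a conventional batch (eigenvalue-based) whitening of two mixed signals before ICA. It is only a stand-in for the adaptive PASTD-based whitening proposed in the report; the mixing matrix and source signals are invented.

        import numpy as np

        rng = np.random.default_rng(3)

        # Batch whitening of mixed signals as the pre-processing step before ICA.
        S = rng.laplace(size=(2, 5000))            # two non-Gaussian source signals
        A = np.array([[1.0, 0.6], [0.4, 1.0]])     # hypothetical mixing matrix
        X = A @ S                                  # observed mixtures

        X = X - X.mean(axis=1, keepdims=True)
        C = np.cov(X)                              # sample covariance of the mixtures
        d, E = np.linalg.eigh(C)
        W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T    # whitening matrix, C^(-1/2)
        Z = W @ X                                  # whitened signals: cov(Z) is close to I

        print(np.round(np.cov(Z), 2))              # approximately the identity matrix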

  • 166. Larsson, Martin
    et al.
    Johansson, Sven
    Håkansson, Lars
    Claesson, Ingvar
    Performance Evaluation of a Module Configured Active Silencer for Robust Active Noise Control of Low Frequency Noise in Ducts, 2008. Report (Other academic)
    Abstract [en]

    Low noise level is an essential feature when installing ventilation systems today. Since the passive silencers traditionally used to attenuate ventilation noise tend to become bulky, impractical, and expensive when designed for low frequency attenuation, other solutions for the reduction of the low frequency duct noise often present in ducts are of interest. Active noise control (ANC) is a well known method for attenuating low frequency noise and much research has been performed to successfully apply ANC to duct noise. To insure reliable operation and desirable levels of attenuation when applying ANC to duct noise, it is of highest importance to be able to suppress the contamination of the microphone signals due to the turbulent pressure fluctuations arising as the microphones are exposed to the airflow in the duct. The work presented in this report is concerned with analysis of the influence of the turbulence induced noise on the adaptive algorithm in the ANC system, and design of microphone installations which produce sufficient turbulence suppression while also meeting industrial requirements. These requirements are, for example, that the installations should be based on standard ventilation parts, and that they should be easily installed and maintained. Furthermore, results concerning the performance of an ANC system with different microphone installations are presented. Some of the results were obtained at an acoustic laboratory according to an ISO standard. The attenuation of duct noise achieved with ANC was approximately 15-25 dB between 50-315 Hz, even for airflow speeds up to 20 m/s.

  • 167. Larsson, Stefan
    Problematisering av vindkraftens regelverk: en pilotstudie, 2009. Report (Other academic)
    Abstract [en]

    This is a pilot study that problematises the regulatory framework for wind power in Sweden from a socio-legal and planning-law perspective. In the autumn of 2008 the Environmental Process Inquiry presented a report on making the permit review of wind power more efficient (SOU 2008:86). The background was demands for a faster and simpler process from project planning to the erection of wind turbines, raised in step with an increased national commitment to the expansion of wind power. The report gives proposals from a legal-technical perspective. The pilot study shows, however, that there are shortcomings in the empirical material on which the proposal is based, with regard to its systematic character and representativeness. For some questions, for example concerning case law in the lower instances, there is no clear picture of the state of knowledge. Without a sound, empirically grounded knowledge base it is reasonably also difficult to propose changes that will with certainty lead to a better review, or even a faster one. The inquiry's report represents a move towards more calculating and centralised decision-making at the expense of the more local and deliberative. The report shows how local values are regarded as a problem in the light of the national expansion policy. This makes the expansion of wind power interesting in principle in comparison with the roll-out of infrastructure for 3G, considering how a national policy is handled and depends on local implementation, and the tension that exists between the two levels. This leads to what the pilot study identifies as a core question, namely who should decide over the planning of the landscape and what knowledge such decisions should be based on. The report claims that the municipalities do not lose influence over land use, while this claim is not clearly substantiated; on the contrary, there are grounds for believing that this is exactly the case. The pilot study thus indicates that the question is of a larger, principled political character that should not be resolved solely through legal-technical changes in the name of efficiency. The pilot study is based on interviews with key persons in Swedish wind power and on an analysis of legislation, preparatory works and case law, with a focus on the Environmental Process Inquiry's report SOU 2008:86. The pilot study proposes further studies, for example producing a systematic knowledge base on the review system, among other things for use in drawing up a typology of obstacles concerning what takes time, in order to distinguish what should take time from what should not take time in the review. At a more general level one should also examine what it means that special solutions and small independent revisions are made to the parts of a planning system that itself depends on its parts working well together. The pilot study was commissioned by Blekinge Tekniska Högskola through Lars Emmelin, professor at the unit for spatial planning, and carried out by Stefan Larsson, licentiate of technology in spatial planning, lawyer and sociologist of law.

  • 168. Larsson, Sven-Olof
    VPC Management in ATM Networks, 1998. Report (Other academic)
    Abstract [en]

    The goals of VPC management functions are to reduce the call blocking probability and to increase responsiveness, stability, and fairness. Telecommunication traffic experiences variations in the number of calls per time unit due to office hours, inaccurate forecasting, quick changes in traffic load (e.g. New Year's Eve), and changes in the types of traffic (as in the introduction of new services); this can be met by adaptive capacity reallocation and topology reconfiguration. Brief explanations of the closely related concepts of effective bandwidth and routing are given, together with an overview of ATM. Fundamentally different approaches to VPC capacity reallocation are compared and their pros and cons are discussed. Finally, a further development of one of the approaches is described.

  • 169. Lassing, Nico
    et al.
    Bengtsson, PerOlof
    Vliet, Hans van
    Bosch, Jan
    Experiences with SAA of Modifiability, 2000. Report (Other academic)
    Abstract [en]

    Modifiability is an important quality for software systems, because a large part of the costs associated with these systems is spent on modifications. The effort, and therefore cost, required for these modifications is largely determined by a system's software architecture. Analysis of software architectures is therefore an important technique to achieve modifiability and reduce maintenance costs. However, few techniques for software architecture analysis currently exist. Based on our experiences with software architecture analysis of modifiability, we have developed an analysis method consisting of five steps. In this paper we report on our experiences with this method. We illustrate our experiences with examples from two case studies of SAA of modifiability. These case studies concern a system for mobile positioning at Ericsson Software Technology and a system for freight handling at DFDS Fraktarna. Our experiences are related to each step of the analysis process. In addition, we made some observations on SAA of modifiability in general.

  • 170. Lennerstad, Håkan
    Commensurable and rational triangles, 2007. Report (Other academic)
    Abstract [en]

    One may ask which property the equilateral, the right isosceles, the half equilateral, and the two golden triangles, with angles π/5, 2π/5, 2π/5 and π/5, π/5, 3π/5, have in common. One answer is that their angles are commensurable with each other -- such triangles are commensurable. We investigate properties of this class of triangles, which is a countable subset of the entire class of triangles -- we do not distinguish between similar triangles. It can naturally be endowed with a family structure by integer triples. The equilateral is the only member of the first generation, and the other triangles mentioned above populate the first generations. A formula for the number of non-similar triangles that can be formed by triples of corners in a regular n-polygon is calculated, which gives the number of commensurable triangles at each generation. Three "metatriangles" are described -- so called because each possible triangle is represented as a point in each of them. The set of right triangles forms a height in one of the metatriangles. The eye is the point of a metatriangle in the same metatriangle. In the second part of this report, triangles are studied by side length. A rational triangle is a triangle where all sides and all heights are rational numbers. We show that the right rational triangles are the Pythagorean triangles, and that each non-right rational triangle consists of two Pythagorean triangles. Almost all triangles are irrational. It turns out that no Pythagorean triangle is commensurable. We prove that the only triangle with both commensurable angles and commensurable sides is the equilateral triangle.
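
    Spelled out, the five triangles named at the start of the abstract are commensurable because each angle triple is an integer multiple of a single common angle (each triple sums to π):

        \begin{align*}
        \text{equilateral:}      &\quad (1,1,1)\cdot \tfrac{\pi}{3}, \\
        \text{right isosceles:}  &\quad (1,1,2)\cdot \tfrac{\pi}{4}, \\
        \text{half equilateral:} &\quad (1,2,3)\cdot \tfrac{\pi}{6}, \\
        \text{golden triangles:} &\quad (1,2,2)\cdot \tfrac{\pi}{5}
                                  \quad\text{and}\quad (1,1,3)\cdot \tfrac{\pi}{5}.
        \end{align*}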

  • 171.
    Lennerstad, Håkan
    Blekinge Institute of Technology, Department of Telecommunications and Mathematics.
    The Geometry of the Directional Display, 1996. Report (Refereed)
    Abstract [en]

    The directional display is a new kind of display which can contain and show several images -which particular image is visible depends on the viewing direction. This is achieved by packing information at high density on a surface, by a certain back illumination technique, and by explicit mathematical formulas which make it possible to automatize the printing of a display to obtain desired effects. The directional dependency of the display can be used in several different ways. One is to achieve three-dimensional effects. In contrast to that of holograms, large size and full color here involve no problems. Another application of the basic technique is to show moving sequences. Yet another is to make a display more directionally independent than today’s displays. Patent is pending for the invention in Sweden.

  • 172.
    Lennerstad, Håkan
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    The n-dimensional Stern-Brocot tree, 2012. Report (Other academic)
    Abstract [en]

    The n-dimensional Stern-Brocot tree consists of all sequences (p₁, ..., p_{n}) of positive integers with no common factor. The relatively prime sequences can be generated branchwise from each other by simple vector summation, starting with an ON-base, and controlled by a generalized Euclidean algorithm. The tree induces a multiresolution partition of the first quadrant of the (n-1)-dimensional unit sphere, providing a direction approximation property of a sequence by its ancestors. Two matrix representations are introduced, in both of which a matrix contains the parents of a sequence. From one of them follows the isomorphism of a subtree to the entire tree of dimension equal to the number of parents of the top sequence. A form of Fibonacci sequences turns out to be the sequences of fastest growing sums. The construction can be regarded as an n-dimensional continued fraction, and it may invite further n-dimensional number theory.
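
    The classical two-dimensional case may help fix ideas: in the ordinary Stern-Brocot tree every positive fraction in lowest terms is generated exactly once, and each child is the component-wise (vector) sum of its two bounding ancestors. The sketch below shows only this n = 2 case; the n-dimensional generation rule of the report is not reproduced.

        from fractions import Fraction

        # Each node is represented by its bounding ancestors (pl, ql) and (pr, qr);
        # the node itself is the mediant (pl + pr, ql + qr), a component-wise vector sum.
        def stern_brocot_level(depth):
            """Return all fractions at the given depth of the Stern-Brocot tree."""
            frontier = [((0, 1), (1, 0))]           # ancestors of the root 1/1
            for _ in range(depth):
                nxt = []
                for (pl, ql), (pr, qr) in frontier:
                    m = (pl + pr, ql + qr)          # mediant of the two ancestors
                    nxt.append(((pl, ql), m))       # left child
                    nxt.append((m, (pr, qr)))       # right child
                frontier = nxt
            return [Fraction(pl + pr, ql + qr) for (pl, ql), (pr, qr) in frontier]

        print(stern_brocot_level(2))   # depth 2 gives the fractions 1/3, 2/3, 3/2 and 3/1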

  • 173. Lennerstad, Håkan
    et al.
    Lundberg, Lars
    An Optimal Execution Time Estimate of Static versus Dynamic Allocation in Multiprocessor Systems, 1992. Report (Other academic)
    Abstract [en]

    Consider a multiprocessor with $k$ identical processors, executing parallel programs consisting of $n$ processes. Let $T_s(P)$ and $T_d(P)$ denote the execution times for the program $P$ with optimal static and dynamic allocations respectively, i.e. allocations giving minimal execution time. We derive a general and explicit formula for the maximal execution time ratio $g(n,k)=\max T_s(P)/T_d(P)$, where the maximum is taken over all programs $P$ consisting of $n$ processes. Any interprocess dependency structure for the programs $P$ is allowed, only avoiding deadlock. Overhead for synchronization and reallocation is neglected. Basic properties of the function $g(n,k)$ are established, from which we obtain a global description of the function. Plots of $g(n,k)$ are included. The results are obtained by investigating a mathematical formulation. The mathematical tools involved are essentially tools of elementary combinatorics. The formula is a combinatorial function applied to certain extremal matrices corresponding to extremal programs. It is mathematically complicated but rapidly computed for reasonable $n$ and $k$, in contrast to the NP-completeness of the problems of finding optimal allocations.

  • 174.
    Lennerstad, Håkan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Lundberg, Lars
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Generalizations of the floor and ceiling functions using the Stern-Brocot tree, 2006. Report (Refereed)
    Abstract [en]

    We consider a fundamental number theoretic problem where practical applications abound. We decompose any rational number a/b into c ratios as evenly as possible while maintaining the sum of numerators and the sum of denominators. The minimum and maximum of the ratios give rational estimates of a/b from below and from above. The case c=b gives the usual floor and ceiling functions. We furthermore define the max-min difference, which is zero iff c≤GCD(a,b), quantifying the distance to relative primality. A main tool for investigating the properties of these quantities is the Stern-Brocot tree, in which all positive rational numbers occur in lowest terms and in size order. We prove basic properties, such as that there is a unique decomposition that gives both the minimum and the maximum. It turns out that this decomposition contains at most three distinct ratios. The problem has arisen in a generalization of the 4/3-conjecture in computer science.
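
    The special case c = b mentioned above can be made concrete with a few lines of code: splitting the numerator a into b unit-denominator parts as evenly as possible uses only the values floor(a/b) and ceil(a/b), which is how the usual floor and ceiling reappear. The general case with arbitrary c, and the Stern-Brocot machinery, are not reproduced in this sketch.

        # Special case c = b: split a/b into b ratios with denominator 1,
        # keeping the sum of the numerators equal to a.
        def even_unit_split(a, b):
            """Split integer a into b integer parts that are as even as possible."""
            q, r = divmod(a, b)
            parts = [q + 1] * r + [q] * (b - r)    # r parts of ceil(a/b), b - r parts of floor(a/b)
            assert sum(parts) == a
            return parts

        print(even_unit_split(17, 5))   # [4, 4, 3, 3, 3]: min = floor(17/5), max = ceil(17/5)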

  • 175. Lennerstad, Håkan
    et al.
    Lundberg, Lars
    Optimal Combinatorial Functions Comparing Multiprocess Allocation Performance in Multiprocessor Systems, 1993. Report (Other academic)
    Abstract [en]

    For the execution of an arbitrary parallel program P, consisting of a set of processes, we consider two alternative multiprocessors. The first multiprocessor has q processors and allocates parallel programs dynamically, i.e. processes may be reallocated from one processor to another. The second employs cluster allocation with k clusters and u processors in each cluster - here processes may be reallocated within a cluster only. Let $T_d(P,q)$ and $T_c(P,k,u)$ be the execution times for the parallel program P with optimal allocations. We derive a formula for the program-independent performance function $$G(k,u,q)=\sup_{P} \frac{T_c(P,k,u)}{T_d(P,q)}.$$ Hence, with optimal allocations, the execution of $P$ can never take more than a factor $G(k,u,q)$ longer time with the second multiprocessor than with the first, and there exist programs showing that the bound is sharp. The supremum is taken over all parallel programs consisting of any number of processes. Any interprocess dependency structure is allowed for the parallel programs, except deadlock. Only overhead for synchronization and reallocation is neglected. We further present optimal formulas which exploit a priori knowledge of the class of parallel programs intended for the multiprocessor, thus resulting in sharper optimal bounds. The function g(n,k,u,q) is the above maximum taken over all parallel programs consisting of n processes. The function s(n,v,k,u) is the same maximum, with q=n, taken over all parallel programs of $n$ processes which have a degree of parallelism characterized by a certain parallel profile vector v=(v_1,...,v_n). The functions can be used in various ways to obtain optimal performance bounds, aiding in multiprocessor architecture decisions. An immediate application is the evaluation of heuristic allocation algorithms. It is well known that the problems of finding the corresponding optimal allocations are NP-complete. We thus in effect present a methodology to obtain optimal control of NP-complete scheduling problems.

  • 176. Lennerstad, Håkan
    et al.
    Lundberg, Lars
    Optimal Worst Case Formulas Comparing Cache Memory Associativity, 1995. Report (Other academic)
    Abstract [en]

    Consider an arbitrary program $P$ which is to be executed on a computer with two alternative cache memories. The first cache is set associative or direct mapped. It has $k$ sets and $u$ blocks in each set; this is called a $(k,u)$-cache. The other is a fully associative cache with $q$ blocks - a $(1,q)$-cache. We present formulas optimally comparing the performance of a $(k,u)$-cache to that of a $(1,q)$-cache for worst-case programs. Optimal mappings of the program variables to the cache blocks are assumed. Let $h(P,k,u)$ denote the number of cache hits for the program $P$, when using a $(k,u)$-cache and an optimal mapping of the program variables of $P$ to the cache blocks. We establish an explicit formula for the quantity $$\inf_P \frac{h(P,k,u)}{h(P,1,q)},$$ where the infimum is taken over all programs $P$ which contain $n$ variables. The formula is a function of the parameters $n,k,u$ and $q$ only. We also deduce a formula for the infimum taken over all programs of any number of variables; this formula is a function of $k,u$ and $q$. We further prove that programs which are extremal for this minimum may have any hit ratio, i.e. any ratio $h(P,1,q)/m(P)$. Here $m(P)$ is the total number of memory references for the program P. We assume the commonly used LRU replacement policy, that each variable can be stored in one memory block, and that each variable is free to be stored in any block. Since the problems of finding optimal mappings are NP-hard, the results provide optimal bounds for NP-hard quantities. The results on cache hits can easily be transformed into results on access times for different cache architectures.
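
    To make the two cache organisations concrete, the sketch below counts LRU hits for a (k, u)-cache and for the fully associative (1, q)-cache on a toy reference string. It uses a fixed modulo mapping of variables to sets, whereas the formulas in the report assume optimal mappings, so the counts are only illustrative.

        from collections import OrderedDict

        # LRU hit counting for a (k, u)-cache: k sets, u blocks per set.
        # A fully associative cache with q blocks is the special case (1, q).
        def lru_hits(refs, k, u):
            sets = [OrderedDict() for _ in range(k)]   # each set keeps its blocks in LRU order
            hits = 0
            for v in refs:
                s = sets[v % k]                        # set index from a simple modulo mapping
                if v in s:
                    hits += 1
                    s.move_to_end(v)                   # refresh LRU position on a hit
                else:
                    if len(s) >= u:
                        s.popitem(last=False)          # evict the least recently used block
                    s[v] = True
            return hits

        refs = [0, 1, 2, 0, 1, 3, 0, 2, 1, 3] * 10     # made-up reference string over 4 variables
        print(lru_hits(refs, k=2, u=2), lru_hits(refs, k=1, u=4))   # (k,u)-cache vs (1,q)-cache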

  • 177.
    Lennerstad, Håkan
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Mathematics and Natural Sciences.
    Olteanu, Constanta
    Åtta IKT-projekt för matematiken i skolan: empiri och analys, 2012. Report (Other (popular science, discussion, etc.))
    Abstract [en]

    This report presents empirical material and analysis from a review of eight projects that received funding from the Swedish National Agency for Education (Skolverket) to apply ICT in school mathematics teaching. The empirical material is organised according to activity theory, which foregrounds the relations between people, environment, activities and goals. The material presented consists largely of compiled statements from teachers and pupils in the interviews, organised to illuminate the different relations of the activity. This means that the presentation is largely on the participants' terms, and in their language. A large number of concrete conclusions emerge from this material. One of them is that class sets of computers are seldom successful, because technical support was often insufficient and because the teachers cannot know how much the pupils use the computers for non-school activities. These problems do not exist for interactive whiteboards. The whiteboards instead appear as a social tool that makes exchange and dialogue about the subject easier to bring about. In several cases it emerged indirectly, but still clearly, that the training on the technology had been insufficient, even though the teachers did not express any clear dissatisfaction with it. The whiteboard increased the teachers' motivation for collaboration. A forward-looking conclusion is that a group of teachers with a functioning collaboration and good didactic, subject, leadership and relational competences has, with an interactive whiteboard, the possibility of getting further with pupils' goal attainment.

  • 178.
    Leonardsson, Mathilda
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Spatial Planning.
    Petersson, Josefine
    Blekinge Institute of Technology, Faculty of Engineering, Department of Spatial Planning.
    Ett mål i rätt riktning?: En studie om översiktsplanering mot miljömålet God bebyggd miljö, 2018. Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    The environmental objective God bebyggd miljö (A Good Built Environment) is one of the 16 environmental objectives on which Sweden's environmental work is to be based for ecologically sustainable societal development. Boverket (the Swedish National Board of Housing, Building and Planning) is the authority responsible for the objective and holds that spatial planning, with a focus on the comprehensive plan, is the principal means of working with it. Boverket also assesses that the objective will not be reached by 2020 with existing policy instruments (Boverket 2018j). Against this background, this study examines how comprehensive plans handle the objective God bebyggd miljö and the problems connected with implementing objectives.

    The study analyses how four different municipalities treat the objective God bebyggd miljö, through its specifications, in their respective comprehensive plans. Through a qualitative content analysis, the comprehensive plans have been examined to determine whether the municipalities implement the objective through visionary or applicable position statements, and how extensive the implementation is. From these results it can be seen that the requirements that implementation research places on goal formulation do not appear to be met. These requirements are that a goal must be understood, and that it must be possible and desired to carry it out, in order to lead to implementation.

    Based on the results of the study, the main problem concerns understanding the objective. The comprehensive plans show a predominantly visionary formulation of position statements linked to the various specifications, which do not show the clarity required for them to be implemented. This is assumed to stem from a difficulty in understanding the objective God bebyggd miljö and its specifications. These visionary statements nevertheless serve a political function, since they legitimise the planning and the proposed development. The study shows that the municipalities prioritise four of the nine specifications examined more highly than the others. There appears to be a will to work with the content of the objective, but on the basis of this study it can be questioned to what extent, and what the municipalities' motives are. There is an indication that the objective is used as a whole as a rhetorical concept, in order to benefit from the indisputably positive sustainable development it implies, even though only certain parts are given actual steering. This would be one possible explanation of why the objective is not being reached.

  • 179.
    Linde, Peter
    Blekinge Institute of Technology, The Library.
    Riktlinjer för publicering för forskare vid Blekinge Tekniska Högskola: Reviderade november 2018, 2018. Report (Other academic)
  • 180. Lindström, Fredric
    et al.
    Dahl, Mattias
    Claesson, Ingvar
    A Computational Efficient Method for Assuring Full Duplex Feeling in Hands-free Communication, 2003. Report (Other academic)
    Abstract [en]

    This report proposes a method for obtaining a satisfying "full-duplex feeling" in hands-free communication units at low computational cost. The proposed method uses a combination of an acoustic echo cancellation unit and an adaptive gain unit. The core of the method is to process the speech signal in two separate frequency bands and to treat these bands in different manners. Acoustic echoes in the low-frequency part of the signal are cancelled by means of an acoustic echo cancellation unit, while acoustic echoes in the high-frequency part are suppressed by an adaptive gain unit. The proposed method is well suited when extending the bandwidth of an existing hands-free phone. A real-time implementation of a conventional hands-free phone is compared with a real-time implementation according to the proposed method, where the latter is an extended version of the former. The evaluation of the two implementations shows that the proposed method can be used to increase the quality, i.e. extended bandwidth, of a hands-free phone with only a small increase in computational demand.
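
    The low-frequency path described above is a standard adaptive echo canceller. As a minimal sketch, assuming an NLMS update, an invented echo path, and no near-end speech (the band-splitting filters and the high-band adaptive gain unit of the proposed method are omitted):

        import numpy as np

        rng = np.random.default_rng(4)

        # Minimal NLMS acoustic echo canceller of the kind used on the low-frequency band.
        N = 64                                   # adaptive filter length
        mu, eps = 0.5, 1e-6                      # NLMS step size and regularisation
        h_true = rng.normal(size=N) * np.exp(-np.arange(N) / 10.0)   # toy echo path

        x = rng.normal(size=5000)                # far-end (loudspeaker) signal
        d = np.convolve(x, h_true)[: x.size]     # microphone signal = echo only, no near-end talk

        w = np.zeros(N)                          # adaptive estimate of the echo path
        buf = np.zeros(N)
        for n in range(x.size):
            buf = np.concatenate(([x[n]], buf[:-1]))   # most recent N far-end samples
            e = d[n] - w @ buf                          # error = echo-cancelled signal
            w += mu * e * buf / (buf @ buf + eps)       # NLMS coefficient update

        print(np.linalg.norm(w - h_true) / np.linalg.norm(h_true))   # misalignment, small after adaptation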

  • 181. Lindén, Anna-Lisa
    Hållbar samhällsutveckling i Blekinge: Nutid och framtid, 2005. Report (Other academic)
    Abstract [en]

    The aim of the analysis of sustainable societal development in Blekinge is to analyse the demographic situation in the five municipalities of Blekinge. The development and structure of the population are fundamental factors for a future socially sustainable societal development. The employment rate and size of the working population in relation to other age groups is an important basic precondition for municipal development. The description of the current situation also includes calculations of dependency ratios and the tax base in the municipalities, as well as a comparison with the situation in the country as a whole. Ultimately, however, labour-market developments and the availability of workplaces are decisive for migration to and from municipalities and regions. The cooperation agreement between Blekinge Tekniska Högskola and the author also includes analysing, on the basis of two scenarios, the effects of the demographic factors on development up to the year 2015. The first of the two scenarios assumes that the development seen during the 1990s will continue until 2015. The second assumes that the population will increase until 2015. Both scenarios are based on assumptions regarded as basic preconditions for population development in Sweden and the regions. Whichever of the scenarios is examined more closely, Karlskrona, as the larger city in the region, will have a considerably more favourable development than the other municipalities. The two scenarios, including the calculations of age structure, dependency ratios and the regional redistribution of the population within the county, can serve as a basis for assessments of the future labour market, housing needs, the dimensioning of various types of municipal services, and financial calculations of tax capacity and municipal finances.

  • 182.
    Ljung, Fredrika
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Kruse, Amelia
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Hur kan ubåtar nå nya höjder?: En studie om kundnöjdhet på en eftermarknad för komplexa produkter, 2019. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Background. The aftermarket can be described as the activities taking place after the purchase of a product. In today's business environment more attention is paid to the aftermarket, due to the increased competition between companies and the fact that the aftermarket has proved to be a key differentiator for companies. Nowadays it has also become crucial to satisfy customers' need for a competitive aftermarket offer. Despite the advantages for companies of having such an attractive offer together with good customer relations, research is still lacking on how companies should act in order to create and develop aftermarket business in practice. This study can be seen as an attempt to bring knowledge and clarity to which areas to prioritize in the design of an aftermarket that meets customer needs.

    Objectives. The purpose of this study is to find out which factors, with regard to customer satisfaction, are important for companies to pay attention to and manage when they organize complex aftermarket activities.

    Methods. The study is based on a quantitative approach, given the complexity of the subject and the purpose of the study. As a tool, a questionnaire survey was sent out to 35 customers, which resulted in 30 responses and a response rate of 85.7%. The respondents are existing customers of Saab Kockums in Sweden. The collected data was analysed using the partial least squares (PLS) method.

    Results. The results of the study show the strength of the relationships between the latent variables company image, customer expectations, perceived quality and perceived value, which drive customer satisfaction and customer loyalty. The results differed from previous studies, which indicates that the aftermarket, unlike a normal market, can be seen as complex.

    Conclusions. The study concludes that the relationships between the factors that affect customer satisfaction in an aftermarket differ from previous studies, with the explanation that the aftermarket is complex in its nature. Furthermore, it is concluded that customer satisfaction is also affected by the nature of the defence industry, which is characterized by long-term, complex and costly projects with high demands on technology development and on supplies being compatible with previous purchases. The study's results also lead to the identification of development areas that provide practical guidance to companies in the field. The result is relevant when designing a general aftermarket strategy as well as aftermarket strategies with a focus on customer satisfaction.

    Keywords: Customer satisfaction, Aftermarket, Complex products, Defence industry
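
    The Methods paragraph above names partial least squares (PLS) as the analysis tool. Purely as a rough, non-authoritative illustration of the idea, the sketch below fits a PLS regression with scikit-learn on hypothetical survey indicators; the thesis itself presumably used a PLS-SEM tool for the latent-variable path model, and all variable names and data here are made up.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 8))                 # 30 respondents, 8 survey items (hypothetical)
        y = X[:, :3].mean(axis=1) + 0.1 * rng.normal(size=30)  # stand-in satisfaction score

        pls = PLSRegression(n_components=2)          # two latent components
        pls.fit(X, y)
        print(pls.score(X, y))                       # R^2 of the fitted model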

  • 183. Löfstål, Eva
    et al.
    Stevrin, Peter
    Börrefors, Johanna
    FoU vid MAM. Tre texter: Ekonomistyrning, innovationer och lärande2011Report (Other academic)
    Abstract [sv]

    This publication consists of three accounts from the academic workshop of BTH/MAM. The established forms for reporting the knowledge development that the university contributes to are student theses, research reports, scientific articles, lectures and books. Far more rarely than in "the good old days" do academics take part in the broader public conversation. One may ask why. Has academic work become too specialised, and the broader tradition of liberal education and citizenship a thing of the past? Are the incentives for building academic merit too weak, or are there other causes and explanations? In the preface to this publication, the first in a series, I want to draw attention to the fact that texts from the university's activities can - and should - take different forms. Knowledge has many languages, to use an expression coined by a former colleague (Åke Uhlin) at what is now the School of Management. Not least the fact that, for some decades now, the university has been obliged to take on, and has taken on, the task of collaborating with the surrounding society forces new forms of activity and new ways of publicly reporting the results of this collaboration. This orientation will reasonably require texts of a different kind than conventional academic papers. And texts, films, demonstrations of activities "in real life", presentations of prototypes, recordings and much else will certainly come to be included among the university's public forms of publication. Today there are other ways than the printed text to "bring something to general knowledge", one definition of the word publication in the Dictionary of the Swedish Academy (SAOB). Collaboration with the surrounding society has begun to bring new content into the university's activities. It is highly likely that the forms of the texts from the academic workshop will also find new paths. In this issue of FoU, the texts deal with economics, innovation and learning. The contributions differ in character, but they have one thing in common: they provide an insight into MAM's academic activities, projects, research and development. Eva Lövstål's contribution is a popular-science summary of her doctoral thesis. Peter Stevrin's text is based on a seminar lecture in which he describes innovation work he has pursued alongside his teaching in business administration - a kind of one-person collaboration, one might say, between the university lecturer and the inventor/entrepreneur. Johanna Börrefors' essay deals with learning and with its context of education and teaching. She is a sociologist of law and a practising lecturer who has written a transdisciplinary text on pedagogy.

  • 184. Maillard, Julien
    et al.
    Lagö, Thomas L
    Winberg, Mathias
    Fuller, Chris
    Active control of pressure pulsations in piping systems1998Report (Other academic)
    Abstract [en]

    Fluid-borne vibrations in piping systems remain a serious problem in applications such as marine vessels, where mechanical fatigue and radiated noise are critical factors. In the case of pumps or hydraulic engines, the main source of vibrational energy is the fluid axisymmetric plane wave associated with the system pressure pulsations. Due to fluid/structure coupling, this wave propagates in both the pipe wall and the fluid. For high levels of pressure pulsations, the resulting radial and axial wall motion can then cause mechanical fatigue and unwanted radiated noise. Passive pulsation dampers have traditionally been used to reduce the fluid pressure pulses. The use of such passive devices is limited, however, in critical applications due to the resulting static pressure loss, which decreases the system performance. This report describes the design and testing of a non-intrusive fluid wave actuator for the active control of pressure pulses. The actuator consists of a circumferential ring of PZT stacks acting on the pipe outside wall to generate an axisymmetric plane wave in the fluid through radial motion coupling. After a brief description of a simplified model of the actuator along with its predicted performance, experimental results show the control performance of the actuator applied to the discharge line of an oil-driven hydraulic engine.
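
    The report summarised above concerns the actuator rather than a specific control algorithm. Purely as a generic illustration of actively cancelling a tonal pulsation, and not as the method of the report, the sketch below runs a two-weight LMS feedforward canceller on a synthetic disturbance; the sample rate, tone frequency and step size are hypothetical, and a practical system would typically use an FxLMS variant with a secondary-path model.

        import numpy as np

        fs, f0, n = 8000, 120.0, 4000            # sample rate [Hz], pulsation tone [Hz], samples (hypothetical)
        t = np.arange(n) / fs
        d = np.sin(2 * np.pi * f0 * t)           # tonal disturbance seen at the error (pressure) sensor
        x = np.column_stack([np.sin(2 * np.pi * f0 * t),
                             np.cos(2 * np.pi * f0 * t)])  # quadrature reference derived from the pump speed
        w = np.zeros(2)                          # controller weights
        mu = 0.01                                # LMS step size
        e = np.empty(n)
        for k in range(n):
            y = w @ x[k]                         # actuator drive signal
            e[k] = d[k] - y                      # residual pulsation after cancellation
            w += mu * e[k] * x[k]                # LMS weight update
        print(np.abs(e[-100:]).max())            # residual amplitude after adaptation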

  • 185. Martinsen, Jan Kasper
    et al.
    Grahn, Håkan
    Isberg, Anders
    Evaluating Four Aspects of JavaScript Execution Behavior in Benchmarks and Web Applications2011Report (Other academic)
    Abstract [en]

    JavaScript is a dynamically typed, object-based scripting language with runtime evaluation. It has emerged as an important language for client-side computation in web applications. Previous studies have shown differences in behavior between established JavaScript benchmarks and real-world web applications. However, several important aspects remain to be explored. In this study, we compare the JavaScript execution behavior of four application classes: four established JavaScript benchmark suites, the first pages of the top 100 sites on the Alexa list, 22 different use cases for Facebook, Twitter, and Blogger, and finally, demo applications for the emerging HTML5 standard. Our results extend previous studies by identifying the importance of anonymous and eval functions, showing that just-in-time compilation often decreases the performance of real-world web applications, and providing a detailed evaluation of the bytecode instruction mix.
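
    One of the measurements mentioned above is a bytecode instruction mix. As a loose illustration only, assuming a hypothetical interpreter trace in which each non-empty line begins with an opcode name, the sketch below tallies the relative frequency of each opcode; the paper's actual instrumentation is not described here.

        from collections import Counter

        def instruction_mix(trace_lines):
            # Fraction of each opcode in a trace where every non-empty line
            # starts with the opcode name (assumed format).
            counts = Counter(line.split()[0] for line in trace_lines if line.strip())
            total = sum(counts.values())
            return {op: c / total for op, c in counts.most_common()}

        # Hypothetical four-instruction trace, for illustration only.
        print(instruction_mix(["get_by_id a", "call f", "call g", "add x y"]))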

  • 186.
    Mattaparthi, Sai Venkata Akshay
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    The Impact of Hexagonal grid on the Principal Component of Natural Images2018Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Visual processing in the real world differs from that in the digital world. A monkey's and a human's visual world is richer and more colourful, affording sight of flies regardless of whether they are immobile or airborne. The study of the evolutionary process of our visual system indicates a varying spatial arrangement: from densely hexagonal in the fovea to a sparse circular structure in the peripheral retina. Normally we use a rectangular grid for the processing of images, but from the perspective of the human eye, the new approach is to change the grid from rectangular to hexagonal. Applying a hexagonal grid in image processing is advantageous and well suited to mimicking the human visual system. The main advantage of using the hexagonal structure in image processing is its resemblance to the arrangement of photoreceptors in the human eye.

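    A minimal sketch of the kind of experiment the abstract describes, under assumptions of my own: the hexagonal lattice is emulated by shifting every other image row half a pixel, and the principal components are obtained from the covariance of small patches. The thesis may well use a different resampling scheme and patch size; the input image below is random stand-in data.

        import numpy as np

        def to_hex_grid(img):
            # Emulate hexagonal sampling on square-grid data by shifting every
            # other row half a pixel horizontally (linear interpolation).
            out = img.astype(float).copy()
            out[1::2, 1:] = 0.5 * (img[1::2, :-1] + img[1::2, 1:])
            return out

        def patch_pca(img, p=8):
            # Principal components of p x p patches: eigen-decomposition of the
            # patch covariance matrix, eigenvalues returned in descending order.
            patches = [img[i:i + p, j:j + p].ravel()
                       for i in range(0, img.shape[0] - p, p)
                       for j in range(0, img.shape[1] - p, p)]
            X = np.array(patches)
            X -= X.mean(axis=0)
            vals, vecs = np.linalg.eigh(X.T @ X / len(X))
            return vals[::-1], vecs[:, ::-1]

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))               # random stand-in for a natural image
        vals, _ = patch_pca(to_hex_grid(img))
        print(vals[:5])                          # leading principal component energies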

  • 187.
    Mattsson, Michael
    Blekinge Institute of Technology, Department of Telecommunications and Mathematics.
    A Comparative Study of Three New Object-Oriented Methods1995Report (Refereed)
    Abstract [en]

    In this paper we compare and contrast some of the newer methods with some of the established methods in the field of object-oriented software engineering. The methods reviewed are Solution-Based Modelling, Business Object Notation and Object Behaviour Analysis. The new methods offer new solutions and ideas for issues such as object identification from scenarios, traceability-supporting techniques, criteria for phase completion and method support for reliability. Despite all these contributions, we identified some issues, in particular design for dynamic binding, that still have to be taken into account in an object-oriented method.

  • 188. Mattsson, Michael
    Second Conference on Software Engineering Research and Practice in Sweden: Proceedings, SERPS'02, Ronneby, Sweden2002Report (Other academic)
  • 189. Mattsson, Michael
    et al.
    Bosch, Jan
    Assessing Object-Oriented Application Framework Maturity: A Replicated Case Study1999Report (Other academic)
    Abstract [en]

    Object-oriented application frameworks present one of the most successful approaches to developing reusable assets in industry, but developing frameworks is both difficult and expensive. Frameworks generally evolve to maturity through a number of iterations due to the incorporation of new requirements and better domain understanding. Since changes to frameworks have a large impact through their effects on the applications built based on the asset, it is important to assess the maturity of a framework. Bansiya [3, 4] presents an approach to assessing framework maturity based on a set of design metrics and formulates four statements. In this paper, we present the results of a replicated case study of the framework maturity assessment approach. Our study subject consists of four successive versions of a proprietary black-box application framework. Our findings partly support the statements formulated in the original study, but differ in some places. The differences are discussed, and explanations and argumentation are provided.

  • 190. Mattsson, Michael
    et al.
    Bosch, Jan
    Assessment of Three Evaluation Methods for Object-Oriented Framework Evolution1999Report (Other academic)
    Abstract [en]

    Object-oriented framework technology has become a common reuse technology in object-oriented software development. As with all software, frameworks tend to evolve. Once the framework has been deployed, new versions of a framework cause high maintenance costs for the products built with the framework. This fact, in combination with the high costs of developing and evolving an object-oriented framework, makes it important to have controlled and predictable evolution of the framework's functionality and costs. We present three methods, 1) Evolution Identification Using Historical Information, 2) Stability Assessment and 3) Distribution of Development Effort, which have been applied to between one and three different frameworks, both in the proprietary and the commercial domain. The methods provide management with information that makes it possible to take well-informed decisions about the framework's evolution, especially with respect to the following issues: identification of evolution-prone modules, framework deployment, change impact analysis, benchmarking and requirements management. Finally, the methods are compared to each other with respect to costs and benefits.

  • 191. Mattsson, Michael
    et al.
    Bosch, Jan
    Characterizing Stability in Evolving Frameworks1999Report (Other academic)
    Abstract [en]

    Object-oriented application frameworks present one of the most successful approaches to developing reusable assets in industry, but developing frameworks is both difficult and expensive. Frameworks generally evolve through a number of iterations due to the incorporation of new requirements and better domain understanding. Since changes to frameworks have a large impact on the applications built based on the asset, it is important to assess the stability of a framework. Recently, an approach for assessing framework stability has been proposed [3]. We have extended and applied the assessment approach to one proprietary telecommunication framework and two commercial GUI application frameworks. Based on our findings we formulate a set of hypotheses which characterize the stability of an object-oriented application framework. We believe these hypotheses to be the most promising ones for further studies of framework stability.

  • 192. Mattsson, Michael
    et al.
    Bosch, Jan
    Observations on the Evolution of an Industrial OO Framework1999Report (Other academic)
    Abstract [en]

    Recently, an approach for identifying potential modules for restructuring in large software systems using product release history was presented [4]. In this study we have adapted the original approach to better suit object-oriented frameworks and applied it to an industrial black-box framework product in the telecommunication domain. Our study shows that using historical information as a way of identifying structural shortcomings in an object-oriented system is viable and useful. The study thereby strengthens the approach suggested in [4] and demonstrates that it is adaptable and useful for object-oriented systems; the usefulness of the original approach has been validated through this study as well.
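
    As a rough, non-authoritative illustration of evolution identification from release history (not the specific metric of [4] or of this study), the sketch below counts, for a hypothetical per-release list of changed modules, how often each module changed and flags those that changed in every release as candidates for restructuring.

        from collections import Counter

        # Hypothetical release history: the set of modules changed in each release.
        releases = [
            {"alarm", "io", "gui"},
            {"alarm", "net"},
            {"alarm", "gui"},
        ]
        change_count = Counter(m for release in releases for m in release)
        evolution_prone = [m for m, c in change_count.items() if c == len(releases)]
        print(change_count)          # how many releases each module changed in
        print(evolution_prone)       # modules that changed in every release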

  • 193.
    Mehta, Naresh
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Gill, Muhammad Junaid
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Agile in Multisite Software Engineering: Integration Challenges2017Independent thesis Advanced level (degree of Master (One Year)), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Many big organizations that have existed since before the term agile was coined are pursuing agile transformations and trying to integrate them with their existing structures. It is an accepted fact that agile integration is difficult in big organizations and that many such organizations fail in their transformations. This is especially true for multisite software organizations, where a traditional mix of old and new ways of working ends up creating issues. The result of such a failure is the implementation of a hybrid way of working, which ultimately leads to lower output and higher cost for the organizations. This paper looks at the integration challenges for multisite software engineering organizations and correlates theoretical findings from earlier work with practical findings obtained using a survey and interviews as data collection tools. The paper specifically focuses on integration challenges involving self-organizing teams, power distribution, knowledge hiding and knowledge sharing, communication and decision making. The paper also presents empirical evidence that shows a communication and understanding gap between employees and management in the basic understanding of agile concepts.

  • 194. Mfoumou, Etienne
    et al.
    Kao-Walter, Sharon
    Fracture Toughness Testing of Non Standard Specimens2004Report (Other academic)
    Abstract [en]

    The fracture behavior of the main layers used in food packaging material is studied. The investigation includes aluminium foil (9 μm), paper board (100 μm) and Low Density Polyethylene (27 μm). The plane stress fracture toughness of each layer is derived based on a centre-cracked panel. Different crack sizes have been tested. A compromise crack length was found at which the Strip Yield Model as well as Linear Elastic Fracture Mechanics allow validation of the experimental results. Meanwhile, accurate results are obtained using the Strip Yield Model with a geometric correction. The result is also used to evaluate crack initiation from a notch when all three layers are laminated.
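
    For orientation only, the sketch below evaluates the standard strip-yield (Dugdale) estimate of the stress intensity for a centre-cracked panel under plane stress, K = sigma * sqrt(pi * a * sec(pi * sigma / (2 * sigma_y))); the geometric correction mentioned in the abstract is not included, and the numbers are hypothetical rather than taken from the report.

        import math

        def k_strip_yield(sigma, a, sigma_y):
            # sigma: remote stress [MPa], a: half crack length [m],
            # sigma_y: yield stress [MPa]; valid for sigma < sigma_y.
            sec = 1.0 / math.cos(math.pi * sigma / (2.0 * sigma_y))
            return sigma * math.sqrt(math.pi * a * sec)   # K in MPa*sqrt(m)

        # Hypothetical numbers, not data from the report.
        print(k_strip_yield(sigma=40.0, a=5e-3, sigma_y=80.0))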

  • 195.
    Molin, Peter
    Blekinge Institute of Technology, Department of Telecommunications and Mathematics.
    Applying the Object-Oriented Framework Technique to a Family of Embedded Systems1996Report (Refereed)
    Abstract [en]

    This paper discusses some experiences from a project developing an object-oriented framework for a family of fire alarm system products. TeleLarm AB, a Swedish security company, initiated the project. One application has so far been generated from the framework with successful results. The released application has shown zero defects and has proved to be highly flexible. Fire alarm systems have a long lifetime and have high reliability and flexibility requirements. The most important observations presented in this paper are that the programming language C++ can be used successfully for small embedded systems; and that object-orientation and framework techniques offer flexibility and reusability in such systems. It has also been noted that design for verifiability and testability is very important, affecting as it does both maintainability and reliability.

  • 196. Molin, Peter
    Designing Reliable Systems from Reliable Components using the Context-Dependent Constraint Concept1996Report (Other academic)
    Abstract [en]

    The problem of composing a system using well-behaved components is discussed. Specifically, necessary conditions for preserving the behaviour in a system context are analysed in this paper. Such conditions are defined as Context-Dependent Constraints (CDC). A non-formal approach is taken based on common system integration errors. It is suggested that the identification and verification of CDCs should be part of any development method based on component verification. The CDCs can also serve as an aid for designing reliable and maintainable systems, where the goal of the design process is to reduce the number of CDCs.

  • 197.
    Molin, Peter
    Blekinge Institute of Technology, Department of Telecommunications and Mathematics.
    Verifying Framework-Based Applications by Establishing Conformance1996Report (Refereed)
    Abstract [en]

    The use of object-oriented frameworks is one way to increase productivity by reusing both design and code. In this paper, a framework-based application is viewed as composed of a framework part and an increment. It is difficult to relate the intended behaviour of the final application to specific increment requirements; it is therefore difficult to test the increment using traditional testing methods. Instead, the notion of increment conformance is proposed, meaning that the increment is designed in conformance with the intentions of the framework designers. This intention is specified as a set of composability constraints defined as an essential part of the framework documentation. Increment conformance is established by verifying the composability constraints by means of code and design inspection. Conformance of the increment is a necessary but not sufficient condition for correct behaviour of the final application.

  • 198. Mårtensson, Björn
    et al.
    Chevul, Stefan
    Järnliden, Håkan
    Johnson, Henric
    Nilsson, Arne A.
    SuxNet – Implementation of Secure Authentication for WLAN2003Report (Other academic)
    Abstract [en]

    Wireless network equipment offers great flexibility for mobile as well as stationary computers. Clients are no longer bound by the length of a network cable. Instead wireless connectivity increases the clients’ mobility. This paper describes an implementation for wireless clients to access a wired computer network through an efficient authentication mechanism. The implementation is called SuxNet, and is a contribution to IP-login [8] and Institute of Electrical and Electronics Engineers (IEEE) 802.1x [3]. The paper also explains and evaluates different security concepts such as Wired Equivalent Privacy (WEP) and IEEE 802.1x.

  • 199.
    Nagabhairava, Nitish
    Blekinge Institute of Technology.
    Implementation of Visible Light communications For Indoor Applications2018Independent thesis Advanced level (degree of Master (One Year)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    In recent years there has been growing research interest in optical wireless communication. This growing popularity is due to several characteristics, such as a large bandwidth on which no spectrum regulations are imposed, low cost and licence-free operation. Since visible light communication (VLC) is a branch of optical wireless communication (OWC), it can be used to replace RF communication. Another primary reason for using visible light communication [1] is that it uses 400 THz of unlicensed, secure and radio-free spectrum for wireless communication, which is 1000 times more than that of radio communication. For VLC transmission, LEDs are used as light sources. Due to their high efficiency and low power consumption, LEDs have replaced the old fluorescent lamps; LEDs provide dual functionality in that they can provide both lighting and communication (transfer of data), much like Wi-Fi. An LED can be switched on and off so fast that the human eye cannot perceive it. The on and off states can be taken as 1s and 0s, and through this the data can be transferred; this type of modulation is called on-off keying (OOK) and is used as a single-carrier modulation scheme. The received data can be interpreted with the help of a photodiode at the receiver side. This communication technique can also provide better security, as there is no interference and light cannot penetrate walls, confining the data transfer to the room itself. Through VLC we can therefore offer better data security than RF communication.

    In this thesis, the implementation has been carried out in MATLAB simulations, where different modulation techniques and parameters are analysed. We design a room with dimensions 5 m x 5 m x 3 m (length, width and height). We place multiple LEDs at the top of the room and determine the illumination in the room due to the light emitted from the LEDs. The receiver is located on a desk, and we calculate the data rates obtained at the receiver. The modulation technique used in this thesis is OOK. We estimate the data rates in two ways: direct detection (line of sight) and with reflections from the walls taken into consideration (non-line of sight). The effect of illumination and distance on the data rates is also determined. Data is transferred from the transmitter and received at the receiver, and for the received information the bit error rate (BER) is calculated for both a single LED and a multiple-LED array. The performance metrics of a single LED and of multiple-LED arrays are compared to determine the better LED array.

    Key words: OOK modulation scheme, MATLAB simulation, light emitting diodes.
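
    Since the abstract centres on OOK modulation and bit error rate, the sketch below gives a minimal OOK-over-additive-noise simulation in Python as a rough analogue of the MATLAB workflow described; the SNR, bit count and noise model are hypothetical, and the indoor optical channel gain and LED geometry are deliberately ignored.

        import numpy as np

        rng = np.random.default_rng(1)
        n_bits, snr_db = 100_000, 10             # hypothetical bit count and electrical SNR
        bits = rng.integers(0, 2, n_bits)
        tx = bits.astype(float)                  # OOK: LED on = 1, LED off = 0
        noise_var = 0.5 / 10 ** (snr_db / 10)    # average OOK signal power is 0.5
        rx = tx + np.sqrt(noise_var) * rng.normal(size=n_bits)   # received signal plus noise
        detected = (rx > 0.5).astype(int)        # mid-point threshold detection
        ber = np.mean(detected != bits)
        print(f"BER = {ber:.2e}")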

  • 200. Nethula, Shravya
    Implementation of the Hadoop MapReduce algorithm on virtualized shared storage systems2016Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Context Hadoop is an open-source software framework developed for distributed storage and distributed processing of large sets of data. The implementation of the Hadoop MapReduce algorithm on virtualized shared storage by eliminating the concept of Hadoop Distributed File System (HDFS) is a challenging task. In this study, the Hadoop MapReduce algorithm is implemented on the Compuverde software that deals with virtualized shared storage of data.

    Objectives In this study, the effect of using virtualized shared storage with Hadoop framework is identified. The main objective of this study is to design a method to implement the Hadoop MapReduce algorithm on Compuverde software that deals with virtualized shared storage of big data. Finally, the performance of the MapReduce algorithm on Compuverde shared storage (Compuverde File System - CVFS) is evaluated and compared to the performance of the MapReduce algorithm on HDFS.

    Methods Initially, a literature study is conducted to identify the effect of a Hadoop implementation on virtualized shared storage. The Compuverde software is analyzed in detail during this literature study. The concepts of the MapReduce algorithm and the functioning of HDFS are scrutinized in detail. The main research method adopted for this study is the implementation of a method where the Hadoop MapReduce algorithm is applied to the Compuverde software, which deals with the virtualized shared storage, by eliminating HDFS. The final step is experimentation, in which the performance of the MapReduce algorithm on Compuverde shared storage (CVFS) is compared with the performance of the MapReduce algorithm on the Hadoop Distributed File System.

    Results The experiment is conducted in two different scenarios, namely the CPU-bound scenario and the I/O-bound scenario. In the CPU-bound scenario, the average execution time of the WordCount program grows linearly with the size of the data set. This linear growth is observed for both file systems, HDFS and CVFS, and the same is the case in the I/O-bound scenario. When the average execution times are plotted, both file systems perform similarly in the CPU-bound scenario (multi-node environment). In the I/O-bound scenario (multi-node environment), HDFS slightly outperforms CVFS when the data set size is 1.0 GB, and the two file systems perform without much difference when the data set size is 0.5 GB or 1.5 GB.

    Conclusions The MapReduce algorithm can be implemented on live data present in virtualized shared storage systems without copying the data into HDFS. In a single-node environment, distributed storage systems perform better than shared storage systems. In a multi-node environment, when the CPU-bound scenario is considered, the HDFS and CVFS file systems perform similarly. On the other hand, HDFS performs slightly better than CVFS for the 1.0 GB data set in the I/O-bound scenario. Hence we can conclude that distributed storage systems perform similarly to shared storage systems in both the CPU-bound and I/O-bound scenarios in a multi-node environment.
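
    The benchmark above is the classic WordCount job. As a minimal sketch of what that job computes, written in plain Python in the style of Hadoop Streaming rather than taken from the thesis, the mapper emits (word, 1) pairs and the reducer sums the counts per word; with Hadoop Streaming the two functions would run as separate mapper and reducer scripts over either file system.

        from itertools import groupby

        def mapper(lines):
            # Emit a (word, 1) pair for every word in the input.
            for line in lines:
                for word in line.split():
                    yield word, 1

        def reducer(pairs):
            # Sum the counts per word (pairs must be grouped by key, hence the sort).
            for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
                yield word, sum(count for _, count in group)

        print(list(reducer(mapper(["to be or not to be"]))))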
