Results 151 - 200 of 17072
  • 151.
    Adolfsson, Victor
    Blekinge Institute of Technology, Department of Business Administration and Social Science.
    Säkerhetskapital: En del av det Intellektuella Kapitalet (2002). Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv, translated]

    Methods for measuring information security within companies are lacking, and company assets have shifted from a focus on machinery and raw materials to knowledge (intellectual capital). The report explores whether there are parts of a company's intellectual capital that protect the company's assets and processes. This capital is called security capital. How could a company's information security be made visible through its intellectual capital, and how might concepts from information security and company valuation be connected? The purpose of the thesis is to increase the understanding of how information security is related to intellectual capital. The report is based on literature studies on intellectual capital and information security. Data were collected partly from the annual reports of listed companies and partly from press releases and stock-exchange information. This information was then analyzed both quantitatively and qualitatively, and the concept of security capital emerged. Theories on company valuation, intellectual capital, risk management, and information security are presented and form the frame of reference in which the concept of security capital is placed in context. The concept of security capital is presented in the form of models and situations in which different perspectives on security capital are analyzed and evaluated. The conclusions are mainly in the form of models and descriptions of how security capital can be viewed in relation to intellectual capital and other concepts. The area is complex, but parts of the results (which are at a high level of abstraction) can be used to value other types of intangible assets.

    Download full text (pdf)
    FULLTEXT01
  • 152.
    Adolfsson, Victor
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    The State of the Art in Distributed Mobile Robotics (2001). Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    Distributed Mobile Robotics (DMR) is a multidisciplinary research area with many open research questions. This is a survey of the state of the art in DMR research. DMR is sometimes referred to as cooperative robotics or multi-robot systems. DMR concerns how multiple robots can cooperate to achieve goals and complete tasks better than single-robot systems. It covers architectures, communication, learning, exploration, and many other areas, as presented in this master's thesis.

    Download full text (pdf)
    FULLTEXT01
  • 153. Adolfsson, Vilhelm
    et al.
    Goldberg, Max
    Jawerth, Björn
    Lennerstad, Håkan
    Localized Galerkin Estimates for Boundary Integral Equations on Lipschitz Domains (1992). In: SIAM Journal on Mathematical Analysis, Vol. 23, no. 5, p. 751-764. Article in journal (Refereed)
    Abstract [en]

    The Galerkin method is studied for solving the boundary integral equations associated with the Laplace operator on nonsmooth domains. Convergence is established under a condition on the mesh size, which involves the local curvature on certain approximating domains. Error estimates are also proved, and the results are generalized to systems of equations.

  • 154.
    Adrian, Cajsa
    et al.
    Blekinge Institute of Technology.
    Hellgren, Evelina
    Blekinge Institute of Technology.
    Förekomsten av fysisk aktivitet och sömn hos äldre män och kvinnor med och utan depression: En kvantitativ studie i samarbete med Swedish National Study on Aging and Care Blekinge (2018). Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [sv, translated]

    Background: The average age of men and women in Sweden is rising steadily, while mental ill-health is becoming increasingly common. Physical activity and sleep are examples of basic needs that form the foundation for achieving good health. The nurse's task is to motivate patients by sharing knowledge and experience, so that patients themselves have the opportunity to make changes and the will to satisfy these needs.

    Aim: To investigate the occurrence of physical activity and sleep in older men and women with and without a diagnosis of depression.

    Method: The design is a quantitative, descriptive cross-sectional study conducted in consultation with the Swedish National Study on Aging and Care - Blekinge.

    Results: The results showed that light physical activity was more common than intense physical activity among the elderly. Physical activity was performed to a greater extent by women with a diagnosis of depression and by men without one. Among the study participants, sleep disturbances also occurred more often in connection with depression.

    Conclusion: Nurses should motivate patients to make changes with the help of nursing interventions. In the long term this will benefit health care, as the average age and the older population are increasing. Physical activity and good sleep are basic needs that minimize and reduce the risk of depression in men and women.

    Download full text (pdf)
    fulltext
  • 155.
    Adriansson, Charlotta
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Holmberg, Emma
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Arbete med och uppföljning av CSR: En jämförande studie av socialt hållbart företagande inom tre statliga företag (2014). Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv, translated]

    The purpose of the thesis has been to give indications of existing CSR tools and work related to socially sustainable business, and their effects, within Swedish state-owned companies. By relating the state-owned companies' sustainability work to its effects, the thesis can contribute information to this hitherto unexplored area and show which outcomes this socially sustainable work can bring. The thesis uses, as its approach and method, comparative document studies of three state-owned companies - Apoteket, Systembolaget and Bilprovningen - chosen through an idealistic selection owing to their prominent role in sustainability work among state-owned companies. Material was collected from the companies' sustainability reports and other available information channels. The study reveals similarities, differences, and patterns among the three state-owned companies. CSR-oriented tools and processes exist within the companies' stated strategies and plans, but there is a need to improve follow-up processes to ensure that the work is carried through according to plan and that lessons can be drawn from the outcomes it produces.

    Download full text (pdf)
    FULLTEXT01
  • 156.
    Adu, Caren
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Natango Ndibuuza, Harriet Tyra
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Sjuksköterskans upplevelser av stresshantering under covid-19 - En allmän litteraturöversikt (2022). Independent thesis Basic level (professional degree), 10 credits / 15 HE credits. Student thesis
    Download full text (pdf)
    Sjuksköterskans upplevelser av stresshantering under covid-19 - En allmän litteraturöversikt
  • 157.
    Adurti, Devi Abhiseshu
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Battu, Mohit
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
    Optimization of Heterogeneous Parallel Computing Systems using Machine Learning (2021). Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Background: Heterogeneous parallel computing systems utilize a combination of different resources, CPUs and GPUs, to achieve high performance, reduced latency, and lower energy consumption. Programming applications that target various processing units requires employing different tools and programming models/languages. Furthermore, selecting the most optimal implementation, which may either target different processing units (i.e., CPU or GPU) or implement one of several algorithms, is not trivial for a given context. In this thesis, we investigate the use of machine learning to address the problem of selecting among implementation variants for an application running on a heterogeneous system.

    Objectives: This study is focused on providing an approach for optimization of heterogeneous parallel computing systems at runtime by building the most efficient machine learning model to predict the optimal implementation variant of an application.

    Methods: The six machine learning models KNN, XGBoost, DTC, Random Forest Classifier, LightGBM, and SVM are trained and tested using stratified k-fold on the dataset generated from the matrix multiplication application for square matrix input dimension ranging from 16x16 to 10992x10992.

    Results: The findings for each machine learning algorithm are presented through accuracy, a confusion matrix, and a classification report covering precision, recall, and F1 score, and a comparison between the machine learning models in terms of accuracy, training time, and prediction time is provided to determine the best model.

    Conclusions: The XGBoost, DTC, and SVM algorithms achieved 100% accuracy. In comparison to the other machine learning models, the DTC is found to be the most suitable for predicting the optimal implementation variant of the heterogeneous system application, due to the low time it requires for training and prediction. Hence, the DTC is the best-suited algorithm for the optimization of heterogeneous parallel computing.
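    The stratified k-fold protocol named in the Methods can be sketched as follows. This is a minimal pure-Python illustration of the splitting idea only; the toy labels are invented, and the thesis itself presumably relies on a library implementation such as scikit-learn's StratifiedKFold.

```python
from collections import defaultdict

def stratified_kfold(labels, k):
    """Split sample indices into k folds that preserve class proportions."""
    by_class = defaultdict(list)
    for idx, label in enumerate(labels):
        by_class[label].append(idx)
    folds = [[] for _ in range(k)]
    for indices in by_class.values():
        # Deal each class's samples round-robin so every fold gets its share.
        for i, idx in enumerate(indices):
            folds[i % k].append(idx)
    return folds

# Toy labels standing in for the thesis's implementation-variant classes:
labels = ["cpu", "gpu", "cpu", "gpu", "cpu", "gpu"]
folds = stratified_kfold(labels, 3)
# Each fold in turn serves as the test set; the remaining folds train the model.
```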

    Download full text (pdf)
    fulltext
  • 158.
    Advaita, Advaita
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Gali, Mani Meghala
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Performance Analysis of a MIMO Cognitive Cooperative Radio Network with Multiple AF Relays (2016). Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    With the rapid growth of wireless communications, the demand for the various multimedia services is increasing day by day leading to a deficit in the frequency spectrum resources. To overcome this problem, the concept of cognitive radio technology has been proposed which allows the unlicensed secondary user (SU) to access the licensed spectrum of the primary user (PU), thus improving the spectrum utilization. Cooperative communications is another emerging technology which is capable of overcoming many limitations in wireless systems by increasing reliability and coverage. The transmit and receive diversity techniques such as orthogonal space–time block codes (OSTBCs) and selection combining (SC) in multiple-input multiple-output (MIMO) cognitive amplify and forward relay networks help to reduce the effects of fading, increase reliability and extend radio coverage.

     

    In this thesis, we consider a MIMO cognitive cooperative radio network (CCRN) with multiple relays. The protocol used at the relays is an amplify-and-forward protocol. At the receiver, the SC technique is applied to combine the signals. Analytical expressions for the probability density function (PDF) and cumulative distribution function (CDF) of the signal-to-noise ratio (SNR) are derived, and on this basis the performance in terms of outage probability is obtained. Mathematica has been used to generate numerical results from the analytical expressions, and the system model is simulated in MATLAB to verify them.
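    For orientation, the standard selection combining relations that such a derivation builds on can be summarized as follows. This is a textbook sketch with generic branch SNRs, not the thesis's exact expressions:

```latex
% SC picks the branch with the highest instantaneous SNR:
\gamma_{\mathrm{SC}} = \max\left(\gamma_{0}, \gamma_{\mathrm{relay}}\right)
% For independent branches, the CDF of the SC output SNR factorizes:
F_{\gamma_{\mathrm{SC}}}(x) = F_{\gamma_{0}}(x)\, F_{\gamma_{\mathrm{relay}}}(x)
% Outage probability is this CDF evaluated at the SNR threshold:
P_{\mathrm{out}} = F_{\gamma_{\mathrm{SC}}}\left(\gamma_{\mathrm{th}}\right)
```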

    Download full text (pdf)
    BTH2016Advaita
  • 159.
    Advaita, Advaita
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Gali, Mani Meghala
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Chu, Thi My Chinh
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies.
    Zepernick, Hans-Juergen
    Blekinge Institute of Technology, Faculty of Computing, Department of Creative Technologies. Blekinge Inst Technol, SE-37179 Karlskrona, Sweden..
    Outage Probability of MIMO Cognitive Cooperative Radio Networks with Multiple AF Relays Using Orthogonal Space-Time Block Codes (2017). In: 2017 IEEE 13th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), IEEE, 2017, p. 84-89. Conference paper (Refereed)
    Abstract [en]

    In this paper, we analyze the outage probability of multiple-input multiple-output cognitive cooperative radio networks (CCRNs) with multiple opportunistic amplify-and-forward relays. The CCRN applies underlay spectrum access accounting for the interference power constraint of a primary network and utilizes orthogonal space-time block coding to transmit multiple data streams across a number of antennas over several time slots. As such, the system exploits both time and space diversity to improve the transmission reliability over Nakagami-m fading. The CCRN applies opportunistic relaying in which the relay offering the highest signal-to-noise ratio at the receiver is selected to forward the transmit signal. Furthermore, selection combining is adopted at the secondary receiver to process the signals from the direct and relaying transmissions. To evaluate system performance, we derive an expression for the outage probability which is valid for an arbitrary number of antennas at the source, relays, and receiver of the CCRN. Selected numerical results are provided, using Mathematica for the analysis and MATLAB for simulations, to reveal the effect of network parameters on the outage probability of the system.

  • 160.
    Aeddula, Omsri
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Data-Driven Decision Support Systems for Product Development - A Data Exploration Study Using Machine Learning (2021). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Modern product development is a complex chain of events and decisions. The ongoing digital transformation of society and the increasing demand for innovative solutions put pressure on organizations to maintain or increase competitiveness. As a consequence, a major challenge in product development is the search for information, its analysis, and the building of knowledge. This is even more challenging when the design element comprises a complex structural hierarchy and limited data generation capabilities, and more pronounced still in the conceptual stage of product development, where information is scarce, vague, and potentially conflicting. The ability to explore high-level useful information using a machine learning approach in the conceptual design stage would hence be of importance in supporting design decision-makers, since the decisions made at this stage impact the success of the overall product development process.

    The thesis aims to investigate the conceptual stage of product development, proposing methods and tools to support the decision-making process through the building of data-driven decision support systems. The study highlights how data can be utilized and visualized to extract useful information in design exploration studies at the conceptual stage of product development. The ability to build data-driven decision support systems in the early phases facilitates more informed decisions.

    The thesis presents initial descriptive study findings from the empirical studies, showing the capabilities of machine learning approaches in extracting useful information and building data-driven decision support systems. It first describes how a linear regression model and artificial neural networks extract useful information in design exploration, providing support for decision-makers to understand the consequences of design choices through cause-and-effect relationships at a detailed level. Furthermore, the presented approach also provides input to a novel visualization construct intended to enhance comprehensibility within cross-functional design teams. The thesis further studies how data can be augmented and analyzed to extract the necessary information from an existing design element to support the decision-making process in an oral healthcare context.

    Download full text (pdf)
    fulltext
  • 161.
    Aeddula, Omsri
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Flyborg, Johan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Larsson, Tobias
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Anderberg, Peter
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Sanmartin Berglund, Johan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Renvert, Stefan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health. Kristianstad University, SWE.
    A Solution with Bluetooth Low Energy Technology to Support Oral Healthcare Decisions for Improving Oral Hygiene (2021). In: ACM International Conference Proceeding Series, Association for Computing Machinery (ACM), 2021, Vol. 1, p. 134-139. Conference paper (Refereed)
    Abstract [en]

    The advent of powered toothbrushes and associated mobile health applications provides an opportunity to collect and monitor data; however, collecting reliable and standardized data from large populations has required effort from both participants and researchers. Finding a way to collect data autonomously, without the need for cooperation, offers the potential to build large knowledge banks. A solution based on Bluetooth Low Energy technology is designed to pair a powered toothbrush with a single-core processor to collect raw data in a real-time scenario, eliminating the manual transfer of powered toothbrush data through mobile health applications. Pairing the powered toothbrush with a single-core processor is believed to provide reliable and comprehensible data on toothbrush use and habits, which can guide improved individual advice and general plans for oral hygiene measures that can lead to improved oral health. The method makes a case for an expanded opportunity to design assistive functions that preserve or improve factors influencing oral health in individuals with mild cognitive impairment. The proposed framework helps determine various parameters, which makes it adaptable and feasible to implement in various oral care contexts.

    Download full text (pdf)
    ICMHI-OKA
  • 162.
    Aeddula, Omsri
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Gertsovich, Irina
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mathematics and Natural Sciences.
    Image-Based Localization System (2020). In: Proceedings of the 8th ICIECE 2019, Springer, 2020, Vol. 107, p. 535-541. Conference paper (Refereed)
    Abstract [en]

    The position of a vehicle is essential for navigating the vehicle along the desired path without human interference. A good positioning system should have both good positioning accuracy and reliability. A Global Positioning System (GPS) signal used for navigation in a vehicle may lose significant power due to attenuation caused by buildings or other obstacles. In this paper, a novel real-time indoor positioning system using a static camera is presented. The proposed positioning system exploits gradient information evaluated on the camera video stream to recognize the contours of the vehicle. Subsequently, the mass center of the vehicle contour is used for simultaneous localization of the vehicle. This solution minimizes the design and computational complexity of the positioning system. The experimental evaluation of the proposed approach has demonstrated a positioning accuracy of 92.26%.

  • 163.
    Aeddula, Omsri Kumar
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing.
    Automatic Image Based Positioning System (2017). Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The position of the vehicle is essential for navigating the vehicle along a desired path without any human interference. The Global Positioning System (GPS) loses significant power due to signal attenuation caused by buildings. A good positioning system should have both good positioning accuracy and reliability. The purpose of this thesis is to implement a new positioning system using a camera and to examine the accuracy of the estimated vehicle position in a real-time scenario.

    The major focus of the thesis is to develop two algorithms for estimation of the position of the vehicle using a static camera and to evaluate the performance of the proposed algorithms.

    The proposed positioning system is based on two different processes: the first uses the center of mass to estimate the position, while the second utilizes gradient information to estimate the position of the vehicle.

    Two versions of the positioning system are implemented. One version uses the center-of-mass concept and background subtraction to estimate the position of the vehicle, and the other version calculates gradients to estimate the position of the vehicle. Both algorithms are sensitive to the point of view of the image, i.e., the height of the camera. Comparing the two, the gradient-based algorithm is less sensitive to the camera view.

    Finally, the performance depends more strongly on the height of the camera position for the center-of-mass positioning system than for the gradient positioning system, but the accuracy of both systems can be improved by increasing the height of the camera. In terms of processing speed, the gradient positioning system is faster than the center-of-mass positioning system. The first algorithm, based on the center of mass, has 89.75% accuracy with a standard deviation of 3 pixels, and the second algorithm has an accuracy of 92.26%. The accuracy of each system is estimated from the number of falsely detected positions.
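    The first algorithm's core idea, taking the vehicle position as the mean coordinate of the foreground pixels left after background subtraction, can be sketched as follows. This is a minimal illustration; the mask format and the toy frame are assumptions, not the thesis implementation.

```python
def mass_center(mask):
    """Center of mass (row, col) of a binary image mask."""
    rows = cols = count = 0
    for r, line in enumerate(mask):
        for c, pixel in enumerate(line):
            if pixel:  # foreground pixel belonging to the vehicle
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None  # no vehicle detected in this frame
    return rows / count, cols / count

# A 4x4 frame with a 2x2 "vehicle" in the lower-right corner:
frame = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
center = mass_center(frame)
```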

    Download full text (pdf)
    fulltext
  • 164.
    Aeddula, Omsri
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Ruvald, Ryan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Wall, Johan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Larsson, Tobias
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    AI-Driven Comprehension of Autonomous Construction Equipment Behavior for Improved PSS Development (2024). In: Proceedings of the 57th Annual Hawaii International Conference on System Sciences, University of Hawai'i at Manoa, 2024, p. 1017-1026. Conference paper (Refereed)
    Abstract [en]

    This paper presents an approach that utilizes artificial intelligence techniques to identify autonomous machine behavior patterns. The context for the investigation involves a fleet of prototype autonomous haulers as part of a Product-Service System (PSS) solution under development in the construction and mining industry. The approach involves using deep-learning-based object detection and computer vision to understand how prototype machines operate in different situations. The trained model accurately predicts and tracks the loaded and unloaded machines and helps to identify data patterns such as course deviations, machine failures, unexpected slowdowns, battery life, machine activity, number of cycles per charge, and speed. PSS solutions hinge on efficiently allocating resources to meet the required site-level output. Solution providers can make more informed decisions at the earlier stages of development by using the AI techniques outlined in the paper, considering asset management and reallocation of resources to account for unplanned stoppages or unexpected slowdowns. Understanding machine behavioral aspects in early-stage PSS development could enable more efficient and customized PSS solutions.

    Download full text (pdf)
    HICSS_Omsri
  • 165.
    Aeddula, Omsri
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Wall, Johan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Larsson, Tobias
    Blekinge Institute of Technology, Faculty of Engineering, Department of Mechanical Engineering.
    Artificial Neural Networks Supporting Cause and Effect Studies in Product-Service System Development (2021). In: Design for Tomorrow—Volume 1: Proceedings of ICoRD 2021 / [ed] Chakrabarti, A., Poovaiah, R., Bokil, P., Kant, V. (Eds.), Springer, 2021, Vol. I, article id 132. Conference paper (Refereed)
    Abstract [en]

    A data analysis method based on artificial neural networks, aiming to support cause-and-effect analysis in design exploration studies, is presented. The method clusters and aggregates the effects of multiple design variables based on the structural hierarchy of the evaluated system. The proposed method is exemplified in a case study showing that the predictive capability of the created, clustered dataset is comparable to that of the original, unmodified one. The proposed method is evaluated using the coefficient of determination, root mean square error, average relative error, and mean square error. The data analysis approach with artificial neural networks is believed to significantly improve the comprehensibility of the evaluated cause-and-effect relationships when studying PSS concepts in a cross-functional team, thereby assisting the difficult and resource-demanding negotiation process at the conceptual stage of design.
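    The four evaluation measures listed above can be illustrated with their textbook definitions. The paper's code is not published, so the function and the toy prediction values below are illustrative assumptions:

```python
import math

def regression_metrics(y_true, y_pred):
    """Textbook definitions of the four measures named in the abstract."""
    n = len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    mse = ss_res / n
    rmse = math.sqrt(mse)
    mean = sum(y_true) / n
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot  # coefficient of determination
    # Average relative error (assumes no true value is zero):
    are = sum(abs(t - p) / abs(t) for t, p in zip(y_true, y_pred)) / n
    return {"r2": r2, "rmse": rmse, "are": are, "mse": mse}

# Invented values standing in for ANN predictions vs. reference outputs:
metrics = regression_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```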

    Download full text (pdf)
    ICORD_21
  • 166.
    Aerva, Manasa Reddy
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing. Axis Communications, Lund.
    Devendra Venkata Sai Mani, Chakradhar Ghantasala
    Blekinge Institute of Technology, Faculty of Engineering, Department of Applied Signal Processing. Axis Communications, Lund.
    Blue Cool Connectivity Box (2017). Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The invention of closed-circuit television (CCTV) initiated a new trend in high-security video surveillance. More recently, CCTV cameras have been incorporating wireless LAN technology for data transfer, using on-chip memory storage until the time of update.

    In this thesis, short-range communication, namely Bluetooth Low Energy (Bluetooth Smart), is used to perform simple I/O applications. The two important components of the project are the camera and the Bluetooth module box. An external antenna is designed for the connectivity box, and the operating range of the box is deduced using a link budget. The Blue Cool connectivity box is assessed by defining the capabilities of the box, i.e., simple I/O operations. Field test measurements for the designed antenna establish the optimum communication range. The thesis also reviews software simulation tools that are essential for antenna design and path loss modelling. The accuracy of simulated measurements versus real-time measurements is also assessed. The primary target of the thesis is to detail the design of a cost-effective antenna based on link budget calculations and to perform basic I/O tasks wirelessly between the Blue Cool connectivity box and the camera. It is concluded that advanced operations can be added to the existing model in future work. It is also suggested that a model for multi-floor communication can be designed.
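    The link-budget reasoning used to deduce an operating range can be sketched with the free-space form of the Friis equation. The BLE figures below (0 dBm transmit power, isotropic antennas, -90 dBm receiver sensitivity at 2.4 GHz) are generic assumptions, not measurements from the thesis:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis form)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def link_margin_db(tx_dbm, tx_gain_dbi, rx_gain_dbi, sensitivity_dbm,
                   distance_m, freq_hz):
    """Received power minus receiver sensitivity; positive means the link closes."""
    rx_dbm = tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m, freq_hz)
    return rx_dbm - sensitivity_dbm

loss = fspl_db(1.0, 2.4e9)                # roughly 40 dB at one metre
margin = link_margin_db(0.0, 0.0, 0.0, -90.0, 10.0, 2.4e9)
```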

    Download full text (pdf)
    BTH2017Chakradhar
  • 167.
    Afaq, Muhammad
    et al.
    Blekinge Institute of Technology, School of Engineering.
    Faheem, Sahibzada Muhammad
    Blekinge Institute of Technology, School of Engineering.
    Performance Analysis of Selected Cooperative Relaying Techniques (2010). Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Recently, cooperative communication has gained significant interest due to the fact that it exploits spatial diversity and provides capacity/performance gains over conventional single-input single-output (SISO) systems. A mobile node with a single antenna can cooperate with a nearby single-antenna node in a multi-user environment to create the effect of a virtual multiple antenna system, hence reducing the complexity associated with actual multiple antenna systems. Despite its small size and power constraints, a mobile node can still benefit from spatial diversity by employing cooperation, thus saving transmission power and increasing the coverage range of the network. In this thesis, we have selected several relaying protocols, namely amplify-and-forward, decode-and-forward, detect-and-forward, and selective detect-and-forward, which are studied and implemented for two different relaying geometries, i.e. equidistant and collinear. Results are studied and compared with each other to show the performance of each protocol in terms of average symbol error probabilities. The considered system model has three nodes: source, relay and destination. Communicating nodes are considered to be half-duplex with a single antenna for transmission and reception. The source, when communicating with the destination, broadcasts the information, which is heard by the nearby relay. The relay then uses one of the cooperation protocols. Finally, the relayed signal reaches the destination, where it is detected by a maximal ratio combiner (MRC) and combined with the direct transmission for possible diversity gains. The transmission path, or channel, is modeled as frequency non-selective Rayleigh fading in the presence of additive white Gaussian noise (AWGN). The effect of path loss on cooperation has been observed for the collinear arrangement with exponential decay up to four. Considering the equidistant arrangement, decode-and-forward shows good performance at high signal-to-noise ratio (SNR) while amplify-and-forward is very promising at very low SNR. A selective relaying scheme called selective detect-and-forward is also presented, which outperforms its fixed counterparts over a wide range of SNRs.
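The amplify-and-forward/MRC setup described in this abstract can be illustrated with a small Monte Carlo simulation. This is a sketch only, not the thesis's actual implementation: BPSK modulation, unit-energy Rayleigh links, and equal average SNR on all links (the equidistant arrangement) are assumptions made here for brevity.

```python
import math
import random

def simulate_af_ser(snr_db, n_symbols=20000, seed=1):
    """Monte Carlo symbol error rate for BPSK with one amplify-and-forward
    relay and maximal ratio combining (MRC) at the destination.
    All links: flat Rayleigh fading plus AWGN, same average per-link SNR."""
    rng = random.Random(seed)
    snr = 10.0 ** (snr_db / 10.0)       # average per-link SNR (linear)
    n0 = 1.0 / snr                      # noise power for unit symbol energy
    sigma = math.sqrt(n0 / 2.0)         # noise std per real dimension

    def h():  # Rayleigh channel: complex Gaussian gain with E|h|^2 = 1
        return complex(rng.gauss(0, math.sqrt(0.5)), rng.gauss(0, math.sqrt(0.5)))

    def w():  # complex AWGN sample
        return complex(rng.gauss(0, sigma), rng.gauss(0, sigma))

    errors = 0
    for _ in range(n_symbols):
        s = rng.choice((-1.0, 1.0))                # BPSK symbol
        h_sd, h_sr, h_rd = h(), h(), h()
        y_sd = h_sd * s + w()                      # direct source -> destination
        y_sr = h_sr * s + w()                      # source -> relay
        g = 1.0 / math.sqrt(abs(h_sr) ** 2 + n0)   # AF power-normalising gain
        y_rd = h_rd * g * y_sr + w()               # relay -> destination
        h_eq = h_rd * g * h_sr                     # effective relayed channel
        n_rd = (abs(h_rd * g) ** 2 + 1.0) * n0     # relayed-branch noise power
        # MRC: weight each branch by conj(channel) / branch noise power
        z = (y_sd * h_sd.conjugate() / n0 + y_rd * h_eq.conjugate() / n_rd).real
        errors += (z > 0) != (s > 0)
    return errors / n_symbols
```

Plotting `simulate_af_ser` over a range of SNRs reproduces the qualitative behaviour the abstract reports: the combined direct-plus-relayed detection gains diversity over the direct link alone, with the gap widening at high SNR.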

    Download full text (pdf)
    FULLTEXT01
  • 168.
    Aftab, Adnan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mufti, Muhammad Nabeel
    Blekinge Institute of Technology, School of Computing.
    Spectrum sensing through implementation of USRP22011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Scarcity of the wireless spectrum has led to the development of new techniques for better utilization of the wireless spectrum. Demand for high data rates and better voice quality is driving the development of new wireless standards, making the wireless spectrum more limited than ever. In this era of wireless communication, service providers and telecom operators face a dilemma: they need a large amount of wireless spectrum to meet the ever increasing quality of service requirements of consumers. This has led to the development of spectrum sensing techniques to find the unused spectrum in the available frequency band. The results presented in this thesis will help develop a clear understanding of spectrum sensing techniques and a comparison of different spectrum sensing approaches. The experiments carried out using USRP2 and GNU Radio will help the reader to understand the concept of underutilized frequency bands and their importance in cognitive radios.
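A common spectrum sensing primitive behind USRP2/GNU Radio experiments of this kind is energy detection. The sketch below is illustrative only and not taken from the thesis; the Gaussian approximation of the noise-only statistic and the tabulated inverse Q-function values are assumptions for a minimal, self-contained example.

```python
import math

def energy_detect(samples, noise_var, pfa=0.01):
    """Energy detector: declare the band occupied when the measured energy
    exceeds a threshold chosen for a target false-alarm probability.
    Uses the Gaussian approximation of the noise-only (chi-square)
    statistic, which holds for large sample counts."""
    n = len(samples)
    energy = sum(x * x for x in samples)
    # Under H0 (noise only): E[energy] = n * noise_var and
    # Var[energy] = 2 * n * noise_var^2. Threshold via the inverse
    # Q-function for a few common false-alarm targets.
    q_inv = {0.01: 2.326, 0.05: 1.645, 0.10: 1.282}[pfa]
    threshold = noise_var * (n + q_inv * math.sqrt(2.0 * n))
    return energy > threshold
```

In a real cognitive-radio loop the samples would come from the USRP2 front end and `noise_var` from a calibration measurement; here both are left to the caller.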

    Download full text (pdf)
    FULLTEXT01
  • 169.
    Aftarczuk, Kamila
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Evaluation of selected data mining algorithms implemented in Medical Decision Support Systems2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The goal of this master’s thesis is to identify and evaluate data mining algorithms which are commonly implemented in modern Medical Decision Support Systems (MDSS). They are used in various healthcare units all over the world. These institutions store large amounts of medical data, which may contain relevant medical information hidden in various patterns buried among the records. Within the research, several popular MDSSs are analyzed in order to determine the most common data mining algorithms utilized by them. Three algorithms have been identified: Naïve Bayes, Multilayer Perceptron and C4.5. Prior to the analyses, the algorithms are calibrated: several configurations are tested in order to determine the best settings. Afterwards, an ultimate comparison of the algorithms orders them with respect to their performance. The evaluation is based on a set of performance metrics. The analyses are conducted in WEKA on five UCI medical datasets: breast cancer, hepatitis, heart disease, dermatology disease, and diabetes. The analyses have shown that it is very difficult to name a single data mining algorithm as the most suitable for medical data. The results gained for the algorithms were very similar. However, the final evaluation of the outcomes allowed singling out Naïve Bayes as the best classifier for the given domain, followed by the Multilayer Perceptron and C4.5.
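For readers unfamiliar with the first of the three algorithms, a minimal Gaussian naive Bayes classifier can be sketched as follows. This is an illustration of the algorithm family only, not the WEKA implementation the thesis evaluated; the variance floor is an assumption to keep the toy code numerically safe.

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes: per-class feature means and variances,
    prediction by maximum log-posterior."""

    def fit(self, X, y):
        by_class = defaultdict(list)
        for row, label in zip(X, y):
            by_class[label].append(row)
        self.priors, self.stats = {}, {}
        for label, rows in by_class.items():
            self.priors[label] = len(rows) / len(y)
            stats = []
            for col in zip(*rows):          # one feature column per class
                mu = sum(col) / len(col)
                var = sum((v - mu) ** 2 for v in col) / len(col)
                stats.append((mu, max(var, 1e-9)))  # floor avoids div by zero
            self.stats[label] = stats
        return self

    def predict(self, x):
        def log_pdf(v, mu, var):
            return -0.5 * (math.log(2 * math.pi * var) + (v - mu) ** 2 / var)
        scores = {
            label: math.log(self.priors[label])
                   + sum(log_pdf(v, mu, var)
                         for v, (mu, var) in zip(x, stats))
            for label, stats in self.stats.items()}
        return max(scores, key=scores.get)
```

The "naive" independence assumption is visible in the sum of per-feature log densities; despite it, the abstract's finding that Naïve Bayes performed best on the UCI medical datasets is a common result in practice.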

    Download full text (pdf)
    FULLTEXT01
  • 170. Afzal, Wasif
    Lessons from applying experimentation in software engineering prediction systems2008Conference paper (Refereed)
    Abstract [en]

    Within software engineering prediction systems, experiments are undertaken primarily to investigate relationships and to measure/compare models' accuracy. This paper discusses our experience and presents useful lessons/guidelines in experimenting with software engineering prediction systems. For this purpose, we use a typical software engineering experimentation process as a baseline. We found that the typical software engineering experimentation process is supportive in developing prediction systems, and we highlight issues more central to the domain of software engineering prediction systems.

    Download full text (pdf)
    Lessons from applying experimentation in software engineering prediction systems
  • 171.
    Afzal, Wasif
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Metrics in Software Test Planning and Test Design Processes2007Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Software metrics play an important role in measuring attributes that are critical to the success of a software project. Measurement of these attributes helps to make the characteristics and relationships between the attributes clearer. This in turn supports informed decision making. The field of software engineering is affected by infrequent, incomplete and inconsistent measurements. Software testing is an integral part of software development, providing opportunities for measurement of process attributes. The measurement of software testing process attributes enables the management to have better insight into the software testing process. The aim of this thesis is to investigate the metric support for software test planning and test design processes. The study comprises an extensive literature survey and follows a methodical approach consisting of two steps. The first step analyzes key phases in the software testing life cycle, the inputs required for starting the software test planning and design processes, and the metrics indicating the end of software test planning and test design processes. After establishing a basic understanding of the related concepts, the second step identifies the attributes of software test planning and test design processes, including metric support for each of the identified attributes. The results of the literature survey showed that there are a number of different measurable attributes for software test planning and test design processes. The study partitioned these attributes into multiple categories for software test planning and test design processes. For each of these attributes, different existing measurements are studied. A consolidation of these measurements is presented in this thesis, which is intended to provide an opportunity for management to consider improvement in these processes.

    Download full text (pdf)
    FULLTEXT01
  • 172. Afzal, Wasif
    Search-based approaches to software fault prediction and software testing2009Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Software verification and validation activities are essential for software quality but also constitute a large part of software development costs. Therefore, efficient and cost-effective software verification and validation activities are both a priority and a necessity considering the pressure to decrease time-to-market and the intense competition faced by many, if not all, companies today. It is then perhaps not unexpected that decisions related to software quality, such as when to stop testing, testing schedules and testing resource allocation, need to be as accurate as possible. This thesis investigates the application of search-based techniques within two activities of software verification and validation: software fault prediction and software testing for non-functional system properties. Software fault prediction modeling can provide support for making important decisions as outlined above. In this thesis we empirically evaluate symbolic regression using genetic programming (a search-based technique) as a potential method for software fault prediction. Using data sets from both industrial and open-source software, the strengths and weaknesses of applying symbolic regression in genetic programming are evaluated against competitive techniques. In addition to software fault prediction, this thesis also consolidates available research into predictive modeling of other attributes by applying symbolic regression in genetic programming, thus presenting a broader perspective. As an extension to the application of search-based techniques within software verification and validation, this thesis further investigates the extent of application of search-based techniques for testing non-functional system properties. Based on the research findings in this thesis it can be concluded that applying symbolic regression in genetic programming may be a viable technique for software fault prediction.
We additionally seek literature evidence where other search-based techniques are applied for testing of non-functional system properties, hence contributing towards the growing application of search-based techniques in diverse activities within software verification and validation.
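The core idea of symbolic regression that recurs throughout this thesis can be sketched with a toy evolutionary loop over expression trees. This is a deliberately simplified (mu+lambda)-style search with mutation only; full GP with crossover and tournament selection, as used in the thesis, is richer. All constants and parameters here are illustrative assumptions.

```python
import operator
import random

OPS = [(operator.add, '+'), (operator.sub, '-'), (operator.mul, '*')]

def random_tree(rng, depth=3):
    """Grow a random expression tree over one variable x and small constants."""
    if depth == 0 or rng.random() < 0.3:
        return ('x',) if rng.random() < 0.6 else ('c', rng.randint(-3, 3))
    op = rng.choice(OPS)
    return ('op', op, random_tree(rng, depth - 1), random_tree(rng, depth - 1))

def evaluate(tree, x):
    if tree[0] == 'x':
        return x
    if tree[0] == 'c':
        return tree[1]
    return tree[1][0](evaluate(tree[2], x), evaluate(tree[3], x))

def fitness(tree, data):
    """Sum of squared errors over the (x, y) data points (lower is better)."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

def mutate(tree, rng):
    """Replace a random subtree with a fresh random tree."""
    if rng.random() < 0.3 or tree[0] != 'op':
        return random_tree(rng, 2)
    if rng.random() < 0.5:
        return ('op', tree[1], mutate(tree[2], rng), tree[3])
    return ('op', tree[1], tree[2], mutate(tree[3], rng))

def evolve(data, generations=200, pop_size=30, seed=7):
    """Keep the best tree each generation, refill with mutated copies."""
    rng = random.Random(seed)
    pop = [random_tree(rng) for _ in range(pop_size)]
    best = min(pop, key=lambda t: fitness(t, data))
    for _ in range(generations):
        pop = [best] + [mutate(best if rng.random() < 0.5 else rng.choice(pop), rng)
                        for _ in range(pop_size - 1)]
        cand = min(pop, key=lambda t: fitness(t, data))
        if fitness(cand, data) < fitness(best, data):
            best = cand
    return best
```

The key property the abstract highlights carries over even to this toy: no model structure is assumed up front; the search discovers both the form and the coefficients from the data.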

    Download full text (pdf)
    FULLTEXT01
  • 173. Afzal, Wasif
    Search-Based Prediction of Software Quality: Evaluations and Comparisons2011Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Software verification and validation (V&V) activities are critical for achieving software quality; however, these activities also constitute a large part of the costs when developing software. Therefore, efficient and effective software V&V activities are both a priority and a necessity considering the pressure to decrease time-to-market and the intense competition faced by many, if not all, companies today. It is then perhaps not unexpected that decisions that affect software quality, e.g., how to allocate testing resources, develop testing schedules and decide when to stop testing, need to be as stable and accurate as possible. The objective of this thesis is to investigate how search-based techniques can support decision-making and help control variation in software V&V activities, thereby indirectly improving software quality. Several themes in providing this support are investigated: predicting reliability of future software versions based on fault history; fault prediction to improve test phase efficiency; assignment of resources to fixing faults; and distinguishing fault-prone software modules from non-faulty ones. A common element in these investigations is the use of search-based techniques, often also called metaheuristic techniques, for supporting the V&V decision-making processes. Search-based techniques are promising since, as with many real-world problems, software V&V tasks can be formulated as optimization problems where near-optimal solutions are often good enough. Moreover, these techniques are general optimization solutions that can potentially be applied across a larger variety of decision-making situations than other existing alternatives.
Apart from presenting the current state of the art, in the form of a systematic literature review, and doing comparative evaluations of a variety of metaheuristic techniques on large-scale projects (both industrial and open-source), this thesis also presents methodological investigations using search-based techniques that are relevant to the task of software quality measurement and prediction. The results of applying search-based techniques in large-scale projects, while investigating a variety of research themes, show that they consistently give competitive results in comparison with existing techniques. Based on the research findings, we conclude that search-based techniques are viable techniques to use in supporting the decision-making processes within software V&V activities. The accuracy and consistency of these techniques make them important tools when developing future decision-support for effective management of software V&V activities.

    Download full text (pdf)
    FULLTEXT01
  • 174.
    Afzal, Wasif
    Blekinge Institute of Technology.
    Using faults-slip-through metric as a predictor of fault-proneness2010In: Proceedings - Asia-Pacific Software Engineering Conference, APSEC, IEEE , 2010Conference paper (Refereed)
    Abstract [en]

    The majority of software faults are present in a small number of modules; therefore, accurate prediction of fault-prone modules helps improve software quality by focusing testing efforts on a subset of modules. This paper evaluates the use of the faults-slip-through (FST) metric as a potential predictor of fault-prone modules. Rather than predicting the fault-prone modules for the complete test phase, the prediction is done at the specific test levels of integration and system test. We applied eight classification techniques to the task of identifying fault-prone modules, representing a variety of approaches, including a standard statistical technique for classification (logistic regression), tree-structured classifiers (C4.5 and random forests), a Bayesian technique (Naïve Bayes), machine-learning techniques (support vector machines and back-propagation artificial neural networks) and search-based techniques (genetic programming and artificial immune recognition systems), on FST data collected from two large industrial projects from the telecommunication domain. Results: Using the area under the receiver operating characteristic (ROC) curve and the location of (PF, PD) pairs in the ROC space, GP showed impressive results in comparison with other techniques for predicting fault-prone modules at both integration and system test levels. The use of the faults-slip-through metric in general provided good prediction results at the two test levels. (i) The accuracy of GP is statistically significant in comparison with the majority of the techniques for predicting fault-prone modules at integration and system test levels. (ii) The faults-slip-through metric has the potential to be a generally useful predictor of fault-proneness at integration and system test levels.
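The (PF, PD) pairs and AUC used as evaluation criteria in this paper (and in several of the entries below) are straightforward to compute. A minimal sketch, with the convention that label 1 means fault-prone:

```python
def pf_pd(actual, predicted):
    """Probability of false alarm (PF) and probability of detection (PD)
    for binary fault-proneness predictions (1 = fault-prone)."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return fp / (fp + tn), tp / (tp + fn)

def auc(actual, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a random fault-prone module is scored higher
    than a random non-fault-prone one (ties count 1/2)."""
    pos = [s for a, s in zip(actual, scores) if a == 1]
    neg = [s for a, s in zip(actual, scores) if a == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A classifier whose (PF, PD) pair sits toward the upper-left of ROC space (low PF, high PD) is in the "preferred region" the paper refers to; an AUC of 0.5 corresponds to random ranking and 1.0 to a perfect one.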

    Download full text (pdf)
    fulltext
  • 175. Afzal, Wasif
    et al.
    Ghazi, Ahmad Nauman
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Itkonen, Juha
    Torkar, Richard
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Andrews, Anneliese
    Bhatti, Khurram
    An experiment on the effectiveness and efficiency of exploratory testing2015In: Empirical Software Engineering, ISSN 1382-3256, Vol. 20, no 3, p. 844-878Article in journal (Refereed)
    Abstract [en]

    The exploratory testing (ET) approach is commonly applied in industry, but lacks scientific research. The scientific community needs quantitative results on the performance of ET taken from realistic experimental settings. The objective of this paper is to quantify the effectiveness and efficiency of ET vs. testing with documented test cases (test case based testing, TCT). We performed four controlled experiments where a total of 24 practitioners and 46 students performed manual functional testing using ET and TCT. We measured the number of identified defects in the 90-minute testing sessions, the detection difficulty, severity and types of the detected defects, and the number of false defect reports. The results show that ET found a significantly greater number of defects. ET also found significantly more defects of varying levels of difficulty, types and severity levels. However, the two testing approaches did not differ significantly in terms of the number of false defect reports submitted. We conclude that ET was more efficient than TCT in our experiment. ET was also more effective than TCT when detection difficulty, type of defects and severity levels are considered. The two approaches are comparable when it comes to the number of false defect reports submitted.

    Download full text (pdf)
    fulltext
  • 176. Afzal, Wasif
    et al.
    Torkar, Richard
    A Comparative Evaluation of Using Genetic Programming for Predicting Fault Count Data2008Conference paper (Refereed)
    Abstract [en]

    A number of software reliability growth models (SRGMs) have been proposed in the literature. Due to several reasons, such as violation of the models' assumptions and the complexity of the models, practitioners face difficulties in knowing which models to apply in practice. This paper presents a comparative evaluation of traditional models and the use of genetic programming (GP) for modeling software reliability growth based on weekly fault count data of three different industrial projects. The motivation for using a GP approach is its ability to evolve a model based entirely on prior data without the need to make underlying assumptions. The results show the strengths of using GP for predicting fault count data.

  • 177.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Torkar, Richard
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Incorporating Metrics in an Organizational Test Strategy2008Conference paper (Refereed)
    Abstract [en]

    An organizational level test strategy needs to incorporate metrics to make the testing activities visible and available to process improvements. The majority of testing measurements that are done are based on faults found in the test execution phase. In contrast, this paper investigates metrics to support software test planning and test design processes. We have assembled metrics in these two process types to support management in carrying out evidence-based test process improvements and to incorporate suitable metrics as part of an organization level test strategy. The study is composed of two steps. The first step creates a relevant context by analyzing key phases in the software testing lifecycle, while the second step identifies the attributes of software test planning and test design processes along with metric(s) support for each of the identified attributes.

    Download full text (pdf)
    FULLTEXT01
  • 178. Afzal, Wasif
    et al.
    Torkar, Richard
    On the application of genetic programming for software engineering predictive modeling: A systematic review2011In: Expert Systems with Applications, ISSN 0957-4174 , Vol. 38, no 9, p. 11984-11997Article, review/survey (Refereed)
    Abstract [en]

    The objective of this paper is to investigate the evidence for symbolic regression using genetic programming (GP) being an effective method for prediction and estimation in software engineering, when compared with regression/machine learning models and other comparison groups (including comparisons with different improvements over the standard GP algorithm). We performed a systematic review of literature that compared genetic programming models with comparative techniques based on different independent project variables. A total of 23 primary studies were obtained after searching different information sources in the time span 1995-2008. The results of the review show that symbolic regression using genetic programming has been applied in three domains within software engineering predictive modeling: (i) software quality classification (eight primary studies); (ii) software cost/effort/size estimation (seven primary studies); (iii) software fault prediction/software reliability growth modeling (eight primary studies). While there is evidence in support of using genetic programming for software quality classification, software fault prediction and software reliability growth modeling, the results are inconclusive for software cost/effort/size estimation.

  • 179. Afzal, Wasif
    et al.
    Torkar, Richard
    Suitability of Genetic Programming for Software Reliability Growth Modeling2008Conference paper (Refereed)
    Abstract [en]

    Genetic programming (GP) has been found to be effective in finding a model that fits the given data points without making any assumptions about the model structure. This makes GP a reasonable choice for software reliability growth modeling. This paper discusses the suitability of using GP for software reliability growth modeling and highlights the mechanisms that enable GP to progressively search for fitter solutions.

  • 180. Afzal, Wasif
    et al.
    Torkar, Richard
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Towards benchmarking feature subset selection methods for software fault prediction2016In: Studies in Computational Intelligence, Springer, 2016, 617, Vol. 617, p. 33-58Chapter in book (Refereed)
    Abstract [en]

    Despite the general acceptance that software engineering datasets often contain noisy, irrelevant or redundant variables, very few benchmark studies of feature subset selection (FSS) methods on real-life data from software projects have been conducted. This paper provides an empirical comparison of state-of-the-art FSS methods: information gain attribute ranking (IG); Relief (RLF); principal component analysis (PCA); correlation-based feature selection (CFS); consistency-based subset evaluation (CNS); wrapper subset evaluation (WRP); and an evolutionary computation method, genetic programming (GP), on five fault prediction datasets from the PROMISE data repository. For all the datasets, the area under the receiver operating characteristic curve—the AUC value averaged over 10-fold cross-validation runs—was calculated for each FSS method-dataset combination before and after FSS. Two diverse learning algorithms, C4.5 and naïve Bayes (NB), are used to test the attribute sets given by each FSS method. The results show that although there are no statistically significant differences between the AUC values for the different FSS methods for both C4.5 and NB, a smaller set of FSS methods (IG, RLF, GP) consistently select fewer attributes without degrading classification accuracy. We conclude that in general, FSS is beneficial as it helps improve the classification accuracy of NB and C4.5. There is no single best FSS method for all datasets, but IG, RLF and GP consistently select fewer attributes without degrading classification accuracy within statistically significant boundaries. © Springer International Publishing Switzerland 2016.
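The simplest of the compared FSS methods, information gain attribute ranking (IG), can be sketched in a few lines for discrete attributes. This is an illustration of the measure itself, not the WEKA ranker used in the chapter; continuous attributes would need discretization first.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """Information gain of one discrete attribute w.r.t. the class labels:
    H(class) minus the value-weighted conditional entropy."""
    n = len(labels)
    groups = {}
    for v, lab in zip(values, labels):
        groups.setdefault(v, []).append(lab)
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - cond

def rank_attributes(rows, labels):
    """Rank attribute indices by descending information gain (IG ranking)."""
    cols = list(zip(*rows))
    gains = [(information_gain(col, labels), i) for i, col in enumerate(cols)]
    return [i for _, i in sorted(gains, reverse=True)]
```

Selecting the top-k indices from `rank_attributes` gives the reduced attribute set that a learner such as C4.5 or NB would then be trained on.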

  • 181. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    A Systematic Mapping Study on Non-Functional Search-Based Software Testing2008Conference paper (Refereed)
  • 182. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    A systematic review of search-based testing for non-functional system properties2009In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, Vol. 51, no 6, p. 957-976Article in journal (Refereed)
    Abstract [en]

    Search-based software testing is the application of metaheuristic search techniques to generate software tests. The test adequacy criterion is transformed into a fitness function, and a set of solutions in the search space is evaluated with respect to the fitness function using a metaheuristic search technique. The application of metaheuristic search techniques for testing is promising because exhaustive testing is infeasible considering the size and complexity of software under test. Search-based software testing has been applied across the spectrum of test case design methods; this includes white-box (structural), black-box (functional) and grey-box (combination of structural and functional) testing. In addition, metaheuristic search techniques have also been applied to test non-functional properties. The overall objective of undertaking this systematic review is to examine existing work into non-functional search-based software testing (NFSBST). We are interested in the types of non-functional testing targeted using metaheuristic search techniques, the different fitness functions used in different types of search-based non-functional testing, and the challenges in the application of these techniques. The systematic review is based on a comprehensive set of 35 articles, obtained after a multi-stage selection process and published in the time span 1996-2007. The results of the review show that metaheuristic search techniques have been applied for non-functional testing of execution time, quality of service, security, usability and safety. A variety of metaheuristic search techniques are found to be applicable for non-functional testing, including simulated annealing, tabu search, genetic algorithms, ant colony methods, grammatical evolution, genetic programming (and its variants including linear genetic programming) and swarm intelligence methods.
The review reports on different fitness functions used to guide the search for each of the categories of execution time, safety, usability, quality of service and security; along with a discussion of possible challenges in the application of metaheuristic search techniques.
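One of the reviewed combinations, simulated annealing guided by an execution-time fitness, can be sketched as follows. Everything here is a hypothetical illustration: the system under test is a toy function whose iteration count stands in for measured execution time (real timing is noisy), and the neighbourhood, cooling schedule and constants are assumptions.

```python
import math
import random

def sut_iterations(x):
    """Toy system under test: the Collatz iteration count serves as a
    deterministic proxy for execution time."""
    count = 0
    while x > 1:
        x = x // 2 if x % 2 == 0 else 3 * x + 1
        count += 1
    return count

def anneal_worst_case(lo, hi, steps=5000, seed=3):
    """Simulated annealing over the integer input domain, maximising the
    execution-cost proxy; worse neighbours are accepted with probability
    exp(delta / T) under a linear cooling schedule."""
    rng = random.Random(seed)
    x = rng.randint(lo, hi)
    f = sut_iterations(x)
    best_x, best_f = x, f
    for step in range(steps):
        t = max(1e-3, 1.0 - step / steps) * 10.0          # temperature
        y = min(hi, max(lo, x + rng.randint(-50, 50)))    # random neighbour
        fy = sut_iterations(y)
        if fy >= f or rng.random() < math.exp((fy - f) / t):
            x, f = y, fy
            if f > best_f:
                best_x, best_f = x, f
    return best_x, best_f
```

The inputs the search converges to are candidate worst-case test cases; in the reviewed work the fitness would instead be measured execution time, quality of service, or another non-functional property.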

  • 183. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    Prediction of fault count data using genetic programming2008Conference paper (Refereed)
    Abstract [en]

    Software reliability growth modeling helps in deciding project release time and managing project resources. A large number of such models have been presented in the past. Due to the existence of many models, the models' inherent complexity, and their accompanying assumptions; the selection of suitable models becomes a challenging task. This paper presents empirical results of using genetic programming (GP) for modeling software reliability growth based on weekly fault count data of three different industrial projects. The goodness of fit (adaptability) and predictive accuracy of the evolved model is measured using five different measures in an attempt to present a fair evaluation. The results show that the GP evolved model has statistically significant goodness of fit and predictive accuracy.

  • 184.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology, School of Computing.
    Torkar, Richard
    Blekinge Institute of Technology, School of Computing.
    Feldt, Robert
    Blekinge Institute of Technology, School of Computing.
    Resampling Methods in Software Quality Classification2012In: International Journal of Software Engineering and Knowledge Engineering, ISSN 0218-1940, Vol. 22, no 2, p. 203-223Article in journal (Refereed)
    Abstract [en]

    In the presence of a number of algorithms for classification and prediction in software engineering, there is a need for a systematic way of assessing their performances. The performance assessment is typically done by some form of partitioning or resampling of the original data to alleviate biased estimation. For predictive and classification studies in software engineering, there is a lack of definitive advice on the most appropriate resampling method to use. This is seen as one of the contributing factors for not being able to draw general conclusions on what modeling technique or set of predictor variables are the most appropriate. Furthermore, the use of a variety of resampling methods makes it impossible to perform any formal meta-analysis of the primary study results. Therefore, it is desirable to examine the influence of various resampling methods and to quantify possible differences. Objective and method: This study empirically compares five common resampling methods (hold-out validation, repeated random sub-sampling, 10-fold cross-validation, leave-one-out cross-validation and non-parametric bootstrapping) using 8 publicly available data sets with genetic programming (GP) and multiple linear regression (MLR) as software quality classification approaches. The location of (PF, PD) pairs in the ROC (receiver operating characteristics) space and the area under an ROC curve (AUC) are used as accuracy indicators. Results: In terms of the location of (PF, PD) pairs in the ROC space, bootstrapping results are in the preferred region for 3 of the 8 data sets for GP and for 4 of the 8 data sets for MLR. Based on the AUC measure, there are no significant differences between the different resampling methods using GP and MLR. Conclusion: There can be certain data set properties responsible for insignificant differences between the resampling methods based on AUC.
These include imbalanced data sets, insignificant predictor variables and high-dimensional data sets. With the current selection of data sets and classification techniques, bootstrapping is a preferred method based on the location of (PF, PD) pair data in the ROC space. Hold-out validation is not a good choice for comparatively smaller data sets, where leave-one-out cross-validation (LOOCV) performs better. For comparatively larger data sets, 10-fold cross-validation performs better than LOOCV.
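The index-generation logic behind three of the five compared resampling methods (k-fold cross-validation, LOOCV as its k = n special case, and the non-parametric bootstrap) can be sketched compactly. A minimal sketch for illustration; the seeding and single-shuffle design are assumptions, not the paper's setup.

```python
import random

def kfold_indices(n, k=10, seed=0):
    """Shuffle indices once, then yield (train, test) index lists for
    k-fold cross-validation; k = n gives leave-one-out (LOOCV)."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

def bootstrap_indices(n, seed=0):
    """One non-parametric bootstrap split: sample n indices with
    replacement for training; out-of-bag indices form the test set."""
    rng = random.Random(seed)
    train = [rng.randrange(n) for _ in range(n)]
    test = [i for i in range(n) if i not in set(train)]
    return train, test
```

The structural difference the paper's results hinge on is visible here: every example appears in exactly one cross-validation test fold, whereas a bootstrap test set is the roughly 36.8% of examples left out of the resample.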

  • 185. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    Search-based prediction of fault count data2009Conference paper (Refereed)
    Abstract [en]

    Symbolic regression, an application domain of genetic programming (GP), aims to find a function whose output has some desired property, like matching target values of a particular data set. While typical regression involves finding the coefficients of a pre-defined function, symbolic regression finds a general function, with coefficients, fitting the given set of data points. The concepts of symbolic regression using genetic programming can be used to evolve a model for fault count predictions. Such a model has the advantages that the evolution does not depend on a particular structure of the model and is also independent of any assumptions, which are common in traditional time-domain parametric software reliability growth models. This research applies genetic programming experiments targeting fault count predictions and compares the results with traditional approaches to assess potential efficiency gains.

    Download full text (pdf)
    FULLTEXT01
  • 186.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology.
    Torkar, Richard
    Blekinge Institute of Technology.
    Feldt, Robert
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Gorschek, Tony
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Genetic programming for cross-release fault count predictions in large and complex software projects2010In: Evolutionary Computation and Optimization Algorithms in Software Engineering: Applications and Techniques / [ed] Chis, Monica, IGI Global, Hershey, USA , 2010Chapter in book (Refereed)
    Abstract [en]

    Software fault prediction can play an important role in ensuring software quality through efficient resource allocation. This could, in turn, reduce the potentially high consequential costs due to faults. Predicting faults might be even more important with the emergence of short-timed and multiple software releases aimed at quick delivery of functionality. Previous research in software fault prediction has indicated a need i) to improve the validity of results through comparisons across a number of data sets from a variety of software, ii) to use appropriate model evaluation measures and iii) to use statistical testing procedures. Moreover, cross-release prediction of faults has not yet received sufficient attention in the literature. In an attempt to address these concerns, this chapter compares the quantitative and qualitative attributes of 7 traditional and machine-learning techniques for modeling the cross-release prediction of fault count data. The comparison is done using extensive data sets gathered from a total of 7 multi-release open-source and industrial software projects. Together, these projects span several years of development and come from diverse application areas, ranging from a web browser to robotic controller software. Our quantitative analysis suggests that genetic programming (GP) tends to be more consistent in terms of goodness of fit and accuracy across the majority of data sets, and also shows comparatively less model bias. Qualitatively, ease of configuration and model complexity are weaker points for GP, even though it generalizes well and produces transparent models. Artificial neural networks did not perform as well as expected, while linear regression gave average predictions in terms of goodness of fit and accuracy. Support vector machine regression and traditional software reliability growth models performed below average on most of the quantitative evaluation criteria and remained average on most of the qualitative measures.
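The cross-release setup described above — calibrate a model on one release, predict fault counts for the next, then score goodness of fit and accuracy — can be sketched as follows. This is an illustrative sketch only: the fault-count data are synthetic, the variable names are assumptions, and plain linear regression (one of the compared techniques) stands in for the modelling step, since a full GP run would not fit in a short example.

```python
import numpy as np

# Hypothetical weekly cumulative fault counts for two consecutive releases
# (synthetic data, not taken from the chapter's projects).
weeks_r1 = np.arange(1, 16)
faults_r1 = np.array([3, 7, 12, 18, 22, 27, 30, 34, 36, 39, 41, 42, 44, 45, 46])
weeks_r2 = np.arange(1, 11)
faults_r2 = np.array([2, 6, 11, 15, 20, 23, 26, 28, 30, 31])

# Cross-release setup: fit on release 1, predict release 2.
coeffs = np.polyfit(weeks_r1, faults_r1, deg=1)
pred_r2 = np.polyval(coeffs, weeks_r2)

# Accuracy scored as average absolute error on the held-out release.
aae = np.mean(np.abs(pred_r2 - faults_r2))
print(f"average absolute error on next release: {aae:.2f} faults")
```

Swapping `np.polyfit` for a GP-evolved expression, a neural network or a reliability growth model, while keeping the train-on-release-N / test-on-release-N+1 split fixed, gives the kind of comparison the chapter reports.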

  • 187. Afzal, Wasif
    et al.
    Torkar, Richard
    Blekinge Institute of Technology, School of Computing.
    Feldt, Robert
    Blekinge Institute of Technology, School of Computing.
    Gorschek, Tony
    Blekinge Institute of Technology, School of Computing.
    Prediction of faults-slip-through in large software projects: an empirical evaluation2014In: Software quality journal, ISSN 0963-9314, E-ISSN 1573-1367, Vol. 22, no 1, p. 51-86Article in journal (Refereed)
    Abstract [en]

    A large percentage of the cost of rework can be avoided by finding more faults earlier in a software test process. Therefore, determining which software test phases to focus improvement work on has considerable industrial interest. We evaluate a number of prediction techniques for predicting the number of faults slipping through to the unit, function, integration, and system test phases of a large industrial project. The objective is to quantify improvement potential in different test phases by striving toward finding the faults in the right phase. The results show that a range of techniques are useful in predicting the number of faults slipping through to the four test phases; however, the group of search-based techniques (genetic programming, gene expression programming, artificial immune recognition system, and particle swarm optimization-based artificial neural network) consistently gives better predictions, being represented at all of the test phases. Human predictions are consistently better at two of the four test phases. We conclude that human predictions regarding the number of faults slipping through to various test phases can be well supported by the use of search-based techniques. A combination of human prediction and an automated search mechanism (such as any of the search-based techniques) has the potential to provide improved prediction results.

    Download full text (pdf)
    FULLTEXT01
  • 188.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology.
    Torkar, Richard
    Blekinge Institute of Technology.
    Feldt, Robert
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wikstrand, Greger
    KnowIT YAHM Sweden AB, SWE.
    Search-based prediction of fault-slip-through in large software projects2010In: Proceedings - 2nd International Symposium on Search Based Software Engineering, SSBSE 2010, IEEE , 2010, p. 79-88Conference paper (Refereed)
    Abstract [en]

    A large percentage of the cost of rework can be avoided by finding more faults earlier in a software testing process. Therefore, determining which software testing phases to focus improvement work on has considerable industrial interest. This paper evaluates the use of five different techniques, namely particle swarm optimization-based artificial neural networks (PSO-ANN), artificial immune recognition systems (AIRS), gene expression programming (GEP), genetic programming (GP) and multiple regression (MR), for predicting the number of faults slipping through the unit, function, integration and system testing phases. The objective is to quantify improvement potential in different testing phases by striving towards finding the right faults in the right phase. We have conducted an empirical study of two large projects from a telecommunication company developing mobile platforms and wireless semiconductors. The results are compared using simple residuals, goodness of fit and absolute relative error measures. They indicate that the four search-based techniques (PSO-ANN, AIRS, GEP, GP) perform better than multiple regression for predicting the fault-slip-through for each of the four testing phases. At the unit and function testing phases, AIRS and PSO-ANN performed better, while GP performed better at the integration and system testing phases. The study concludes that a variety of search-based techniques are applicable for predicting the improvement potential in different testing phases, with GP showing more consistent performance across two of the four test phases.
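One of the search-based techniques named above, particle swarm optimization, can be sketched in minimal form. In the study PSO is used to train artificial neural networks; here, as an assumption for brevity, it simply fits a two-parameter linear model to synthetic data, just to make the search loop itself visible. Objective function, data and swarm parameters are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data (illustrative, not the paper's): x could be, say, a code
# metric and y an observed fault-slip-through count.
x = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
y = np.array([12.0, 21.0, 33.0, 41.0, 52.0])

def cost(p):  # p = [slope, intercept]; sum of squared prediction errors
    return np.sum((p[0] * x + p[1] - y) ** 2)

# Minimal particle swarm: each particle remembers its own best position
# (pbest) and is also pulled toward the swarm-wide best (gbest).
n, dims, iters = 20, 2, 200
pos = rng.uniform(-5, 5, (n, dims))
vel = np.zeros((n, dims))
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dims)), rng.random((n, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("best slope/intercept:", gbest, "cost:", cost(gbest))
```

Replacing the two-parameter vector with an ANN's weight vector turns this same loop into the PSO-ANN training scheme the paper evaluates.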

  • 189.
    Agadagba, Efeoghene
    Blekinge Institute of Technology, School of Planning and Media Design.
    Identity Construction on Social Network Sites: Facebook2011Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    Digital Identity

    Download full text (pdf)
    FULLTEXT01
  • 190.
    Agadagba, Kelvin Yoreme
    Blekinge Institute of Technology, School of Planning and Media Design.
    PRIVACY AND IDENTITY MANAGEMENT ON SOCIAL NETWORKING SITES WITH REFERENCE TO FACEBOOK2011Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    In their article “Social network sites: Definition, History, and Scholarship” (2007), Nicole B. Ellison and Danah M. Boyd define social network sites as “Web-based services that allow individuals to (1) Construct a public or semi-public profile within a bounded system, (2) Articulate a list of other users with whom they share a connection, and (3) View and traverse their list of connections and those made by others within the system”. In other words, Social Networking Sites (SNSs) are websites that are designed to simplify communication between users who share similar activities, attitudes and interests. Today the growth and role of social networking sites has become an issue not only for the users themselves but also for scholars and industrial researchers. My aim in this research is to explore Social Networking Sites in general. The concept of Social Networking Sites is very broad; my main study therefore deals primarily with how privacy and restrictions play a role in identity management, with reference to Facebook.

    Download full text (pdf)
    FULLTEXT01
  • 191. Agardh, Johannes
    et al.
    Johansson, Martin
    Pettersson, Mårten
    Designing Future Interaction with Today's Technology1999Other (Other academic)
    Abstract [en]

    Information Technology plays an increasing part in our lives. In this thesis we discuss how technology can relate to humans and human activity. We take our starting point in concepts like Calm Technology and Tacit Interaction and examine how these visions and concepts can be used in the process of designing an artifact for a real work practice. We have done workplace studies of how truck drivers and traffic leaders find their way to the right addresses, and we design a truck navigation system that aims to suit the truck drivers' work practice.

    Download full text (pdf)
    FULLTEXT01
    Download full text (pdf)
    FULLTEXT02
  • 192.
    Agardh, Johannes
    et al.
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Johansson, Martin
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Pettersson, Mårten
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Designing Future Interaction with Today's Technology1999Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Information Technology plays an increasing part in our lives. In this thesis we discuss how technology can relate to humans and human activity. We take our starting point in concepts like Calm Technology and Tacit Interaction and examine how these visions and concepts can be used in the process of designing an artifact for a real work practice. We have done workplace studies of how truck drivers and traffic leaders find their way to the right addresses, and we design a truck navigation system that aims to suit the truck drivers' work practice.

  • 193. Agbesi, Collinson Colin Mawunyo
    Promoting Accountable Governance Through Electronic Government2016Independent thesis Advanced level (degree of Master (One Year)), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Electronic government (e-Government) is a purposeful system of organized delegation of power, control, management and resource allocation in a harmonized, centralized or decentralized way via networks, assuring efficiency, effectiveness and transparency of processes and transactions. This new phenomenon is changing the way governments all over the world do business and deliver services. The drive to improve service to citizens and other groups, and to manage scarce resources efficiently, has meant that governments seek alternative ways of rendering services and running management processes. Analog and mechanical processes of governing and management have proved inefficient and unproductive in recent times. The search for alternative and better ways of governing and control has revealed that digital and electronic governing is the better alternative, more beneficial than mechanical processes of governing. The internet and information and communication technology (ICT/IT) have brought significant change to governments. Research in the area of electronic government has also increased, but the field still lacks a sound theoretical framework, which is necessary for a better understanding of the factors influencing the adoption of electronic government systems and the integration of various electronic government applications.

    The efficient and effective allocation and distribution of scarce resources has also become an issue, and there has been a concerted global effort to improve the use and management of scarce resources over the last decade. The purpose of this research is to gain an in-depth and better understanding of how electronic government can be used to provide accountability, security and transparency in government decision-making processes when allocating and distributing resources in the educational sector of Ghana. Research questions were developed to help achieve this aim. The study also provides a detailed literature review, which helped to answer the research questions and guide data collection. A combined quantitative and qualitative research method was chosen to collect vital information and better understand the study area. Both a self-administered questionnaire and interviews were used to collect data relevant to the study, and a thorough analysis of related work was conducted.

    Finally, the research concludes by addressing the research questions, discussing the results and providing some vital recommendations. It was found that electronic government is a fast, reliable, accountable and transparent means of communication and interaction between governments, public institutions and citizens; electronic government is thus crucial in transforming the educational sector of Ghana toward better management of resources. It was also noted that information and communication technology (ICT) is the enabling force that helps electronic government communicate with citizens, supports e-government operations and provides efficiency, effectiveness and better services within the educational sector of Ghana.

    Download full text (pdf)
    fulltext
  • 194.
    AGBOZO, ERIC
    et al.
    Blekinge Institute of Technology, School of Management.
    YEBOAH, ERIC OMANE
    Blekinge Institute of Technology, School of Management.
    Exploring the Financial Gap for Small and Medium-Sized Enterprises (SMEs) in Ghana: A Case Study of Ghana2012Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Small and Medium Scale Enterprises (SMEs) tend by their very nature to show a far more volatile pattern of growth and earnings, with greater fluctuations, than larger companies. According to an Organisation for Economic Co-operation and Development (OECD) policy brief on SME development from 2006, financing is necessary to help Small and Medium Scale Enterprises (SMEs) set up and expand their operations, develop new products, and invest in new staff or production facilities. The study reveals that the major sources of finance for SMEs in Ghana are trade credit, bank overdrafts and bank loans. Internal sources of finance and leasing or hire purchase are minor sources to which only a few entrepreneurs resort. The availability of external financing depends on various factors, for instance the general economic outlook, access to public financial support including guarantees, the firm-specific outlook with respect to sales, profitability or business plan, the firm's own capital, the firm's credit history and the willingness of commercial banks to provide loans. Many SMEs believe that access to internal funds (for example from retained earnings and sale of assets), bank loans, equity investments in the SMEs, trade credit, and other forms of financing (for example a loan from a related company or shareholders, excluding trade credit, or a loan from family and friends), as well as leasing and factoring, will improve the profitability and development of their businesses.

    Download full text (pdf)
    FULLTEXT01
  • 195.
    Agenyi, Benjamin
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Mobile Banking and Entrepreneurship in Developing Countries:A case study of Nigeria2013Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    The purpose of this study was to identify ways in which the growth of entrepreneurial mobile banking could be accelerated in developing countries. An exploratory research method was adopted to identify the facilitators of and obstacles to entrepreneurial mobile banking. The findings reveal facilitators that include government policies and the efforts of donor agencies, stiffer competition among banks, the need for efficiency and lower cost, and telecoms' focus on customer retention, while obstacles include conservative and vague regulation, security issues, underdeveloped infrastructure, lack of interoperability and lack of a basic need for financial services. The main contribution of this study is the concise identification of the facilitators of and obstacles to entrepreneurial mobile banking, especially in developing countries. Suggestions for further study are made. The findings could be useful to policy-makers, donor agencies and other development partners in designing and directing their policy interventions.

    Download full text (pdf)
    FULLTEXT01
  • 196. Aghazadeh, Ahmad
    et al.
    Persson, G. Rutger
    Renvert, Stefan
    Blekinge Institute of Technology, School of Health Science.
    A single-centre randomized controlled clinical trial on the adjunct treatment of intra-bony defects with autogenous bone or a xenograft: results after 12 months2012In: Journal of Clinical Periodontology, ISSN 0303-6979, Vol. 39, no 7, p. 666-673Article in journal (Refereed)
    Abstract [en]

    Background: Limited evidence exists on the efficacy of regenerative treatment of peri-implantitis. Material and Methods: Subjects receiving antibiotics and surgical debridement were randomly assigned to placement of autogenous bone (AB) or a bovine-derived xenograft (BDX), with placement of a collagen membrane. The primary outcome was evidence of radiographic bone fill; the secondary outcomes included reductions of probing depth (PD), bleeding on probing (BOP) and suppuration. Results: Twenty-two subjects were included in the AB group and 23 subjects in the BDX group. Statistical analysis failed to demonstrate differences for 38/39 variables assessed at baseline. At 12 months, significantly better results were obtained in the BDX group for bone levels (p < 0.001), BOP (p = 0.004), PI (p = 0.003) and suppuration (p < 0.01). When adjusting for the number of implants treated per subject and defining a successful treatment outcome as PD ≤ 5.0 mm, no pus, no bone loss and BOP at 1/4 or fewer sites, the likelihood of defect fill was higher in the BDX group (LR: 3.2, 95% CI: 1.0–10.6, p < 0.05). Conclusions: Bovine xenograft provided more radiographic bone fill than AB. The success of both surgical regenerative procedures was limited. Decreases in PD, BOP and suppuration were observed.
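As a side note on ratio-type effect estimates like the one reported above, a 95% confidence interval is typically computed on the log scale and then exponentiated. The sketch below uses entirely hypothetical 2×2 counts (not the trial's data) and a risk ratio with a Wald-type interval; it only illustrates the arithmetic behind an interval such as 1.0–10.6, not the trial's actual analysis.

```python
import math

# Hypothetical 2x2 counts (NOT the trial's data): implants with/without
# defect fill in the xenograft (BDX) and autogenous bone (AB) groups.
bdx_fill, bdx_total = 18, 23
ab_fill, ab_total = 12, 22

# Risk ratio with a Wald-type 95% CI computed on the log scale.
p_bdx = bdx_fill / bdx_total
p_ab = ab_fill / ab_total
rr = p_bdx / p_ab
se_log = math.sqrt(1 / bdx_fill - 1 / bdx_total + 1 / ab_fill - 1 / ab_total)
lo = math.exp(math.log(rr) - 1.96 * se_log)
hi = math.exp(math.log(rr) + 1.96 * se_log)
print(f"RR = {rr:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```

An interval whose lower bound sits at 1.0, as in the abstract, corresponds to an effect that is just on the boundary of significance at the 5% level.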

  • 197.
    Aghazadeh, Ahmad
    et al.
    Tand & Implantat Specialistkliniken, SWE.
    Persson, G. Rutger
    Kristianstad University, SWE.
    Stavropoulos, Andreas
    Malmo University, SWE.
    Renvert, Stefan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Reconstructive treatment of peri-implant defects - Results after three and five years2022In: Clinical Oral Implants Research, ISSN 0905-7161, E-ISSN 1600-0501, Vol. 33, no 11, p. 1114-1124Article in journal (Refereed)
    Abstract [en]

    Objectives: The aim of this study was to assess the long-term efficacy of reconstructive treatment of peri-implantitis intraosseous defects. Material and Methods: Peri-implant intraosseous defects were augmented using either an autogenous bone graft (AB) or a bovine-derived xenograft (BDX) in combination with a collagen membrane. Maintenance was provided every third month. Results: In the AB group, 16 patients with 25 implants remained at year five. In the BDX group, 23 patients with 38 implants remained. Between baseline and year 5, bleeding on probing (BOP) and probing pocket depth (PPD) scores were reduced in both groups (p < .001). In the AB and BDX groups, mean PPD between baseline and year five was reduced by 1.7 and 2.8 mm, respectively. The difference between groups was significant (p < .001). In the AB group, the mean bone level change at implant level between baseline and years three and five was -0.2 and -0.7 mm, respectively. In the BDX group, the mean bone level change at implant level between baseline and years three and five was 1.6 and 1.6 mm, respectively. The difference between the groups was significant (p < .001). Successful treatment (no bone loss, no probing pocket depth (PPD) > 5 mm, no suppuration, maximum one implant surface with bleeding on probing (BOP) at year five) was obtained in 9/25 implants (36%) in the AB group and in 29/37 implants (78.3%) in the BDX group. Conclusions: Reconstructive surgical treatment of peri-implant defects using BDX resulted in more predictable outcomes than using autogenous bone over 5 years.

    Download full text (pdf)
    fulltext
  • 198.
    Aghazadeh, Ahmad
    et al.
    Tand & Implantat Specialistkliniken, SWE.
    Persson, Rutger G.
    Kristianstad Univ, SWE.
    Renvert, Stefan
    Blekinge Institute of Technology, Faculty of Engineering, Department of Health.
    Impact of bone defect morphology on the outcome of reconstructive treatment of peri-implantitis2020In: International Journal of Implant Dentistry, E-ISSN 2198-4034, Vol. 6, no 1, article id 33Article in journal (Refereed)
    Abstract [en]

    Objectives: To assess whether (I) the alveolar bone defect configuration at dental implants diagnosed with peri-implantitis is related to clinical parameters at the time of surgical intervention and whether (II) the outcome of surgical intervention of peri-implantitis depends on the defect configuration at the time of treatment. Materials and methods: In a prospective study, 45 individuals and 74 dental implants with ≥ 2 bone-wall defects were treated with either an autogenous bone transplant or an exogenous bone augmentation material. Defect fill was assessed at 1 year. Results: At baseline, no significant study group differences were identified. Most study implants (70.7%, n = 53) had been placed in the maxilla. Few implants were placed in molar regions. The mesial and distal crestal width at surgery was greater at 4-wall defects than at 2-wall defects (p = 0.001). Probing depths were also greater at 4-wall defects than at 2-wall defects (p = 0.01). Defect fill was correlated with initial defect depth (p < 0.001). Defect fill at 4-wall defects was significant (p < 0.05). Conclusions: (I) The buccal-lingual width of the alveolar bone crest explained the defect configuration, (II) 4-wall defects demonstrated more defect fill, and (III) deeper defects resulted in more defect fill.

  • 199.
    Aghdasi, AmirHossein
    Blekinge Institute of Technology, School of Engineering.
    Application of transmissibility measurement in estimation of modal parameters for a structure2013Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Identification of modal parameters from output-only data is studied in this work. The proposed methodology uses transmissibility functions obtained under different loading conditions to identify resonances and mode shapes. The technique is demonstrated with numerical simulations on a 2-DOF system and a cantilever beam. To underpin the simulations, a real test was performed on a beam to show the efficiency of the method. A practical application of the method is the identification of structures subjected to moving loads; this is demonstrated using an FEM model of a cantilever beam with a moving load.
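The core idea — transmissibility functions measured under different loading conditions coincide at the system's poles — can be sketched numerically for a 2-DOF system. All parameters below are illustrative assumptions, not the thesis's test setup; the sketch forms the response ratio X1/X2 for two load cases and locates the frequencies where the two curves meet.

```python
import numpy as np

# Illustrative 2-DOF mass-spring-damper; all parameters are assumptions.
M = np.diag([1.0, 1.0])
K = np.array([[2000.0, -1000.0], [-1000.0, 1000.0]])
C = 0.0005 * K  # light proportional damping

omega = np.linspace(1.0, 60.0, 6000)
T = np.zeros((2, omega.size), dtype=complex)
for j, w in enumerate(omega):
    H = np.linalg.inv(K - w**2 * M + 1j * w * C)  # receptance matrix
    # Two loading conditions: unit force at DOF 1, then at DOF 2.
    for case, f in enumerate((np.array([1.0, 0.0]), np.array([0.0, 1.0]))):
        x = H @ f
        T[case, j] = x[0] / x[1]  # transmissibility between DOF 1 and DOF 2

# Transmissibilities from different load cases coincide exactly where
# det(K - w^2 M + iwC) = 0, i.e. at the system poles, so resonances
# appear as sharp local minima of the difference between the two curves.
diff = np.abs(T[0] - T[1])
idx = np.where((diff[1:-1] < diff[:-2]) & (diff[1:-1] < diff[2:]))[0] + 1
print("identified resonances (rad/s):", omega[idx])
```

For these parameters the undamped natural frequencies are near 19.5 and 51.2 rad/s, which is where the two transmissibility curves should meet; note that no force measurement enters the ratio, which is what makes the approach output-only.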

    Download full text (pdf)
    FULLTEXT01
  • 200.
    Aghili, Shamim
    Blekinge Institute of Technology, School of Management.
    The Qualitative Self in Uganda: with the Western influences2013Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Research on self-concept within psychology has, like many other fields, excluded African people or treated Black people as “the other”. The present study stresses this theoretical problem and, through a different methodology than those hitherto used in the field, addresses the self of Ugandans as Africans to cover some of the existing gap. On the assumption that Western influence on the African continent affects the perception of the self, the self of Ugandans is investigated through deductive thematic analysis. The self, perceived to be closely connected to African culture and ways of living, is affected by imperialism and the earlier colonialism in the history of Uganda. Results in relation to previous research, limitations and further research are discussed.

    Download full text (pdf)
    FULLTEXT01