Machine Learning for Power Prediction of ASIC Digital Pre-Distorter Block
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science.
2024 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Background. Application-Specific Integrated Circuits (ASICs) are circuits designed for specific tasks, often used in telecommunications for signal handling. The Digital Pre-Distorter (DPD) corrects signal distortions before transmission. Machine Learning can be used to build models that predict how much power an ASIC will consume based on the DPD's settings, and can also help identify which settings have the largest impact on power usage. The DPD block here functions as a configurable component whose parameters can be adjusted, allowing control over the total power of the ASIC.

Objectives. The objective is to develop and evaluate Machine Learning models to predict power consumption in ASICs, identify key parameters impacting power use, and assess whether data augmentation improves the models' performance on small datasets.

Methods. An experimental approach was chosen for this thesis. Five Machine Learning (ML) models — Random Forest (RF), Support Vector Regression (SVR), XGBoost, Ridge, and Lasso regression — were built and evaluated using Root Mean Squared Error (RMSE), R-squared value, and Mean Absolute Error (MAE) as performance metrics. SHapley Additive exPlanations (SHAP) values were used to identify the most significant features. For data augmentation, the best-performing model underwent self-training with confidence estimation and was re-evaluated using the same metrics.
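The three evaluation metrics named above have standard definitions; as a minimal illustrative sketch (not the thesis's own code), they can be computed as follows:

```python
import math

def rmse(y_true, y_pred):
    """Root Mean Squared Error: sqrt of the mean squared residual."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean Absolute Error: mean of the absolute residuals."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """R-squared: 1 minus (residual sum of squares / total sum of squares)."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

A perfect prediction gives RMSE = 0, MAE = 0, and R-squared = 1; predicting the mean of the targets gives R-squared = 0.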

Results. The models performed as follows, from best to worst: XGBoost Regressor, followed by Ridge Regression, Lasso Regression, and then Random Forest Regressor and Support Vector Regressor, which had the lowest performance. Using SHAP values, the most significant features were identified. After applying data augmentation, the R-squared value for the XGBoost model decreased slightly from 0.727 to 0.715. The best-performing XGBoost model is used in the prototype interface to predict power for lower memory resources, and the best available performance from the DPD performance dataset is selected for that particular prediction.
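Self-training with confidence estimation, as used in the methods above for data augmentation, can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: it uses a closed-form ridge regressor and bootstrap-ensemble prediction variance as the confidence measure, both of which are assumptions for the sake of the example.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge regression: w = (X^T X + lam*I)^(-1) X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def self_train(X_lab, y_lab, X_unlab, rounds=3, n_boot=10, conf_thresh=0.1, seed=0):
    """Self-training sketch: pseudo-label unlabeled points whose
    bootstrap-ensemble prediction spread is below conf_thresh, retrain."""
    rng = np.random.default_rng(seed)
    X, y, pool = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    for _ in range(rounds):
        if len(pool) == 0:
            break
        # Bootstrap ensemble for confidence estimation
        preds = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(X), len(X))
            preds.append(pool @ ridge_fit(X[idx], y[idx]))
        preds = np.stack(preds)
        mean_pred, std_pred = preds.mean(axis=0), preds.std(axis=0)
        confident = std_pred < conf_thresh  # low spread = high confidence
        if not confident.any():
            break
        # Augment the labeled set with confident pseudo-labels
        X = np.vstack([X, pool[confident]])
        y = np.concatenate([y, mean_pred[confident]])
        pool = pool[~confident]
    return ridge_fit(X, y)
```

The design choice illustrated here is that only pseudo-labels the ensemble agrees on are added, which limits the noise that augmentation can inject — consistent with the small observed effect on the metrics.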

Conclusions. XGBoost Regressor emerged as the best-performing model. Features related to Sleep Time, Activity, Antenna Usage, and Memory are identified as the most influential. However, the small dataset size may limit the model's generalizability, and data augmentation had only a minimal impact on performance metrics. Acquiring more data could lead to better, more reliable results.

Place, publisher, year, edition, pages
2024, p. 49
Keywords [en]
Machine Learning, Application-Specific Integrated Circuits (ASICs), Self-training, Confidence Estimation
National Category
Signal Processing
Identifiers
URN: urn:nbn:se:bth-27149
OAI: oai:DiVA.org:bth-27149
DiVA id: diva2:1915620
External cooperation
Ericsson
Subject / course
DV2572 Master's Thesis in Computer Science
Educational program
DVADA Master Qualification Plan in Computer Science
Available from: 2024-12-03. Created: 2024-11-24. Last updated: 2025-09-30. Bibliographically approved.

Open Access in DiVA

File name: FULLTEXT01.pdf
File size: 1405 kB
Checksum (SHA-512): ab51f3238f754f614443bb5a366bcbbd886c16e7b5f185ff43fa8d0a763bd897c473b940ba8a355d6ee2dc4fd245811d6378678cfa4785d506cccde265d188db
Type: fulltext
Mimetype: application/pdf
