Novel hybrid success history intelligent optimizer with Gaussian transformation: application in CNN hyperparameter tuning
University of Petra, Jordan.
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-6309-2892
Tartu University, Estonia.
Sultan Qaboos University, Oman.
2024 (English). In: Cluster Computing, ISSN 1386-7857, E-ISSN 1573-7543, Vol. 27, no 3, p. 3717-3739. Article in journal (Refereed). Published.
Abstract [en]

This research proposes a novel Hybrid Success History Intelligent Optimizer with Gaussian Transformation (SHIOGT) for solving optimization problems of varying complexity and for hyperparameter tuning of Convolutional Neural Networks (CNNs). The SHIOGT algorithm is designed to balance the exploration and exploitation phases by adding a Gaussian Transformation to the original Success History Intelligent Optimizer (SHIO). The Gaussian Transformation enhances solution diversity and enables SHIO to avoid local optima. SHIOGT also demonstrates robustness and adaptability by dynamically adjusting its search strategy based on problem characteristics. Furthermore, combining the Gaussian Transformation with SHIO facilitates faster convergence, accelerating the discovery of optimal or near-optimal solutions. The hybridization of the two techniques also produces a synergistic effect, enabling SHIOGT to overcome the limitations of each individual component and achieve superior performance in hyperparameter optimization tasks. SHIOGT was thoroughly assessed against an array of benchmark functions of varying complexity, demonstrating its ability to efficiently locate optimal or near-optimal solutions across different problem categories; its robustness in tackling multimodal and deceptive landscapes and high-dimensional search spaces was particularly notable. SHIOGT was benchmarked on 43 challenging optimization problems and compared with state-of-the-art algorithms. The algorithm was then applied to the domain of deep learning, with a case study focusing on hyperparameter tuning of CNNs. Given SHIOGT's intelligent exploration-exploitation balance, we hypothesized that it could effectively optimize CNN hyperparameters. We evaluated its performance across a variety of datasets, including MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100, with the aim of optimizing CNN model hyperparameters. The results show an accuracy of 98% on the MNIST dataset; the algorithm likewise achieved 92% accuracy on Fashion-MNIST, 76% on CIFAR-10, and 70% on CIFAR-100, underscoring its effectiveness across diverse datasets. © 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
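
The core idea described in the abstract (a Gaussian perturbation layered on a success-history-guided, population-based search) can be illustrated by a minimal, hypothetical sketch. This is not the authors' SHIOGT implementation: the toy objective, bounds, decay schedule, and update rule are illustrative assumptions only, with a sphere function standing in for the CNN hyperparameter evaluation used in the paper.

```python
# Hypothetical sketch of the idea outlined in the abstract: a population-based
# search that pulls candidates toward the best solution found so far
# (exploitation) while a Gaussian transformation adds diversity (exploration).
# NOT the authors' SHIOGT code; objective, bounds, and schedules are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Placeholder fitness: sphere function standing in for, e.g., CNN
    # validation loss as a function of continuously encoded hyperparameters.
    return float(np.sum(x ** 2))

dim, pop_size, iters = 5, 20, 200
lo, hi = -5.0, 5.0
pop = rng.uniform(lo, hi, size=(pop_size, dim))
fitness = np.array([objective(p) for p in pop])

for t in range(iters):
    best = pop[np.argmin(fitness)]
    # Gaussian perturbation whose scale decays over the run, so early
    # iterations explore broadly and late iterations refine locally.
    sigma = 0.1 * (hi - lo) * (1.0 - t / iters)
    trial = pop + rng.uniform(0.0, 1.0, size=pop.shape) * (best - pop) \
                + rng.normal(0.0, sigma, size=pop.shape)
    trial = np.clip(trial, lo, hi)
    trial_fit = np.array([objective(p) for p in trial])
    improved = trial_fit < fitness          # greedy, success-based replacement
    pop[improved], fitness[improved] = trial[improved], trial_fit[improved]

print("best fitness:", fitness.min())
```

In an actual hyperparameter-tuning setting, the objective would train a CNN with the decoded hyperparameters (e.g., learning rate, filter counts, dropout rate) and return a validation metric; that loop is omitted here to keep the sketch self-contained.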

Place, publisher, year, edition, pages
Springer, 2024. Vol. 27, no 3, p. 3717-3739
Keywords [en]
Differential evolution, Gaussian transformation, Hyperparameter optimization, Success history intelligent optimizer, Convolutional neural networks, Deep learning, Evolutionary algorithms, Optimal systems, Optimization, Convolutional neural network, Hyper-parameter, Hyper-parameter optimizations, Near-optimal solutions, Optimization problems, Optimizers, Transformation algorithm, Gaussian distribution
National Category
Computer Systems
Identifiers
URN: urn:nbn:se:bth-25624
DOI: 10.1007/s10586-023-04161-0
ISI: 001096928400001
Scopus ID: 2-s2.0-85175864629
OAI: oai:DiVA.org:bth-25624
DiVA, id: diva2:1812828
Available from: 2023-11-17. Created: 2023-11-17. Last updated: 2024-06-12. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Alawadi, Sadi
