Boosting the training of neural networks through hybrid metaheuristics

Mohammed Azmi Al-Betar, Mohammed A. Awadallah, Iyad Abu Doush, Osama Ahmad Alomari, Ammar Kamal Abasi, Sharif Naser Makhadmeh, Zaid Abdi Alkareem Alyasseri

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, the learning process of the multilayer perceptron (MLP) neural network is boosted using hybrid metaheuristic optimization algorithms. Normally, the learning process of an MLP requires suitable settings of its weight and bias parameters. In the original version of MLP, the gradient descent algorithm is used as the learner, and it suffers from two chronic problems: local minima and slow convergence. In this paper, six versions of memetic algorithms (MAs) are proposed to replace the gradient descent learning mechanism of MLP, where adaptive β-hill climbing (AβHC) is used as a local search algorithm and hybridized with six population-based metaheuristics: the hybrid flower pollination algorithm, hybrid salp swarm algorithm, hybrid crow search algorithm, hybrid grey wolf optimization (HGWO), hybrid particle swarm optimization, and hybrid JAYA algorithm. This is done to show the effect of the proposed MA versions on the performance of MLP. To evaluate the proposed MA versions for MLP, 15 classification benchmark problems of different sizes and complexities are used. The AβHC algorithm is invoked in the improvement loop of each MA version with a probability controlled by the Br parameter, which is investigated to monitor its effect on the behavior of the proposed MA versions. The Br setting that obtains the most promising results is then used to configure the hybrid MAs. The results show that the proposed MA versions outperform the original algorithms. Moreover, HGWO outperforms all other MA versions on almost all the datasets. In a nutshell, MAs are a good choice for training MLPs to produce results with high accuracy.

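As a rough illustration of the memetic scheme described in the abstract (a minimal sketch, not the authors' implementation), the Python code below trains a one-hidden-layer MLP by wrapping a plain particle swarm optimizer around a β-hill-climbing-style local refinement that is invoked with probability Br. The toy dataset, network size, and parameter values (bandwidth, beta, Br) are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (assumed; the paper uses 15 benchmark datasets)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# One-hidden-layer MLP; weights and biases are flattened into a single vector
n_in, n_hidden, n_out = 4, 6, 1
dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

def unpack(w):
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def fitness(w):
    """Classification error of the MLP encoded by weight vector w (lower is better)."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2))).ravel()
    return np.mean((p > 0.5) != y)

def beta_hill_climb(w, iters=20, bandwidth=0.1, beta=0.05):
    """Local refinement: a neighbourhood move plus a beta operator that resets
    each dimension to a random value with probability beta (AβHC-style)."""
    best, best_f = w.copy(), fitness(w)
    for _ in range(iters):
        cand = best + rng.uniform(-bandwidth, bandwidth, dim)
        mask = rng.random(dim) < beta
        cand[mask] = rng.uniform(-1, 1, mask.sum())
        f = fitness(cand)
        if f <= best_f:
            best, best_f = cand, f
    return best, best_f

# Memetic loop: a plain PSO stands in for any of the six population-based metaheuristics
pop_size, iters, Br = 20, 100, 0.1
pos = rng.uniform(-1, 1, (pop_size, dim))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
g = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((pop_size, dim)), rng.random((pop_size, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = pos + vel
    for i in range(pop_size):
        if rng.random() < Br:          # invoke the local search with probability Br
            pos[i], f = beta_hill_climb(pos[i])
        else:
            f = fitness(pos[i])
        if f < pbest_f[i]:
            pbest[i], pbest_f[i] = pos[i].copy(), f
    g = pbest[np.argmin(pbest_f)].copy()

print("best training error:", pbest_f.min())
```

Any of the six metaheuristics named in the paper could replace the PSO update here; only the population-update rule changes, while the fitness function and the probabilistic call to the local search stay the same.
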
Original language: English
Pages (from-to): 1821-1843
Number of pages: 23
Journal: Cluster Computing
Volume: 26
Issue number: 3
DOIs
State: Published - 2022

Keywords

  • Hybrid metaheuristics
  • Multilayer perceptron neural network
  • Optimization
  • β-hill climbing
