TY - JOUR
T1 - Boosting the training of neural networks through hybrid metaheuristics
AU - Al-Betar, Mohammed Azmi
AU - Awadallah, Mohammed A.
AU - Doush, Iyad Abu
AU - Alomari, Osama Ahmad
AU - Abasi, Ammar Kamal
AU - Makhadmeh, Sharif Naser
AU - Alyasseri, Zaid Abdi Alkareem
N1 - Al-Betar, M. A., Awadallah, M. A., Doush, I. A., Alomari, O. A., Abasi, A. K., Makhadmeh, S. N., & Alyasseri, Z. A. A. (2022). Boosting the training of neural networks through hybrid metaheuristics. Cluster Computing. https://doi.org/10.1007/s10586-022-03708-x
PY - 2022
Y1 - 2022
N2 - In this paper, the learning process of the multilayer perceptron (MLP) neural network is boosted using hybrid metaheuristic optimization algorithms. Normally, the learning process in MLP requires suitable settings of its weight and bias parameters. In the original version of MLP, the gradient descent algorithm is used as the learner, which suffers from two chronic problems: local minima and slow convergence. In this paper, six versions of memetic algorithms (MAs) are proposed to replace the gradient descent learning mechanism of MLP, where adaptive β-hill climbing (AβHC), as a local search algorithm, is hybridized with six population-based metaheuristics, yielding the hybrid flower pollination algorithm, hybrid salp swarm algorithm, hybrid crow search algorithm, hybrid grey wolf optimization (HGWO), hybrid particle swarm optimization, and hybrid JAYA algorithm. This is to show the effect of the proposed MA versions on the performance of MLP. To evaluate the proposed MA versions for MLP, 15 classification benchmark problems of different sizes and complexities are used. The AβHC algorithm is invoked in the improvement loop of each MA version with a probability given by the Br parameter, which is investigated to monitor its effect on the behavior of the proposed MA versions. The Br setting that obtains the most promising results is then used to configure the hybrid MAs. The results show that the proposed MA versions outperform the original algorithms. Moreover, HGWO outperforms all other MA versions on almost all the datasets. In a nutshell, MAs are a good choice for training MLP to produce results with high accuracy.
AB - In this paper, the learning process of the multilayer perceptron (MLP) neural network is boosted using hybrid metaheuristic optimization algorithms. Normally, the learning process in MLP requires suitable settings of its weight and bias parameters. In the original version of MLP, the gradient descent algorithm is used as the learner, which suffers from two chronic problems: local minima and slow convergence. In this paper, six versions of memetic algorithms (MAs) are proposed to replace the gradient descent learning mechanism of MLP, where adaptive β-hill climbing (AβHC), as a local search algorithm, is hybridized with six population-based metaheuristics, yielding the hybrid flower pollination algorithm, hybrid salp swarm algorithm, hybrid crow search algorithm, hybrid grey wolf optimization (HGWO), hybrid particle swarm optimization, and hybrid JAYA algorithm. This is to show the effect of the proposed MA versions on the performance of MLP. To evaluate the proposed MA versions for MLP, 15 classification benchmark problems of different sizes and complexities are used. The AβHC algorithm is invoked in the improvement loop of each MA version with a probability given by the Br parameter, which is investigated to monitor its effect on the behavior of the proposed MA versions. The Br setting that obtains the most promising results is then used to configure the hybrid MAs. The results show that the proposed MA versions outperform the original algorithms. Moreover, HGWO outperforms all other MA versions on almost all the datasets. In a nutshell, MAs are a good choice for training MLP to produce results with high accuracy.
KW - Hybrid metaheuristics
KW - Multilayer perceptron neural network
KW - Optimization
KW - β-hill climbing
UR - http://www.scopus.com/inward/record.url?scp=85137070714&partnerID=8YFLogxK
U2 - 10.1007/s10586-022-03708-x
DO - 10.1007/s10586-022-03708-x
M3 - Article
SN - 1386-7857
VL - 26
SP - 1821
EP - 1843
JO - Cluster Computing
JF - Cluster Computing
IS - 3
ER -
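
Note: the following is an illustrative sketch, not code from the paper, of the memetic training scheme the abstract describes: a population-based global search over the MLP weight and bias vector, with a β-hill-climbing-style local search invoked with probability Br. The toy network size, the simplified evolutionary global step, and all function and parameter names (mlp_loss, beta_hill_climb, memetic_train, n_pop, generations) are assumptions made for illustration only; the paper hybridizes AβHC with six specific metaheuristics rather than the generic placeholder step used here.

# Illustrative sketch (assumptions labeled above): a generic memetic loop for
# training an MLP, where a local search is invoked with probability Br.
import numpy as np

rng = np.random.default_rng(0)

def mlp_loss(w, X, y, n_hidden=5):
    """Mean squared error of a 1-hidden-layer MLP whose weights are flattened in w."""
    n_in = X.shape[1]
    w1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden : n_in * n_hidden + n_hidden]
    w2 = w[n_in * n_hidden + n_hidden : n_in * n_hidden + 2 * n_hidden]
    b2 = w[-1]
    h = np.tanh(X @ w1 + b1)
    yhat = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))
    return np.mean((yhat - y) ** 2)

def beta_hill_climb(w, loss, X, y, iters=20, eta=0.05, beta=0.1):
    """Simplified AβHC-style local search: a small neighbourhood move plus random
    resetting of each dimension with probability beta (sketch, not the paper's exact operator)."""
    best, best_f = w.copy(), loss(w, X, y)
    for _ in range(iters):
        cand = best + eta * rng.standard_normal(best.size)   # neighbourhood move
        mask = rng.random(best.size) < beta                   # beta (random reset) operator
        cand[mask] = rng.uniform(-1, 1, mask.sum())
        f = loss(cand, X, y)
        if f < best_f:
            best, best_f = cand, f
    return best, best_f

def memetic_train(X, y, n_pop=20, Br=0.3, generations=100):
    """Generic memetic loop: a placeholder evolutionary global step, then local
    search with probability Br; the paper instead uses six named metaheuristics."""
    dim = X.shape[1] * 5 + 5 + 5 + 1                          # matches mlp_loss with n_hidden=5
    pop = rng.uniform(-1, 1, (n_pop, dim))
    fit = np.array([mlp_loss(w, X, y) for w in pop])
    for _ in range(generations):
        for i in range(n_pop):
            a, b = pop[rng.integers(n_pop)], pop[rng.integers(n_pop)]
            child = pop[i] + 0.5 * (a - b)                    # placeholder global move
            if rng.random() < Br:                             # invoke local search with probability Br
                child, f = beta_hill_climb(child, mlp_loss, X, y)
            else:
                f = mlp_loss(child, X, y)
            if f < fit[i]:                                    # greedy replacement
                pop[i], fit[i] = child, f
    best = int(np.argmin(fit))
    return pop[best], fit[best]

if __name__ == "__main__":
    # Toy binary classification data just to exercise the loop.
    X = rng.standard_normal((80, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    w, f = memetic_train(X, y)
    print("final training MSE:", round(f, 4))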