The Chinese Journal of Artificial Intelligence

ISSN (Print): 2666-7827
ISSN (Online): 2666-7835

Research Article

Swarmed Grey Wolf Optimizer

Author(s): Sumita Gulati* and Ashok Pal

Volume 1, Issue 1, 2022

Published on: 13 April, 2022

Article ID: e040322201742 Pages: 13

DOI: 10.2174/2666782701666220304140720

Abstract

Background: The Particle Swarm Optimization (PSO) algorithm is among the most favoured optimization algorithms and is often employed in hybrid procedures, owing to its simplicity, small number of parameters, convergence speed, and ability to search for global optima. PSO retains memory, and the collaborative interactions within the swarm enhance the search procedure. Its high exploitation ability, i.e., its capacity to locate the best solution within a limited region of the search domain, gives PSO an edge over other optimization algorithms. Its low exploration ability, however, means the search domain may not be sampled adequately, which increases the chance of discarding a region containing high-quality solutions. A proper balance between exploration and exploitation is needed in the course of selecting the best solution: high exploitation capacity causes PSO to become trapped in local minima when its initial location is far from the global minimum.
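The exploitation behaviour described above comes from the standard PSO velocity update, in which each particle is pulled toward its personal best and the swarm's global best. The following sketch (in Python, for illustration; the study itself used Visual Basic) shows that update with the parameter values reported later in the abstract (w = 0.7, c1 = c2 = 2, population size 30, 30 iterations):

```python
import random

def pso_minimize(f, dim, bounds, w=0.7, c1=2.0, c2=2.0,
                 swarm_size=30, iterations=30, seed=0):
    """Standard PSO (Kennedy & Eberhart) for minimizing f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(swarm_size), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia keeps the old direction; the cognitive (pbest) and
                # social (gbest) terms pull the particle toward known good
                # regions -- the source of PSO's strong exploitation.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Because every particle is attracted to the same global best, a swarm that starts far from the global minimum can collapse onto a local one, which is exactly the drawback the hybrid targets.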

Objectives: The intent of this study is to remedy this drawback of PSO, namely its tendency to become trapped in local minima. To exploit the potential of Particle Swarm Optimization (PSO) while preventing it from getting trapped in local minima, an algorithm with an adequate exploration capacity is required.

Methods: For this purpose, we utilized the recently developed metaheuristic Grey Wolf Optimizer (GWO), which emulates the searching and hunting techniques of grey wolves. In our approach, GWO assists PSO in a manner that unites their strengths and lessens their weaknesses. The proposed hybrid has two driving parameters that can be adjusted to assign preference to PSO or GWO.
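The abstract does not specify how the two methods are coupled, so the sketch below is only a plausible illustration, not the authors' SGWO: a single hypothetical preference parameter `pref` stands in for the hybrid's driving parameters and decides, per dimension, whether a particle takes a PSO velocity step (exploitation) or a GWO encircling step toward the three best wolves (exploration):

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def hybrid_minimize(f, dim=2, lo=-5.0, hi=5.0, pop=30, iters=30,
                    pref=0.5, w=0.7, c=2.0, seed=1):
    """Hypothetical PSO/GWO hybrid sketch (NOT the paper's exact SGWO)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    vel = [[0.0] * dim for _ in range(pop)]
    pbest = [p[:] for p in pos]
    for t in range(iters):
        a = 2.0 * (1 - t / iters)        # GWO control parameter, 2 -> 0
        ranked = sorted(pos, key=f)
        leaders = ranked[:3]             # alpha, beta, delta wolves
        gbest = ranked[0]
        for i in range(pop):
            for d in range(dim):
                if rng.random() < pref:  # PSO velocity step (exploitation)
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c * r1 * (pbest[i][d] - pos[i][d])
                                 + c * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                else:                    # GWO encircling step (exploration)
                    pulls = []
                    for ldr in leaders:
                        A = a * (2 * rng.random() - 1)   # |A|>1 explores
                        D = abs(2 * rng.random() * ldr[d] - pos[i][d])
                        pulls.append(ldr[d] - A * D)
                    pos[i][d] = sum(pulls) / 3.0
                pos[i][d] = min(hi, max(lo, pos[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
    best = min(pbest, key=f)
    return best, f(best)
```

Raising `pref` favours PSO-style convergence; lowering it favours GWO-style exploration, mirroring the role the abstract ascribes to the hybrid's two driving parameters.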

Results: To assess the performance of the proposed hybrid, it was compared against the PSO and GWO methods on eleven benchmark functions comprising different unimodal and multimodal functions. The PSO, GWO, and SGWO pseudocodes were implemented in Visual Basic. The functional parameters of PSO and GWO were set as w = 0.7, c1 = c2 = 2, population size = 30, and number of iterations = 30. Each experiment was repeated 25 times for each method and each benchmark function. The methods were compared with regard to their best and worst values as well as their average values and standard deviations. The results revealed that, in terms of average values and standard deviations, the hybrid SGWO notably outperformed both PSO and GWO.
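The reporting protocol (25 independent runs per method and function, summarized by best, worst, mean, and standard deviation) can be reproduced with a few lines; here, in an illustrative Python sketch, a simple random search stands in for PSO/GWO/SGWO, and any optimizer returning a final objective value could be slotted in instead:

```python
import random
import statistics

def random_search(f, dim, lo, hi, evals, rng):
    """Stand-in optimizer; PSO, GWO, or SGWO would slot in here."""
    best = min((tuple(rng.uniform(lo, hi) for _ in range(dim))
                for _ in range(evals)), key=f)
    return f(best)

def benchmark(f, dim=2, lo=-5.0, hi=5.0, repeats=25, evals=30 * 30, seed=0):
    """25 independent runs, summarized as best / worst / mean / std,
    matching the comparison criteria used in the study."""
    rng = random.Random(seed)
    results = [random_search(f, dim, lo, hi, evals, rng)
               for _ in range(repeats)]
    return {"best": min(results), "worst": max(results),
            "mean": statistics.mean(results),
            "std": statistics.stdev(results)}

stats = benchmark(lambda x: sum(v * v for v in x))  # sphere (unimodal)
```

Reporting the mean and standard deviation across repeats, rather than a single run, is what makes stochastic optimizers comparable: a lower standard deviation indicates a more reliable method.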

Conclusion: The experimental outcomes show that the proposed hybrid surpasses both PSO and GWO in search ability. Although the SGWO algorithm improves result quality, it also raises computational complexity; lowering this complexity is left as future work. Moreover, we intend to apply the proposed hybrid to water quality estimation and prediction.

Keywords: Particle swarm optimizer (PSO), grey wolf optimizer (GWO), exploitation, exploration, swarmed grey wolf optimizer (SGWO), unimodal function.

