AUTHORS: Jason Teo, Kim-On Chin, Shaliza Wahab, Azali Saudi, Siti Hasnah Tanalol
ABSTRACT: Differential Evolution (DE) is currently one of the most popular evolutionary global optimization algorithms: it is simple to understand and implement, converges quickly, and is robust across a wide range of problems. Although it is classed as an evolutionary algorithm (EA), its genetic operations are atypical of that class. EAs typically perform crossover followed by mutation, with both operations governed by an explicitly tunable rate. In DE, however, the mutation operation is conducted before the crossover operation. Moreover, although DE has a crossover rate, it has no mutation rate; instead, it mutates every gene in the chromosome, effectively applying a 100% mutation rate. Following this observation, we experimented with a novel version of DE in which the crossover and mutation operations are reversed to mimic typical EAs, and in which an explicitly tunable mutation rate is added. We found that this simple and intuitive, yet previously unexplored, modification improves DE's performance, particularly in more complex search spaces with highly non-uniform fitness landscapes. Non-parametric tests show that the improvements are statistically significant.
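The operator-order contrast described in the abstract can be sketched in a few lines. The first function below is the standard DE/rand/1/bin trial-vector construction (every gene is perturbed, then binomial crossover at rate CR); the second is a hypothetical reading of the reversed variant, where recombination happens first and only a tunable fraction MR of genes is then mutated. Since only the abstract is available here, the exact operator details of the reversed scheme (donor choice, form of the mutation step) are assumptions for illustration, not the authors' definitive algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

def trial_classic(pop, i, F=0.5, CR=0.9):
    """Classic DE/rand/1/bin: mutation first (implicitly applied to ALL genes),
    then binomial crossover with the target vector at rate CR."""
    n, d = pop.shape
    r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])   # every gene is perturbed: 100% mutation
    cross = rng.random(d) < CR
    cross[rng.integers(d)] = True                # guarantee at least one mutant gene
    return np.where(cross, mutant, pop[i])

def trial_reversed(pop, i, F=0.5, CR=0.9, MR=0.3):
    """Hypothetical reversed order sketched from the abstract: crossover with a
    donor first, then mutate only a tunable fraction MR of genes, giving DE an
    explicit mutation rate like a typical EA."""
    n, d = pop.shape
    r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
    cross = rng.random(d) < CR
    trial = np.where(cross, pop[r1], pop[i])     # recombine target and donor first
    mutate = rng.random(d) < MR                  # explicit, tunable mutation rate
    return np.where(mutate, trial + F * (pop[r2] - pop[r3]), trial)

pop = rng.uniform(-5.0, 5.0, size=(10, 4))
print(trial_classic(pop, 0))
print(trial_reversed(pop, 0))
```

In both cases the trial vector would replace the target only if its fitness is no worse, as in standard DE selection; that selection step is unchanged by the proposed modification.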
KEYWORDS: Global Optimization; Differential Evolution; Heuristic Search; Evolutionary Optimization; Genetic Operations.
REFERENCES:
[1] Janez Brest and Mirjam Sepesy Maučec. Self-adaptive differential evolution algorithm using population size reduction and three strategies. Soft Computing, 15(11):2157–2174, 2011.
[2] Uday K. Chakraborty. Advances in Differential Evolution. Springer Publishing Company, Incorporated, 1 edition, 2008.
[3] S. Das and P.N. Suganthan. Differential evolution: A survey of the state-of-the-art. IEEE Transactions on Evolutionary Computation, 15(1):4–31, 2011.
[4] Joaquín Derrac, Salvador García, Daniel Molina, and Francisco Herrera. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation, 1(1):3–18, 2011.
[5] Larry J. Eshelman. The CHC adaptive search algorithm: How to have safe search when engaging in nontraditional genetic recombination. In Foundations of Genetic Algorithms, volume 1, pages 265–283. Elsevier, 1991.
[6] Larry J. Eshelman and J. David Schaffer. Real-coded genetic algorithms and interval-schemata. In L. Darrell Whitley, editor, Foundations of Genetic Algorithms, volume 2, pages 187–202. Elsevier, 1993.
[7] Qinqin Fan and Xuefeng Yan. Self-adaptive differential evolution algorithm with discrete mutation control parameters. Expert Systems with Applications, 42(3):1551–1572, 2015.
[8] C. Fernandes and A. Rosa. A study of nonrandom matching and varying population size in genetic algorithm using a royal road function. In Proceedings of the 2001 Congress on Evolutionary Computation, pages 60–66, 2001.
[9] Shu-Mei Guo, Chin-Chang Yang, Hsin-Yu Chang, and Jason Sheng-Hong Tsai. Constraint-activated differential evolution for constrained min-max optimization problems: Theory and methodology. Expert Systems with Applications, 42(3):1626–1636, 2015.
[10] J. Kennedy and R. Eberhart. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, volume 4, pages 1942–1948, November 1995.
[11] Xiaofen Lu, Ke Tang, Bernhard Sendhoff, and Xin Yao. A new self-adaptation scheme for differential evolution. Neurocomputing, 146:2–16, 2014. Special issue: Bridging Machine Learning and Evolutionary Computation (BMLEC).
[12] R. Mallipeddi, P.N. Suganthan, Q.K. Pan, and M.F. Tasgetiren. Differential evolution algorithm with ensemble of parameters and mutation strategies. Applied Soft Computing, 11(2):1679–1696, 2011. Special issue: The Impact of Soft Computing for the Progress of Artificial Intelligence.
[13] Ferrante Neri and Ville Tirronen. Recent advances in differential evolution: a survey and experimental analysis. Artificial Intelligence Review, 33(1-2):61–106, 2010.
[14] Filippo Neri. Cooperative concept learning by means of a distributed GA. In GECCO 2002: Proceedings of the Genetic and Evolutionary Computation Conference, New York, USA, 9-13 July 2002, pages 953–956, 2002.
[15] Filippo Neri. Relational concept learning by cooperative evolution. ACM Journal of Experimental Algorithmics, 7:12, 2002.
[16] Filippo Neri and Lorenza Saitta. Analysis of genetic algorithms evolution under pure selection. In Proceedings of the 6th International Conference on Genetic Algorithms, Pittsburgh, PA, USA, July 15-19, 1995, pages 32–41, 1995.
[17] Filippo Neri and Lorenza Saitta. Exploring the power of genetic search in learning symbolic classifiers. IEEE Trans. Pattern Anal. Mach. Intell., 18(11):1135–1141, 1996.
[18] Kenneth Price, Rainer M. Storn, and Jouni A. Lampinen. Differential Evolution: A Practical Approach to Global Optimization (Natural Computing Series). Springer-Verlag New York, Inc., Secaucus, NJ, USA, 2005.
[19] R. Storn and K. Price. Differential evolution: A simple and efficient adaptive scheme for global optimization over continuous spaces. Technical Report TR-95-012, International Computer Science Institute, Berkeley, 1995.
[20] Ponnuthurai N. Suganthan, Nikolaus Hansen, Jing J. Liang, Kalyanmoy Deb, Y.-P. Chen, Anne Auger, and S. Tiwari. Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization. Technical report, Nanyang Technological University, Singapore, May 2005, and KanGAL Report 2005005, IIT Kanpur, India, 2005.
[21] Ming Yang, Zhihua Cai, Changhe Li, and Jing Guan. An improved adaptive differential evolution algorithm with population adaptation. In Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation, pages 145–152. ACM, 2013.