
International Journal of Sensors, Wireless Communications and Control

ISSN (Print): 2210-3279
ISSN (Online): 2210-3287

Review Article

Revisiting Feature Ranking Methods using Information-Centric and Evolutionary Approaches: Survey

Author(s): Rashmi Gandhi*, Udayan Ghose and Hardeo Kumar Thakur

Volume 12, Issue 1, 2022

Published on: 04 February, 2021

Page: [5 - 18] Pages: 14

DOI: 10.2174/2210327911666210204142857

Abstract

Feature ranking strongly influences the feature selection problem: ranking methods order features by how well they fit the given data, and thereby affect the quality of the selected feature subset. Access to the most useful features also reduces cost and improves the performance of a feature ranking algorithm. Numerous ranking methods are available in the literature. This survey explores the developments of the past 20 years in the domain and presents them in terms of relevance and the established formulations of the feature ranking problem. The latest developments are mostly based on evolutionary approaches, which broadly include variations in ranking, mutual information, entropy, mutation, parent selection, genetic algorithms, etc. For a variety of algorithms based on differential evolution, it has been observed that, although the choice of mutation operator is extremely important for feature selection, other operators can also be considered. The emphasis here is therefore on reviewing these algorithms and identifying new research directions. The general approach is first to review a rigorous collection of articles, extract the most accurate and relevant data, and then narrow down the research questions. The review is organized around these questions and proceeds in four phases: designing the review, conducting it, analyzing the results, and writing up. Threats to validity are considered alongside the research questions. Many feature ranking methods are discussed to chart further directions in feature ranking and differential evolution, and a literature survey of 93 papers examines their performance with respect to relevance, redundancy, and correlation in combination with differential evolution.
The discussion supports extending differential evolution toward integration with information-theoretic, entropy-based, and sparse-learning methods. Because differential evolution is multiobjective in nature, it can be incorporated into feature ranking problems. The survey draws on many renowned journals and is validated against the stated research questions; its conclusions can serve as starting points for several lines of research. A comprehensive view of the current understanding of the mechanisms underlying these algorithms, and of present and future directions for evolutionary computation, mutual information, and entropy in feature ranking, is complemented by a list of promising research directions. However, there are no strict rules for weighing the pros and cons of the alternative algorithms.
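To make the information-centric side of the abstract concrete, the following is a minimal illustrative sketch (not taken from the paper; the function names are ours) of ranking discrete features by their mutual information with the class label, the relevance criterion underlying many of the surveyed filter methods:

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information I(X;Y) in bits between two discrete 1-D arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability estimate
            px = np.mean(x == xv)                 # marginal of X
            py = np.mean(y == yv)                 # marginal of Y
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def rank_features(X, y):
    """Rank columns of X by decreasing relevance (MI with the label y)."""
    scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1], scores
```

A feature identical to the label scores 1 bit on a balanced binary problem, while an independent feature scores 0, so the former is ranked first. Real feature-ranking methods in the survey refine this idea with redundancy terms (e.g. max-relevance/min-redundancy) rather than scoring features in isolation.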

Keywords: Feature ranking, evolutionary algorithms, differential evolution, entropy, survey, feature selection.
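On the evolutionary side, the abstract highlights differential evolution and ranking-based mutation. The sketch below is a hedged illustration in that spirit, not the method of any one surveyed paper: a DE/rand/1/bin loop in which the base vector is chosen with probability proportional to its fitness rank, so fitter individuals guide mutation more often. All names and parameter defaults are ours.

```python
import numpy as np

def de_rank_minimize(f, bounds, pop_size=20, F=0.5, CR=0.9, iters=200, seed=0):
    """Minimize f over box bounds with DE/rand/1/bin and a rank-biased
    choice of the base vector (in the spirit of ranking-based mutation)."""
    rng = np.random.default_rng(seed)
    lo, hi = map(np.asarray, zip(*bounds))
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(iters):
        # Rank 1 = worst, pop_size = best; selection prob proportional to rank.
        order = np.argsort(-fit)
        rank = np.empty(pop_size)
        rank[order] = np.arange(1, pop_size + 1)
        prob = rank / rank.sum()
        for i in range(pop_size):
            a = rng.choice(pop_size, p=prob)               # rank-biased base vector
            b, c = rng.choice(pop_size, 2, replace=False)  # random difference pair
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                # at least one gene crosses
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                          # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

For simplicity the difference-pair indices are not forced to be distinct from the base and target, as strict DE would require. Wrapping a subset-encoding and a classifier-based fitness around this loop turns it into the wrapper feature-selection setting the survey discusses.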

