Recent Advances in Computer Science and Communications

ISSN (Print): 2666-2558
ISSN (Online): 2666-2566

Review Article

Literature Review on Development of Feature Selection and Learning Mechanism for Fuzzy Rule-Based System

Author(s): Ankur Kumar* and Avinash Kaur*

Volume 16, Issue 4, 2023

Published on: 07 November, 2022

Article ID: e230822207948 Pages: 21

DOI: 10.2174/2666255816666220823163913

Abstract

This research studies a fuzzy system with an improved rule base. The rule base is a central component of any fuzzy inference system, and the number of rules depends on the number of input features; choosing an optimal subset of features is called feature selection. All features (parameters) contribute to the system's input, but they affect performance to different degrees, and some contribute nothing positive across the classes of a classifier. A reduced feature set, chosen according to the objective to be achieved, requires fewer training rules and thereby improves the accuracy of the system. Learning is an important mechanism for automating fuzzy systems. The overall purpose of the research is to design a general fuzzy expert system with an improved trade-off between interpretability and accuracy, by improving the feature selection and learning-mechanism processes through nature-inspired techniques or by devising new methodologies for the same.
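The link between feature count and rule-base size can be made concrete: in a grid-partition fuzzy system, n features each covered by m membership functions yield up to m^n candidate rules, so pruning features shrinks the rule base exponentially. The sketch below (an illustration under stated assumptions, not the authors' method) ranks features by a simple filter criterion, absolute Pearson correlation with the class label, and shows the resulting rule-count reduction; all names and the toy data are hypothetical.

```python
# Minimal filter-style feature selection sketch (illustrative only).
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy) if vx and vy else 0.0

def select_features(X, y, k):
    """Rank feature columns by |corr(feature, label)| and keep the top k."""
    n_cols = len(X[0])
    scores = [abs(pearson([row[j] for row in X], y)) for j in range(n_cols)]
    ranked = sorted(range(n_cols), key=lambda j: -scores[j])
    return sorted(ranked[:k])

def rule_count(n_features, n_mfs=3):
    """Candidate rules in a full grid partition with n_mfs fuzzy sets per feature."""
    return n_mfs ** n_features

# Toy data: feature 0 tracks the label; features 1 and 2 are noise.
X = [[1, 5, 9], [2, 1, 7], [3, 6, 2], [4, 2, 8], [5, 7, 1], [6, 3, 6]]
y = [0, 0, 0, 1, 1, 1]

kept = select_features(X, y, k=1)
print(kept)                                         # [0]
print(rule_count(3), "->", rule_count(len(kept)))   # 27 -> 3
```

With three membership functions per feature, dropping from three features to one cuts the candidate rule base from 27 rules to 3, which is the interpretability gain the abstract describes; in practice the ranking criterion would be one of the filter, wrapper, or hybrid methods the review surveys.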

Keywords: Feature Selection, Filter, Wrapper, Hybrid, Graph Based, SVM.

© 2024 Bentham Science Publishers