Current Medical Imaging

ISSN (Print): 1573-4056
ISSN (Online): 1875-6603

Research Article

Random Global and Local Optimal Search Algorithm Based Subset Generation for Diagnosis of Cancer

Author(s): Loganathan Meenachi * and Srinivasan Ramakrishnan

Volume 16, Issue 3, 2020

Page: [249 - 261] Pages: 13

DOI: 10.2174/1573405614666180720152838

Abstract

Background: Data mining algorithms are extensively used to classify data, and predicting disease with minimal computation time plays a vital role in this task.

Objectives: The aim of this paper is to develop a classification model from reduced features and instances.

Methods: In this paper we propose four search algorithms for feature selection. The first, the Random Global Optimal (RGO) search algorithm, searches for the continuous, global optimal subset of features from a random population. The second, the Global and Local Optimal (GLO) search algorithm, searches for the global and local optimal subset of features from the population. The third, the Random Local Optimal (RLO) search algorithm, generates a random, local optimal subset of features from a random population. Finally, the Random Global and Local Optimal (RGLO) search algorithm searches for the continuous, global and local optimal subset of features from a random population, combining the properties of the first three algorithms. The feature subsets generated by the four search algorithms are evaluated using a consistency-based subset evaluation measure. An instance-based learning algorithm is then applied to the resulting feature dataset to remove instances that are redundant or irrelevant for classification. The model built with a naïve Bayesian classifier from the reduced features and instances is validated with tenfold cross-validation.
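
As an illustration of this pipeline, the sketch below pairs a consistency-based subset evaluation measure with a random global search over feature subsets in Python, assuming a discretised feature matrix. The function names (consistency, random_global_search), the subset size and the iteration budget are illustrative assumptions rather than the authors' implementation; the local refinement used in the GLO, RLO and RGLO variants is only indicated in a comment.

import numpy as np
from collections import Counter, defaultdict


def consistency(X, y, subset):
    """Consistency of a feature subset: 1 minus the inconsistency rate.

    Instances that share the same values on `subset` should ideally share
    the same class; every instance beyond the majority class within such a
    group counts as an inconsistency.
    """
    groups = defaultdict(list)
    for row, label in zip(X[:, subset], y):
        groups[tuple(row)].append(label)
    inconsistent = sum(len(labels) - Counter(labels).most_common(1)[0][1]
                       for labels in groups.values())
    return 1.0 - inconsistent / len(y)


def random_global_search(X, y, n_iter=200, subset_size=20, seed=0):
    """Keep the best-scoring subset among randomly drawn ones (the global
    optimum of the random population). A local refinement, as in the
    GLO/RLO/RGLO variants, could instead perturb the current best subset."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    best_subset, best_score = None, -np.inf
    for _ in range(n_iter):
        subset = rng.choice(n_features, size=subset_size, replace=False)
        score = consistency(X, y, subset)
        if score > best_score:
            best_subset, best_score = subset, score
    return best_subset, best_score


if __name__ == "__main__":
    # Toy discretised matrix standing in for a gene-expression dataset.
    rng = np.random.default_rng(42)
    X = rng.integers(0, 3, size=(100, 200))
    y = rng.integers(0, 2, size=100)
    subset, score = random_global_search(X, y, subset_size=10)
    print(f"selected {len(subset)} features, consistency = {score:.3f}")

A local-search variant could perturb the best subset found so far (swapping a few features in and out) instead of drawing a fresh random subset each iteration; the combined RGLO strategy described above would interleave both kinds of moves.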

Results: Classification accuracy based on the RGLO search algorithm using the naïve Bayesian classifier is 94.82% for the Breast, 97.4% for the DLBCL, 98.83% for the SRBCT and 98.89% for the Leukemia datasets.
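
These accuracies come from tenfold cross-validation of the naïve Bayesian classifier on the reduced data; a minimal sketch of that validation step, using scikit-learn's GaussianNB and cross_val_score, is shown below. The X_reduced and y_reduced arrays are toy placeholders, not the Breast, DLBCL, SRBCT or Leukemia datasets.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Toy placeholders standing in for the feature- and instance-reduced dataset;
# they do not reproduce the paper's data or results.
rng = np.random.default_rng(0)
X_reduced = rng.normal(size=(120, 10))
y_reduced = rng.integers(0, 2, size=120)

# Tenfold cross-validated accuracy of the naive Bayesian model.
scores = cross_val_score(GaussianNB(), X_reduced, y_reduced, cv=10)
print(f"mean 10-fold accuracy: {scores.mean():.4f}")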

Conclusion: The features reduced by the RGLO search algorithm yield a higher prediction rate with less computational time than the complete dataset and the other proposed subset generation algorithms.

Keywords: Particle swarm optimization, tabu search, differential evolution, instance-based learning, naïve Bayesian classifier, cancer classification.


Rights & Permissions Print Cite
© 2024 Bentham Science Publishers | Privacy Policy