A Survey of Classification Methods and Techniques for Improving Classification Performance
M. Balasaraswathi, A. Uthiramoorthy
Section: Survey Paper, Product Type: Journal Paper
Volume-7, Issue-8, Page no. 233-240, Aug-2019
CrossRef-DOI: https://doi.org/10.26438/ijcse/v7i8.233240
Published online on Aug 31, 2019
Copyright © M. Balasaraswathi, A. Uthiramoorthy. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
How to Cite this Paper
IEEE Style Citation: M. Balasaraswathi, A. Uthiramoorthy, “A Survey of Classification Methods and Techniques for Improving Classification Performance,” International Journal of Computer Sciences and Engineering, Vol.7, Issue.8, pp.233-240, 2019.
MLA Style Citation: M. Balasaraswathi, A. Uthiramoorthy. "A Survey of Classification Methods and Techniques for Improving Classification Performance." International Journal of Computer Sciences and Engineering 7.8 (2019): 233-240.
APA Style Citation: M. Balasaraswathi, A. Uthiramoorthy (2019). A Survey of Classification Methods and Techniques for Improving Classification Performance. International Journal of Computer Sciences and Engineering, 7(8), 233-240.
BibTex Style Citation:
@article{Balasaraswathi_2019,
author = {M. Balasaraswathi and A. Uthiramoorthy},
title = {A Survey of Classification Methods and Techniques for Improving Classification Performance},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {August 2019},
volume = {7},
issue = {8},
month = {August},
year = {2019},
issn = {2347-2693},
pages = {233-240},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=4815},
doi = {10.26438/ijcse/v7i8.233240},
publisher = {IJCSE, Indore, INDIA},
}
RIS Style Citation:
TY - JOUR
DO - 10.26438/ijcse/v7i8.233240
UR - https://www.ijcseonline.org/full_paper_view.php?paper_id=4815
TI - A Survey of Classification Methods and Techniques for Improving Classification Performance
T2 - International Journal of Computer Sciences and Engineering
AU - Balasaraswathi, M.
AU - Uthiramoorthy, A.
PY - 2019
DA - 2019/08/31
PB - IJCSE, Indore, INDIA
SP - 233
EP - 240
IS - 8
VL - 7
SN - 2347-2693
ER -
Abstract
This paper surveys the state-of-the-art techniques that were reviewed to develop the overall classification methodology of this research work. Feature selection methods and traditional classification algorithms are summarized, followed by a brief description of theoretical work on data mining. The major classification approaches and the techniques used to improve classification performance are analyzed, and some important issues affecting classification performance are discussed. This review of existing work in the area of classification allows a fair evaluation of the progress made in the field.
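Illustrative example (not from the paper itself): a minimal Python sketch, assuming the scikit-learn library, of the kind of pipeline the surveyed works build, in which a filter-style feature selection step precedes a traditional classifier and performance is estimated by cross-validation. The dataset, the number of selected features, and the choice of classifier are assumptions made purely for demonstration.

from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Illustrative dataset only; the surveyed papers use their own benchmarks.
X, y = load_breast_cancer(return_X_y=True)

# Filter-style feature selection (keep the 10 features with the highest
# ANOVA F-score) followed by a traditional k-nearest-neighbour classifier.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif, k=10)),
    ("classify", KNeighborsClassifier(n_neighbors=5)),
])

# Classification performance estimated as mean 5-fold cross-validation accuracy.
scores = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy")
print("Mean 5-fold accuracy: %.3f" % scores.mean())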
Key-Words / Index Term
Machine Learning, Feature Selection Methods, Classification