Voting Scheme for Nearest Neighbors Based on Different Distance Metrics

Gede Angga Pradipta, Made Liandana, Putu Desiana Wulaning Ayu, Dandy Pramana Hostiadi, Putu Sumardika Eka Putra

Abstract


K-Nearest Neighbor (KNN) is a widely used method for both classification and regression. The algorithm, known for its simplicity and effectiveness, typically relies on the Euclidean formula as its distance metric. This study therefore developed a voting model in which observations were scored using several different distance formulas. The nearest-neighbor algorithm was split into variants according to the distance measure used, and each resulting model contributed a vote toward the final class. Three methods were combined under this voting scheme: k-nearest neighbors (KNN), local mean-based KNN (LMKNN), and distance-weighted KNN (DWKNN). The robustness of these models was tested on umbilical cord data characterized by class imbalance and a small dataset size. The results showed that the proposed voting model consistently improved performance by an average of 1-2% in accuracy, precision, recall, and F1 score compared with the conventional non-voting methods.
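
To make the voting idea concrete, the following is a minimal Python sketch, not the authors' implementation: several nearest-neighbor classifiers, each using a different distance metric, cast hard votes and the majority decides the final class. It assumes scikit-learn and the Iris dataset purely for illustration; the paper's own variants (plain KNN, LMKNN, DWKNN) and its umbilical cord data are not reproduced here.

    # Illustrative sketch of metric-based voting with scikit-learn.
    # Assumptions: scikit-learn is installed; Iris stands in for the
    # paper's umbilical cord dataset; k=5 is an arbitrary choice.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # One KNN per distance metric; the paper additionally varies the
    # neighbor rule itself (plain KNN, local mean-based KNN, DWKNN).
    voter = VotingClassifier(
        estimators=[
            ("euclidean", KNeighborsClassifier(n_neighbors=5, metric="euclidean")),
            ("manhattan", KNeighborsClassifier(n_neighbors=5, metric="manhattan")),
            # Minkowski with p=3 as a third, distinct metric.
            ("minkowski", KNeighborsClassifier(n_neighbors=5, metric="minkowski", p=3)),
        ],
        voting="hard",  # majority vote over the three predicted labels
    )
    voter.fit(X_train, y_train)
    print("voting accuracy:", voter.score(X_test, y_test))

Setting weights="distance" on an estimator would roughly approximate DWKNN's distance-weighted voting; scikit-learn has no built-in local mean-based KNN, so that variant would need a custom estimator.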


Keywords


KNN, Euclidean, Manhattan, Minkowski, Voting




DOI: 10.30595/juita.v11i2.19298



This work is licensed under a Creative Commons Attribution 4.0 International License.

ISSN: 2579-8901