Supervised Feature Selection based on the Law of Total Variance

Authors

  • Nur Atiqah Mustapa Centre for Mathematical Sciences, Universiti Malaysia Pahang Al-Sultan Abdullah, Lebuhraya Persiaran Tun Khalil Yaakob, 26300 Gambang, Kuantan, Pahang, Malaysia.
  • Azlyna Senawi Centre for Mathematical Sciences, Universiti Malaysia Pahang Al-Sultan Abdullah, Lebuhraya Persiaran Tun Khalil Yaakob, 26300 Gambang, Kuantan, Pahang, Malaysia.
  • Hua-Liang Wei Department of Automatic Control and Systems Engineering, University of Sheffield, Sheffield, S1 3JD, United Kingdom.

DOI:

https://doi.org/10.15282/mekatronika.v5i2.9998

Keywords:

Correlation-based measure, dimensionality reduction, feature selection, law of total variance, classification

Abstract

Feature selection is a fundamental pre-processing step in machine learning that reduces data dimensionality by removing redundant and irrelevant features. This study proposes a supervised feature selection method based on feature relevance, employing the law of total variance (LTV). Specifically, the LTV is used to quantify the relevance of features by analysing the association between each feature and the class label. Six classifiers were employed to evaluate the performance and reliability of the proposed method in terms of classification accuracy. The results show that a feature subset selected by the proposed method can achieve classification accuracy comparable to that of the full feature set while retaining only half or fewer of the original features. The proposed method also proved versatile, achieving adequate classification accuracy with all six classifiers despite their different learning schemes. In addition, a comparison with a similar type of feature selection method (AmRMR) shows that the proposed method yields more accurate classification.
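To make the idea concrete, the law of total variance decomposes a feature's variance as Var(X) = E[Var(X|Y)] + Var(E[X|Y]), where Y is the class label. A natural relevance score is the fraction of a feature's variance explained by the class, Var(E[X|Y]) / Var(X) (the correlation ratio). The sketch below, a minimal illustration rather than the paper's exact implementation (the function names `ltv_relevance` and `rank_features` are assumptions for illustration), scores each feature this way and ranks them:

```python
import numpy as np

def ltv_relevance(x, y):
    """Score one feature via the law of total variance:
    Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
    Returns the explained-variance ratio Var(E[X|Y]) / Var(X),
    a value in [0, 1] used here as a relevance proxy.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y)
    total_var = x.var()
    if total_var == 0:
        return 0.0  # a constant feature carries no class information
    classes = np.unique(y)
    # Class-conditional means and class frequencies
    class_means = np.array([x[y == c].mean() for c in classes])
    weights = np.array([(y == c).mean() for c in classes])
    # Variance of the conditional means, weighted by class frequency
    between_var = np.sum(weights * (class_means - x.mean()) ** 2)
    return between_var / total_var

def rank_features(X, y):
    """Rank the columns of X by descending LTV relevance."""
    scores = np.array([ltv_relevance(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores
```

A feature whose class-conditional means differ strongly (relative to its overall spread) scores near 1, while pure noise scores near 0; keeping only the top-ranked half of the features then mirrors the subset-size regime evaluated in the abstract.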

Published

2023-12-28

How to Cite

[1]
N. A. Mustapa, A. Senawi, and H.-L. Wei, “Supervised Feature Selection based on the Law of Total Variance”, MEKATRONIKA, vol. 5, no. 2, pp. 100–110, Dec. 2023.

Issue

Section

Original Article