A Feature Subset Selection Method Based On High-Dimensional Mutual Information

Bibliographic Details
Main Authors: Chee Keong Kwoh, Yun Zheng
Format: Article
Language: English
Published: MDPI AG 2011-04-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/13/4/860/
Description
Summary: Feature selection is an important step in building accurate classifiers and provides a better understanding of the data sets. In this paper, we propose a feature subset selection method based on high-dimensional mutual information. We also propose to use the entropy of the class attribute as a criterion to determine the appropriate subset of features when building classifiers. We prove that if the mutual information between a feature set X and the class attribute Y equals the entropy of Y, then X is a Markov Blanket of Y. We show that in some cases it is infeasible to approximate the high-dimensional mutual information with algebraic combinations of pairwise mutual information in any form. In addition, an exhaustive search over all combinations of features is a prerequisite for finding the optimal feature subsets for classifying these kinds of data sets. We show that our approach outperforms existing filter feature subset selection methods on most of the 24 selected benchmark data sets.
ISSN:1099-4300
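
The selection criterion stated in the abstract, I(X; Y) = H(Y), can be illustrated with a small sketch. The Python code below is not the authors' published implementation: the function names (entropy, joint_mutual_information, select_features), the plug-in frequency estimator, and the subset search strategy are assumptions made purely for illustration. It estimates entropies from empirical frequencies and examines feature subsets of increasing size until the joint mutual information between the selected set and the class reaches the class entropy.

import numpy as np
from collections import Counter
from itertools import combinations

def entropy(values):
    # Shannon entropy in bits of a sequence of discrete (hashable) values.
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def joint_mutual_information(X, y):
    # I(X; Y) for discrete features X (n_samples x k) and labels y, estimated
    # from empirical frequencies via I(X; Y) = H(X) + H(Y) - H(X, Y).
    rows = [tuple(r) for r in X]
    labels = [int(v) for v in y]
    h_x = entropy(rows)
    h_y = entropy(labels)
    h_xy = entropy([r + (v,) for r, v in zip(rows, labels)])
    return h_x + h_y - h_xy

def select_features(X, y, tol=1e-9):
    # Examine subsets of increasing size; return the first subset S whose joint
    # mutual information with the class reaches H(Y), i.e. I(X_S; Y) = H(Y).
    n_features = X.shape[1]
    h_y = entropy([int(v) for v in y])
    for size in range(1, n_features + 1):
        for subset in combinations(range(n_features), size):
            if joint_mutual_information(X[:, list(subset)], y) >= h_y - tol:
                return list(subset)
    return list(range(n_features))  # no proper subset reaches H(Y)

if __name__ == "__main__":
    # Class is the XOR of the first two (binary) features, so no single feature
    # carries information about the class, yet the pair {0, 1} determines it exactly.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 4))
    y = X[:, 0] ^ X[:, 1]
    print(select_features(X, y))  # [0, 1]

The XOR example mirrors the abstract's point about pairwise approximations: neither relevant feature carries mutual information with the class on its own, so only the joint (high-dimensional) criterion evaluated over combinations of features recovers the pair. The subset search is exponential in the number of features, which is consistent with the abstract's observation that exhaustive search is a prerequisite for such data sets.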