An Effective Multi-Label Feature Selection Model Towards Eliminating Noisy Features


Bibliographic Details
Main Authors: Jun Wang, Yuanyuan Xu, Hengpeng Xu, Zhe Sun, Zhenglu Yang, Jinmao Wei
Format: Article
Language: English
Published: MDPI AG 2020-11-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/10/22/8093
Description
Summary: Considerable effort has been devoted to feature selection for dimension reduction in various machine learning tasks. Existing feature selection models focus on selecting the most discriminative features for the learning targets. However, this strategy is weak in handling two kinds of features, namely irrelevant and redundant ones, which are collectively referred to as noisy features. These features may hamper the construction of optimal low-dimensional subspaces and compromise the learning performance of downstream tasks. In this study, we propose a novel multi-label feature selection approach that embeds label correlations (dubbed ELC) to address these issues. In particular, we extract label correlations to capture reliable label space structures and employ them to steer feature selection. In this way, the label and feature spaces can be expected to be consistent, and noisy features can be effectively eliminated. An extensive experimental evaluation on public benchmarks validated the superiority of ELC.
ISSN: 2076-3417
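
The record only summarizes ELC at a high level, so the sketch below is not the authors' method; it merely illustrates the general idea the abstract describes, namely using label correlations to steer feature ranking so that features inconsistent with the label-space structure fall to the bottom. The cosine-similarity correlation measure, the correlation-smoothed label matrix, and the function names (label_correlation, elc_style_scores, select_features) are all assumptions introduced for illustration.

import numpy as np

def label_correlation(Y):
    """Cosine similarity between label columns (an assumed proxy for label correlations)."""
    Yn = Y / (np.linalg.norm(Y, axis=0, keepdims=True) + 1e-12)
    return Yn.T @ Yn

def elc_style_scores(X, Y):
    """Score each feature by its alignment with a correlation-smoothed label space.

    This is not the ELC objective from the paper; it only illustrates steering
    feature selection with label correlations.
    """
    C = label_correlation(Y)          # label-label correlation matrix (L x L)
    Y_smooth = Y @ C                  # propagate correlations across labels
    Xc = X - X.mean(axis=0)
    Yc = Y_smooth - Y_smooth.mean(axis=0)
    Xn = Xc / (np.linalg.norm(Xc, axis=0, keepdims=True) + 1e-12)
    Yn = Yc / (np.linalg.norm(Yc, axis=0, keepdims=True) + 1e-12)
    # Feature relevance: strongest absolute correlation with any smoothed label.
    return np.abs(Xn.T @ Yn).max(axis=1)

def select_features(X, Y, k):
    """Keep the k highest-scoring features; low-scoring (noisy) features are dropped."""
    scores = elc_style_scores(X, Y)
    return np.argsort(scores)[::-1][:k]

# Toy usage on random data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))                    # 100 samples, 20 features
Y = (rng.random((100, 5)) > 0.5).astype(float)    # 5 binary labels
print(select_features(X, Y, k=5))

In this toy version, features that track none of the correlation-smoothed labels receive low scores and are discarded, which is one plausible reading of "eliminating noisy features" in the abstract; the actual ELC formulation should be taken from the article itself.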