An Effective Multi-Label Feature Selection Model Towards Eliminating Noisy Features
A great deal of effort has been devoted to feature selection for dimension reduction in various machine learning tasks. Existing feature selection models focus on selecting the features most discriminative for the learning targets. However, this strategy handles two kinds of features poorly,...
Main Authors: | Jun Wang, Yuanyuan Xu, Hengpeng Xu, Zhe Sun, Zhenglu Yang, Jinmao Wei |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-11-01 |
Series: | Applied Sciences |
Subjects: | feature selection; noise elimination; space consistency; label correlations |
Online Access: | https://www.mdpi.com/2076-3417/10/22/8093 |
id |
doaj-5d251b1ab71248ecb1006c27b811dc38 |
---|---|
record_format |
Article |
spelling |
doaj-5d251b1ab71248ecb1006c27b811dc38 | 2020-11-25T04:01:05Z | eng | MDPI AG | Applied Sciences | ISSN 2076-3417 | 2020-11-01 | Vol. 10, Issue 22, Article 8093 | doi:10.3390/app10228093
An Effective Multi-Label Feature Selection Model Towards Eliminating Noisy Features
Jun Wang, College of Mathematics and Statistics Science, Ludong University, Yantai 264025, China
Yuanyuan Xu, College of Computer Science, Nankai University, Tianjin 300071, China
Hengpeng Xu, Tianjin Key Laboratory of Wireless Mobile Communications and Power Transmission, College of Electronic and Communication Engineering, Tianjin Normal University, Tianjin 300387, China
Zhe Sun, RIKEN National Science Institute, Wako, Saitama 351-0198, Japan
Zhenglu Yang, College of Computer Science, Nankai University, Tianjin 300071, China
Jinmao Wei, College of Computer Science, Nankai University, Tianjin 300071, China |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Jun Wang, Yuanyuan Xu, Hengpeng Xu, Zhe Sun, Zhenglu Yang, Jinmao Wei |
author_sort |
Jun Wang |
title |
An Effective Multi-Label Feature Selection Model Towards Eliminating Noisy Features |
publisher |
MDPI AG |
series |
Applied Sciences |
issn |
2076-3417 |
publishDate |
2020-11-01 |
description |
A great deal of effort has been devoted to feature selection for dimension reduction in various machine learning tasks. Existing feature selection models focus on selecting the features most discriminative for the learning targets. However, this strategy handles two kinds of features poorly, namely the irrelevant and the redundant ones, which are collectively referred to as noisy features. Such features may hamper the construction of optimal low-dimensional subspaces and compromise the learning performance of downstream tasks. In this study, we propose a novel multi-label feature selection approach that embeds label correlations (dubbed ELC) to address these issues. In particular, we extract label correlations to obtain reliable label-space structures and employ them to steer feature selection. In this way, the label and feature spaces can be expected to be consistent, and noisy features can be effectively eliminated. An extensive experimental evaluation on public benchmarks validated the superiority of ELC. |
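The following is a minimal Python sketch of the general idea summarized in the description, not the paper's actual ELC formulation: feature-label relevance is propagated through a label-correlation matrix so that features inconsistent with the label-space structure (the noisy ones) rank low. The function names, the cosine/Pearson choices, and the scoring rule are illustrative assumptions only.

# Minimal sketch, assuming a generic correlation-guided scoring rule;
# not the paper's ELC algorithm.
import numpy as np

def label_correlation(Y):
    # Cosine similarity between label columns of a binary label matrix Y (n_samples x n_labels).
    Y = Y.astype(float)
    norms = np.linalg.norm(Y, axis=0) + 1e-12
    return (Y.T @ Y) / np.outer(norms, norms)

def feature_label_relevance(X, Y):
    # Absolute Pearson correlation between each feature and each label (n_features x n_labels).
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    cov = Xc.T @ Yc / X.shape[0]
    denom = np.outer(X.std(axis=0) + 1e-12, Y.std(axis=0) + 1e-12)
    return np.abs(cov / denom)

def score_features(X, Y):
    # Propagate direct feature-label relevance through label correlations;
    # higher scores indicate features more consistent with the label space.
    C = label_correlation(Y)
    R = feature_label_relevance(X, Y)
    return (R @ C).sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))                      # 100 samples, 20 features
    Y = (rng.random(size=(100, 4)) < 0.3).astype(int)   # 4 binary labels
    top5 = np.argsort(score_features(X, Y))[::-1][:5]
    print("top-5 feature indices:", top5)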
topic |
feature selection; noise elimination; space consistency; label correlations |
url |
https://www.mdpi.com/2076-3417/10/22/8093 |