Binary grey wolf optimizer with a novel population adaptation strategy for feature selection

Bibliographic Details
Main Authors: Huang, M. (Author), Ji, Y. (Author), Wang, D. (Author), Wang, H. (Author)
Format: Article
Language: English
Published: John Wiley and Sons Inc 2023
Subjects:
Online Access: View Fulltext in Publisher
View in Scopus
LEADER 02950nam a2200409Ia 4500
001 10.1049-cth2.12498
008 230529s2023 CNT 000 0 und d
020 |a 17518644 (ISSN) 
245 1 0 |a Binary grey wolf optimizer with a novel population adaptation strategy for feature selection 
260 0 |b John Wiley and Sons Inc  |c 2023 
856 |z View Fulltext in Publisher  |u https://doi.org/10.1049/cth2.12498 
856 |z View in Scopus  |u https://www.scopus.com/inward/record.uri?eid=2-s2.0-85159713212&doi=10.1049%2fcth2.12498&partnerID=40&md5=f3dd935fb1df634d87737d3a7aba853f 
520 3 |a Feature selection is a fundamental pre-processing step in machine learning that aims to reduce the dimensionality of a dataset by selecting the most effective features from the original set. This process is regarded as a combinatorial optimization problem, and the grey wolf optimizer (GWO), a recent meta-heuristic algorithm, has gained popularity in feature selection due to its fast convergence and easy implementation. In this paper, an improved binary GWO algorithm incorporating a novel Population Adaptation strategy, called PA-BGWO, is proposed. PA-BGWO tailors three strategies to the characteristics of the feature selection problem: an adaptive individual update procedure that enhances exploitation and accelerates convergence; a head-wolf fine-tuning mechanism that exploits the impact of each individual feature on the objective function; and a ReliefF-based filter method that computes feature weights and dynamically adjusts mutation probabilities according to the feature ranking, helping the search escape local optima. Experimental comparisons with several state-of-the-art feature selection methods on 15 classification problems demonstrate that the proposed approach selects a small feature subset with higher classification accuracy in most cases. © 2023 The Authors. IET Control Theory & Applications published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology. 
650 0 4 |a Adaptation strategies 
650 0 4 |a Classification (of information) 
650 0 4 |a Combinatorial optimization 
650 0 4 |a Combinatorial optimization problems 
650 0 4 |a Dynamic mutation 
650 0 4 |a Feature selection 
650 0 4 |a Grey wolf optimizer 
650 0 4 |a Gray wolves 
650 0 4 |a Heuristic algorithms 
650 0 4 |a Machine learning 
650 0 4 |a Optimizers 
650 0 4 |a Population adaptation strategy 
650 0 4 |a Pre-processing step 
700 1 0 |a Huang, M.  |e author 
700 1 0 |a Ji, Y.  |e author 
700 1 0 |a Wang, D.  |e author 
700 1 0 |a Wang, H.  |e author 
773 |t IET Control Theory and Applications
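For orientation, the following Python sketch shows a generic binary grey wolf optimizer (BGWO) wrapper for feature selection, the kind of baseline the abstract builds on. It is an illustrative approximation only: it uses the standard alpha/beta/delta position update with a sigmoid transfer function and a simple weighted error-plus-subset-size fitness, and it does not implement the paper's PA-BGWO components (population adaptation, head-wolf fine-tuning, or ReliefF-guided dynamic mutation). The dataset, the KNN classifier, the weight alpha_w = 0.99, and the greedy replacement step are all assumptions chosen for the example.

# Generic binary grey wolf optimizer (BGWO) for wrapper feature selection.
# Illustrative sketch only -- NOT the paper's PA-BGWO.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def fitness(mask, X, y, alpha_w=0.99):
    """Weighted sum of 5-fold KNN error and selected-feature ratio (lower is better)."""
    if mask.sum() == 0:                       # an empty subset cannot be evaluated
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha_w * (1.0 - acc) + (1.0 - alpha_w) * mask.mean()

def transfer(x):
    """Sigmoid transfer function mapping continuous positions to [0, 1]."""
    return 1.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))

def bgwo(X, y, n_wolves=10, n_iter=20):
    n_feat = X.shape[1]
    wolves = rng.integers(0, 2, size=(n_wolves, n_feat)).astype(float)
    scores = np.array([fitness(w, X, y) for w in wolves])

    for t in range(n_iter):
        order = np.argsort(scores)
        alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]
        a = 2.0 - 2.0 * t / n_iter            # coefficient decreases linearly 2 -> 0

        for i in range(n_wolves):
            guided = []
            for leader in (alpha, beta, delta):
                A = a * (2.0 * rng.random(n_feat) - 1.0)
                C = 2.0 * rng.random(n_feat)
                D = np.abs(C * leader - wolves[i])
                guided.append(leader - A * D)
            x_cont = np.mean(guided, axis=0)  # average of the three leader-guided moves
            candidate = (transfer(x_cont) > rng.random(n_feat)).astype(float)
            cand_score = fitness(candidate, X, y)
            if cand_score < scores[i]:        # greedy replacement (an assumption here)
                wolves[i], scores[i] = candidate, cand_score

    best = int(np.argmin(scores))
    return wolves[best].astype(bool), scores[best]

if __name__ == "__main__":
    data = load_breast_cancer()
    mask, score = bgwo(data.data, data.target)
    print(f"selected {mask.sum()}/{mask.size} features, fitness = {score:.4f}")

The greedy per-wolf replacement used above is a common BGWO variant rather than the canonical GWO update, in which every wolf always moves to its newly computed position.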