Summary: | Master's thesis === National Kaohsiung University of Applied Sciences === Master's Program, Graduate Institute of Electronic and Information Engineering === 94 === Feature selection can be viewed as a global combinatorial optimization problem in machine learning: it reduces the number of features and removes irrelevant, noisy, and redundant data while maintaining acceptable classification accuracy. Feature selection is of great importance in data analysis, information retrieval, pattern classification, and data mining applications.
Therefore, a good feature selection method for sample classification is needed to reduce the number of features investigated, thereby speeding up processing, improving predictive accuracy, and avoiding incomprehensible models. In this thesis, we propose a nested binary particle swarm optimization (NBPSO) that combines a global optimization search with a local optimization search for feature selection: an inner BPSO serves as the local optimizer and is invoked each time the outer BPSO, acting as the global optimizer, has been run for a single generation. The K-nearest neighbor (K-NN) classifier with leave-one-out cross-validation (LOOCV) serves as the evaluator of the fitness function. Experimental results show that the method reduces the feature set effectively and, compared with other feature selection methods, either achieves higher classification accuracy or uses fewer features.
|
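The sketch below illustrates the kind of fitness evaluation the abstract describes: each BPSO particle's position is a binary mask over the features, and the mask is scored by the LOOCV accuracy of a K-NN classifier trained on the selected features. This is a minimal illustration under stated assumptions, not the thesis's actual implementation; the helper name `loocv_knn_fitness`, the use of scikit-learn, the neighbor count `k`, and the toy data are all assumptions introduced here.

```python
# Minimal sketch (assumption): fitness of a binary feature mask, scored by
# K-NN accuracy under leave-one-out cross-validation, as used to guide BPSO.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score


def loocv_knn_fitness(mask, X, y, k=1):
    """Return the LOOCV accuracy of K-NN on the features selected by `mask`."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():                       # an empty subset cannot classify
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=k)
    scores = cross_val_score(knn, X[:, mask], y, cv=LeaveOneOut())
    return scores.mean()


if __name__ == "__main__":
    # Example: evaluate one particle's position (a random binary mask) on toy data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 8))             # 30 samples, 8 candidate features
    y = (X[:, 0] + X[:, 3] > 0).astype(int)  # labels depend on features 0 and 3
    mask = rng.integers(0, 2, size=8)        # one BPSO particle's binary position
    print("selected features:", np.flatnonzero(mask))
    print("LOOCV K-NN accuracy:", loocv_knn_fitness(mask, X, y, k=3))
```

In a full NBPSO loop, the outer (global) BPSO would update such masks swarm-wide each generation, and the inner (local) BPSO would refine promising masks using the same fitness function.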