An Improvement of Genetic Algorithm for Machine Learning Hyperparameter Optimization



Bibliographic Details
Main Authors: Liang, Jyun-Shun, 梁竣順
Other Authors: Wang, Ker-Win
Format: Others
Language: zh-TW
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/w9pwy7
Description
Summary: Master's thesis === National Changhua University of Education === Department of Mechatronics Engineering === 107 === Intelligent machinery serves as one of the cornerstones of Industry 4.0. It integrates sensory control systems, artificial intelligence, and the Internet of Things (IoT) for machine learning-based sensor data processing, advanced analysis, status identification, automatic parameter setting, and fault prognosis. Developing an effective algorithm to optimize the hyperparameters of machine learning models is therefore crucial for cost-effective training and integration of intelligent machines. In this study, we develop newly refined genetic algorithm operators to optimize hyperparameters and thereby improve the accuracy and effectiveness of learning models. The resulting model is applied to process-condition monitoring, recognizing vibration states through a self-made audio spectrum extraction circuit. We design and improve five genetic operators: two selection operators, two crossover operators, and one fission mutation operator. The fission mutation operator is inspired by genetic mutation during the binary fission of cells. All of these refined operators contribute to improving the optimization process.

The various combinations of the refined operators are investigated and evaluated with (1) benchmark functions and (2) hyperparameter optimization effectiveness, with the aim of reducing the cost of machine learning. To examine the performance of these operators, we recombine them into eight different genetic algorithm variants. Each algorithm is tested on three benchmark functions, with ten runs per function. The total number of function evaluations, the time cost, and the number of generations required are recorded and compared. This study also examines the survival rates of offspring generated by the mutation and fission operators during convergence, to confirm the contribution of each operator.

For hyperparameter optimization in machine learning, machining-state recognition on a desktop CNC machine is used as a practical example. The improved genetic algorithms are applied to this hyperparameter optimization problem, and the accuracy of the state recognition results is used to compare the variants and to verify the effectiveness of the improvements.
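The record above does not reproduce the thesis's operator definitions, so the following is only a minimal Python sketch of the kind of pipeline the abstract describes: a genetic algorithm that tunes hyperparameters using a selection operator, a crossover operator, and a "fission-style" mutation in which one individual splits into two independently perturbed offspring. The search space, the specific choices of tournament selection and uniform crossover, the fission operator's perturbation scheme, and the sphere-like stand-in fitness are all illustrative assumptions, not the thesis's actual design.

```python
import random

# Hypothetical hyperparameter search space for a machining-state classifier;
# the thesis's actual parameters and ranges are not given in this record.
SEARCH_SPACE = {
    "learning_rate": (1e-4, 1e-1),
    "hidden_units":  (8, 256),    # integer parameters would need rounding; floats keep the sketch simple
    "dropout":       (0.0, 0.5),
}

def random_individual():
    """Sample one hyperparameter set uniformly from the search space."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SEARCH_SPACE.items()}

def tournament_selection(population, fitnesses, k=3):
    """One possible selection operator: return the best of k random candidates."""
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]

def uniform_crossover(parent_a, parent_b):
    """One possible crossover operator: inherit each gene from either parent with probability 0.5."""
    return {k: (parent_a[k] if random.random() < 0.5 else parent_b[k])
            for k in SEARCH_SPACE}

def fission_mutation(parent, sigma=0.1):
    """Fission-style mutation as interpreted here: one individual splits into two
    offspring, each perturbed independently, loosely mimicking mutation during
    binary fission. The thesis's exact operator may differ."""
    offspring = []
    for _ in range(2):
        child = {}
        for k, (lo, hi) in SEARCH_SPACE.items():
            span = hi - lo
            child[k] = min(hi, max(lo, parent[k] + random.gauss(0.0, sigma * span)))
        offspring.append(child)
    return offspring

def evaluate(params):
    """Stand-in fitness. The thesis evaluates (1) benchmark functions and
    (2) validation accuracy of the trained recognition model; here a simple
    sphere-like benchmark over normalized values keeps the sketch runnable."""
    score = 0.0
    for k, (lo, hi) in SEARCH_SPACE.items():
        x = (params[k] - lo) / (hi - lo)   # normalize to [0, 1]
        score -= (x - 0.5) ** 2            # optimum at the midpoint
    return score

def genetic_search(pop_size=20, generations=30):
    """Run the GA and return the best (fitness, hyperparameters) pair found."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        fitnesses = [evaluate(ind) for ind in population]
        next_pop = []
        while len(next_pop) < pop_size:
            a = tournament_selection(population, fitnesses)
            b = tournament_selection(population, fitnesses)
            child = uniform_crossover(a, b)
            next_pop.extend(fission_mutation(child))   # fission yields two offspring
        population = next_pop[:pop_size]
    fitnesses = [evaluate(ind) for ind in population]
    return max(zip(fitnesses, population), key=lambda t: t[0])
```

A call such as `best_score, best_params = genetic_search()` would return the best hyperparameter set found; swapping different selection, crossover, or mutation functions into `genetic_search` is one way the eight operator combinations described in the abstract could be assembled and compared.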