Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update

Bibliographic Details
Main Authors: Bo-Wei Lin, 林柏維
Other Authors: Huan Chen
Format: Others
Language: zh-TW
Published: 2019
Online Access:http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394029%22.&searchmode=basic
id ndltd-TW-107NCHU5394029
record_format oai_dc
spelling ndltd-TW-107NCHU5394029 2019-11-30T06:09:40Z http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394029%22.&searchmode=basic Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update 以批次更新改良基於啟發式演算法的深度學習最佳化器 Bo-Wei Lin 林柏維 Master's thesis, National Chung Hsing University, Department of Computer Science and Engineering, 107 Huan Chen Chun-Wei Tsai 陳煥 蔡崇煒 2019 Thesis ; 66 zh-TW
collection NDLTD
language zh-TW
format Others
sources NDLTD
description Master's === National Chung Hsing University === Department of Computer Science and Engineering === 107 === Deep learning can discover the hidden rules of a given problem, so it is often applied in research fields such as classification, prediction, and image recognition. To do so, a deep learning model continuously adjusts its weights and biases during training so that it approximates a function close to the rule of the original problem. Because the effectiveness and efficiency of adjusting these parameters depend on the mechanism of the optimizer, the choice of optimizer determines how the approximating function is learned. One commonly used optimizer is backpropagation. Besides plain backpropagation, there are also hybrid methods that combine backpropagation with a metaheuristic algorithm. Metaheuristics are a family of search algorithms for finding optimal solutions in a solution space, and by nature they outperform backpropagation in global search. Within deep learning, a metaheuristic is good at locating promising solutions globally, while backpropagation performs well in local search. By combining the two, the hybrid metaheuristic-backpropagation optimizer outperforms either method alone in both global and local search. However, its training time is too long. This thesis reduces the training time by applying batch updates to regression problems. In the experiments, batch update achieves slightly better accuracy than the original method on complicated problems, and training time is reduced to 87.74% of the original cost.
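
The following is a minimal sketch of the kind of hybrid optimizer the description refers to: a population-based global search (a simple random-perturbation search standing in for the metaheuristic) combined with a backpropagation step for local refinement, with both evaluated on mini-batches rather than the full training set. The network, hyperparameters, and names such as hybrid_train and backprop_step are illustrative assumptions, not the thesis's actual implementation.

# Minimal sketch (assumed, not the thesis's exact algorithm) of a hybrid
# metaheuristic + backpropagation optimizer that scores candidates on
# mini-batches instead of the whole training set.
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in, n_hidden):
    return {
        "W1": rng.normal(0, 0.5, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.5, (n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])           # hidden activations
    return h, h @ p["W2"] + p["b2"]               # regression output

def mse(p, X, y):
    _, out = forward(p, X)
    return float(np.mean((out - y) ** 2))

def backprop_step(p, X, y, lr=0.05):
    h, out = forward(p, X)
    err = 2.0 * (out - y) / len(X)                # d(MSE)/d(out)
    grad_W2 = h.T @ err
    grad_b2 = err.sum(axis=0)
    dh = (err @ p["W2"].T) * (1.0 - h ** 2)       # tanh derivative
    grad_W1 = X.T @ dh
    grad_b1 = dh.sum(axis=0)
    for k, g in zip(("W1", "b1", "W2", "b2"), (grad_W1, grad_b1, grad_W2, grad_b2)):
        p[k] = p[k] - lr * g
    return p

def hybrid_train(X, y, batch_size=32, epochs=50, pop_size=8, sigma=0.05):
    p = init_params(X.shape[1], 16)
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            b = idx[start:start + batch_size]
            Xb, yb = X[b], y[b]
            # Global search: a small population of random perturbations,
            # scored only on the current mini-batch (the "batch update").
            best, best_loss = p, mse(p, Xb, yb)
            for _ in range(pop_size):
                cand = {k: v + rng.normal(0, sigma, v.shape) for k, v in p.items()}
                loss = mse(cand, Xb, yb)
                if loss < best_loss:
                    best, best_loss = cand, loss
            # Local search: refine the winner with one backpropagation step.
            p = backprop_step({k: v.copy() for k, v in best.items()}, Xb, yb)
    return p

# Toy regression usage: learn y = sin(x) from noisy samples.
X = rng.uniform(-3, 3, (512, 1))
y = np.sin(X) + rng.normal(0, 0.1, X.shape)
params = hybrid_train(X, y)
print("final full-data MSE:", mse(params, X, y))

The point of the batch update is that candidate solutions are evaluated on the current mini-batch only, which is what reduces the per-iteration cost compared with fitness evaluation on the full dataset.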
author2 Huan Chen
author_facet Huan Chen
Bo-Wei Lin
林柏維
author Bo-Wei Lin
林柏維
spellingShingle Bo-Wei Lin
林柏維
Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update
author_sort Bo-Wei Lin
title Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update
title_short Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update
title_full Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update
title_fullStr Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update
title_full_unstemmed Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update
title_sort improvement of deep learning optimizer based on metaheuristic using batch update
publishDate 2019
url http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394029%22.&searchmode=basic
work_keys_str_mv AT boweilin improvementofdeeplearningoptimizerbasedonmetaheuristicusingbatchupdate
AT línbǎiwéi improvementofdeeplearningoptimizerbasedonmetaheuristicusingbatchupdate
AT boweilin yǐpīcìgèngxīngǎiliángjīyúqǐfāshìyǎnsuànfǎdeshēndùxuéxízuìjiāhuàqì
AT línbǎiwéi yǐpīcìgèngxīngǎiliángjīyúqǐfāshìyǎnsuànfǎdeshēndùxuéxízuìjiāhuàqì
_version_ 1719300452912201728