A Survey of Algorithms and Analysis for Stochastic Gradient Methods

Master's === National Chiao Tung University === Master's Program of Mathematical Modeling and Scientific Computing, Department of Applied Mathematics === 106 === In this thesis, we give a survey of variant algorithms for stochastic gradient methods, including the Stochastic Newton method (SN), the Stochastic Variance Reduced Gradient method (SVRG), and Adam. Improvements to stochastic gradient...


Bibliographic Details
Main Authors: Syu, Jian-Ping, 許建平
Other Authors: Lee, Yuh-Jye
Format: Others
Language: en_US
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/a555t4
id ndltd-TW-106NCTU5507020
record_format oai_dc
spelling ndltd-TW-106NCTU5507020 2019-09-26T03:28:10Z http://ndltd.ncl.edu.tw/handle/a555t4 A Survey of Algorithms and Analysis for Stochastic Gradient Methods 隨機梯度法及其相關演算法概觀與分析 Syu, Jian-Ping 許建平 Master's, National Chiao Tung University, Master's Program of Mathematical Modeling and Scientific Computing, Department of Applied Mathematics, 106. In this thesis, we give a survey of variant algorithms for stochastic gradient methods, including the Stochastic Newton method (SN), the Stochastic Variance Reduced Gradient method (SVRG), and Adam. Improvements to stochastic gradient methods fall into two main categories: noise reduction and second-order information. We discuss the advantages and disadvantages of these methods from different aspects. Conventionally, machine learning models are trained with batch approaches because these directly optimize the empirical risk. However, the computational complexity of batch approaches depends on the size of the entire training dataset, so huge training datasets lead to large-scale problems. Therefore, researchers turn to stochastic approaches. Compared with batch approaches, stochastic approaches optimize the model based on random sampling from the training dataset, so their per-iteration computational cost is considerably cheaper. On the other hand, we cannot guarantee that every iteration of a stochastic approach makes progress; the variant algorithms were proposed to address this shortcoming. In our experiments, we apply these stochastic optimization methods to binary classification problems based on the reduced support vector machine (RSVM), whose objective has the desirable properties of strong convexity and smoothness. Lee, Yuh-Jye 李育杰 2018 thesis 37 en_US
collection NDLTD
language en_US
format Others
sources NDLTD
description Master's === National Chiao Tung University === Master's Program of Mathematical Modeling and Scientific Computing, Department of Applied Mathematics === 106 === In this thesis, we give a survey of variant algorithms for stochastic gradient methods, including the Stochastic Newton method (SN), the Stochastic Variance Reduced Gradient method (SVRG), and Adam. Improvements to stochastic gradient methods fall into two main categories: noise reduction and second-order information. We discuss the advantages and disadvantages of these methods from different aspects. Conventionally, machine learning models are trained with batch approaches because these directly optimize the empirical risk. However, the computational complexity of batch approaches depends on the size of the entire training dataset, so huge training datasets lead to large-scale problems. Therefore, researchers turn to stochastic approaches. Compared with batch approaches, stochastic approaches optimize the model based on random sampling from the training dataset, so their per-iteration computational cost is considerably cheaper. On the other hand, we cannot guarantee that every iteration of a stochastic approach makes progress; the variant algorithms were proposed to address this shortcoming. In our experiments, we apply these stochastic optimization methods to binary classification problems based on the reduced support vector machine (RSVM), whose objective has the desirable properties of strong convexity and smoothness.
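To make the noise-reduction idea in the abstract concrete, the following is a minimal Python sketch of the SVRG update (Johnson & Zhang, 2013) for a generic finite-sum objective (1/n) * sum_i f_i(w). The function name svrg, the grad_i callable, and the step-size and epoch settings are illustrative assumptions for this sketch, not the thesis's actual RSVM experiment setup.

    import numpy as np

    def svrg(grad_i, w0, n, step=0.01, epochs=20, inner=None, rng=None):
        # grad_i(w, i): gradient of the i-th component function f_i at w.
        rng = np.random.default_rng() if rng is None else rng
        m = n if inner is None else inner        # inner-loop length
        w_snap = np.asarray(w0, dtype=float).copy()
        for _ in range(epochs):
            # Full gradient at the snapshot: the anchor that reduces variance.
            mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
            w = w_snap.copy()
            for _ in range(m):
                i = rng.integers(n)
                # Unbiased estimator whose variance shrinks as w and w_snap
                # both approach the minimizer, so a constant step size works.
                g = grad_i(w, i) - grad_i(w_snap, i) + mu
                w = w - step * g
            w_snap = w                           # take the last inner iterate
        return w_snap

On a strongly convex and smooth objective such as the RSVM formulation mentioned above, this anchored estimator restores linear convergence with a constant step size, which plain stochastic gradient descent cannot achieve.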
author2 Lee, Yuh-Jye
author_facet Lee, Yuh-Jye
Syu, Jian-Ping
許建平
author Syu, Jian-Ping
許建平
spellingShingle Syu, Jian-Ping
許建平
A Survey of Algorithms and Analysis for Stochastic Gradient Methods
author_sort Syu, Jian-Ping
title A Survey of Algorithms and Analysis for Stochastic Gradient Methods
title_short A Survey of Algorithms and Analysis for Stochastic Gradient Methods
title_full A Survey of Algorithms and Analysis for Stochastic Gradient Methods
title_fullStr A Survey of Algorithms and Analysis for Stochastic Gradient Methods
title_full_unstemmed A Survey of Algorithms and Analysis for Stochastic Gradient Methods
title_sort survey of algorithms and analysis for stochastic gradient methods
publishDate 2018
url http://ndltd.ncl.edu.tw/handle/a555t4
work_keys_str_mv AT syujianping asurveyofalgorithmsandanalysisforstochasticgradientmethods
AT xǔjiànpíng asurveyofalgorithmsandanalysisforstochasticgradientmethods
AT syujianping suíjītīdùfǎjíqíxiāngguānyǎnsuànfǎgàiguānyǔfēnxī
AT xǔjiànpíng suíjītīdùfǎjíqíxiāngguānyǎnsuànfǎgàiguānyǔfēnxī
AT syujianping surveyofalgorithmsandanalysisforstochasticgradientmethods
AT xǔjiànpíng surveyofalgorithmsandanalysisforstochasticgradientmethods
_version_ 1719257721336758272
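The abstract also names Adam among the surveyed methods; as a companion to the SVRG sketch above, here is a minimal Python sketch of the Adam update (Kingma & Ba, 2015). The grad callable and the iteration budget are placeholders; the hyperparameter defaults are the ones commonly cited from the original paper, not settings taken from the thesis.

    import numpy as np

    def adam(grad, w0, step=0.001, b1=0.9, b2=0.999, eps=1e-8, iters=1000):
        # grad(w, t): a stochastic gradient of the objective at w.
        w = np.asarray(w0, dtype=float).copy()
        m = np.zeros_like(w)   # first-moment (mean) estimate
        v = np.zeros_like(w)   # second-moment (uncentered variance) estimate
        for t in range(1, iters + 1):
            g = grad(w, t)
            m = b1 * m + (1.0 - b1) * g
            v = b2 * v + (1.0 - b2) * g * g
            m_hat = m / (1.0 - b1 ** t)   # bias-correct the moment estimates
            v_hat = v / (1.0 - b2 ** t)
            w = w - step * m_hat / (np.sqrt(v_hat) + eps)
        return w

Unlike SVRG, Adam does not reduce gradient noise; it rescales each coordinate by a running estimate of the gradient's second moment, which is the adaptive, curvature-like information the abstract groups under the second-order category.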