An Improvement of Stochastic Gradient Descent Approach for Mean-Variance Portfolio Optimization Problem
In this paper, a current variant of the stochastic gradient descent (SGD) approach, namely the adaptive moment estimation (Adam) algorithm, is improved by adding the standard error to the updating rule. The aim is to accelerate the convergence of the Adam algorithm. This improvement is...
Main Authors: Stephanie S. W. Su, Sie Long Kek
Format: Article
Language: English
Published: Hindawi Limited, 2021-01-01
Series: Journal of Mathematics
Online Access: http://dx.doi.org/10.1155/2021/8892636
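The abstract only sketches the modification, so the following is a minimal, hypothetical illustration in Python: standard Adam applied to an unconstrained mean-variance objective, with a standard-error-like term added to the denominator of the updating rule. The objective, the form and placement of the standard-error term, and the parameter `se_weight` are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

def mean_variance_grad(w, mu, Sigma, risk_aversion=1.0):
    # Gradient of the unconstrained objective f(w) = -mu^T w + (risk_aversion/2) w^T Sigma w.
    return -mu + risk_aversion * (Sigma @ w)

def adam_se(mu, Sigma, steps=2000, lr=0.01, beta1=0.9, beta2=0.999,
            eps=1e-8, se_weight=1.0):
    n = len(mu)
    w = np.full(n, 1.0 / n)   # start from equal weights (no constraint handling here)
    m = np.zeros(n)           # first-moment estimate of the gradient
    v = np.zeros(n)           # second-moment estimate of the gradient
    for t in range(1, steps + 1):
        g = mean_variance_grad(w, mu, Sigma)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
        # Hypothetical standard-error term: spread of the gradient estimate,
        # shrinking with the iteration count; the paper's exact rule may differ.
        se = np.sqrt(np.maximum(v_hat - m_hat ** 2, 0.0)) / np.sqrt(t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps + se_weight * se)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mu = rng.uniform(0.02, 0.10, size=5)        # assumed expected returns
    A = rng.normal(size=(5, 5))
    Sigma = A @ A.T / 5 + 0.01 * np.eye(5)      # positive-definite covariance
    print("weights:", np.round(adam_se(mu, Sigma), 4))
```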
Similar Items
- Optimal Portfolio Selection of Mean-Variance Utility with Stochastic Interest Rate
  by: Shuang Li, et al.
  Published: (2020-01-01)
- International Diversification Versus Domestic Diversification: Mean-Variance Portfolio Optimization and Stochastic Dominance Approaches
  by: Fathi Abid, et al.
  Published: (2014-05-01)
- Stochastic gradient descent for hybrid quantum-classical optimization
  by: Ryan Sweke, et al.
  Published: (2020-08-01)
- Stochastic Gradient Descent in Machine Learning
  by: L. Thunberg, Christian, et al.
  Published: (2019)
- Semi-Stochastic Gradient Descent Methods
  by: Jakub Konečný, et al.
  Published: (2017-05-01)