An Effective Optimization Method for Machine Learning Based on ADAM
A machine is taught by finding the minimum value of the cost function induced by the learning data. Unfortunately, as the amount of learning increases, the non-linear activation function in the artificial neural network (ANN), the complexity of the artificial intelligence structures, and the cos...
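The abstract describes training as minimizing a cost function, with ADAM as the optimization method. As a minimal sketch of the standard ADAM update (exponential moving averages of the gradient and its square, with bias correction), applied here to an illustrative quadratic cost that is not from the paper:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """One ADAM update on parameters theta given the current gradient."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Illustrative cost f(x) = (x - 3)^2, gradient 2(x - 3); minimum at x = 3.
theta = np.array([0.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 3001):
    grad = 2.0 * (theta - 3.0)
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)  # close to 3.0
```

The hyperparameters (`lr`, `beta1`, `beta2`, `eps`) follow the commonly used defaults; the paper's proposed modification of ADAM is not reproduced here.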
Main Authors: Dokkyun Yi, Jaehyun Ahn, Sangmin Ji
Format: Article
Language: English
Published: MDPI AG, 2020-02-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/10/3/1073
Similar Items
- An Adaptive Optimization Method Based on Learning Rate Schedule for Neural Networks
  by: Dokkyun Yi, et al. Published: (2021-01-01)
- An Enhanced Optimization Scheme Based on Gradient Descent Methods for Machine Learning
  by: Dokkyun Yi, et al. Published: (2019-07-01)
- A Novel Learning Rate Schedule in Optimization for Neural Networks and It’s Convergence
  by: Jieun Park, et al. Published: (2020-04-01)
- Adam and the Ants: On the Influence of the Optimization Algorithm on the Detectability of DNN Watermarks
  by: Betty Cortiñas-Lorenzo, et al. Published: (2020-12-01)
- HyAdamC: A New Adam-Based Hybrid Optimization Algorithm for Convolution Neural Networks
  by: Kyung-Soo Kim, et al. Published: (2021-06-01)