The Cramming, Softening and Integrating Learning Algorithm with ReLU activation function for Binary Input/Output Problems
Master's thesis === National Chengchi University === Department of Management Information Systems === Academic year 107 (2018) === Few Artificial Neural Network studies simultaneously address the challenges of (1) systematically adjusting the number of hidden-layer nodes used during the learning process, (2) adopting the ReLU activation function instead of the tanh function for fast learning, and...
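As context for the abstract above, the following is a minimal sketch contrasting the ReLU and tanh activations it mentions; it illustrates only the activation functions themselves, not the cramming, softening and integrating procedure, and the helper names are chosen here purely for illustration:

```python
import numpy as np

def relu(z):
    # ReLU: max(0, z); piecewise linear, so the gradient does not saturate for z > 0
    return np.maximum(0.0, z)

def relu_grad(z):
    # Derivative of ReLU: 1 for z > 0, 0 otherwise (subgradient 0 used at z = 0)
    return (z > 0).astype(float)

def tanh_grad(z):
    # Derivative of tanh: 1 - tanh(z)^2; shrinks toward 0 for large |z| (saturation)
    return 1.0 - np.tanh(z) ** 2

z = np.linspace(-4.0, 4.0, 9)
print(relu(z))        # zero for negative inputs, identity for positive inputs
print(relu_grad(z))   # constant gradient of 1 on the positive side
print(tanh_grad(z))   # gradients decay away from the origin
```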
| Main Authors: | Tsai, Yu-Han, 蔡羽涵 |
|---|---|
| Other Authors: | Tsaih, Rua-Huan |
| Format: | Others |
| Language: | zh-TW |
| Published: | 2019 |
| Online Access: | http://ndltd.ncl.edu.tw/handle/ymzgt7 |
Similar Items

- The Cramming, Softening and Integrating Learning Algorithm with ReLU Activation Function for Real Number Input / Output Problems
  by: Liang, Wei-Ting, et al.
  Published: (2019)
- Studying Perturbations on the Input of Two-Layer Neural Networks with ReLU Activation
  by: Alsubaihi, Salman
  Published: (2019)
- Learning algorithm analysis for deep neural network with ReLu activation functions
  by: Płaczek Stanisław, et al.
  Published: (2018-01-01)
- Gaussian Perturbations in ReLU Networks and the Arrangement of Activation Regions
  by: Daróczy, B.
  Published: (2022)
- Towards Fast computation of certified robustness for ReLU networks
  by: Weng, Tsui-Wei, et al.
  Published: (2021)