The Cramming, Softening and Integrating Learning Algorithm with ReLU activation function for Binary Input/Output Problems

Master's thesis === National Chengchi University === Department of Management Information Systems === Academic year 107 (2018) === Few Artificial Neural Network studies simultaneously address the challenges of (1) systematically adjusting the number of hidden layer nodes used during the learning process, (2) adopting the ReLU activation function instead of the tanh function for fast learning, and...
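The abstract refers to growing the hidden layer during learning and to using ReLU for faster training. As a rough, hypothetical illustration only, and not the thesis's actual Cramming/Softening/Integrating procedure, the sketch below trains a single-hidden-layer ReLU network on a binary input/output task and adds one hidden node whenever residual error remains; the node-addition rule, learning rate, and stopping tolerance are assumptions made here for demonstration.

# Illustrative sketch, NOT the thesis's CSI algorithm: a ReLU network that
# grows its hidden layer when training error remains on a binary I/O task.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class GrowingReluNet:
    def __init__(self, n_in, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.rng, self.lr = rng, lr
        self.W = rng.normal(scale=0.5, size=(1, n_in))  # hidden-layer weights (start with 1 node)
        self.b = np.zeros(1)                            # hidden-layer biases
        self.v = rng.normal(scale=0.5, size=1)          # output weights
        self.c = 0.0                                    # output bias

    def forward(self, X):
        h = relu(X @ self.W.T + self.b)                 # hidden activations
        return h, h @ self.v + self.c                   # hidden activations, network output

    def fit(self, X, y, epochs=500, tol=0.25, max_hidden=20):
        for _ in range(max_hidden):
            for _ in range(epochs):                     # plain gradient descent on squared error
                h, out = self.forward(X)
                err = out - y
                dh = np.outer(err, self.v) * (h > 0)    # backprop through ReLU
                self.v -= self.lr * (h.T @ err) / len(y)
                self.c -= self.lr * err.mean()
                self.W -= self.lr * (dh.T @ X) / len(y)
                self.b -= self.lr * dh.mean(axis=0)
            _, out = self.forward(X)
            if np.max(np.abs(out - y)) < tol:           # every training case fitted: stop growing
                break
            # Simplified growth step (assumption): add one randomly seeded hidden node
            # with zero output weight, so the current outputs are unchanged.
            self.W = np.vstack([self.W, self.rng.normal(scale=0.5, size=(1, X.shape[1]))])
            self.b = np.append(self.b, 0.0)
            self.v = np.append(self.v, 0.0)
        return self

# Usage on a small binary input/output problem (XOR):
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
net = GrowingReluNet(n_in=2).fit(X, y)
print(np.round(net.forward(X)[1], 2))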


Bibliographic Details
Main Author: Tsai, Yu-Han (蔡羽涵)
Other Authors: Tsaih, Rua-Huan
Format: Others
Language: zh-TW
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/ymzgt7