Implementations of High Computation-Efficiency Recurrent Neural Networks


Bibliographic Details
Main Authors: CHENG, GUAN-YING, 程冠穎
Other Authors: Ko, Hsien-Ju
Format: Others
Language: zh-TW
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/e55464
Description
Summary: Master's === Asia University === In-service Master's Program, Department of Photonics and Communication Engineering === Academic year 106 === In this thesis, a high-computational-efficiency recurrent neural network (RNN) with state-space realizations is proposed. The proposed RNN uses local feedback, and we consider its implementation on fixed-point digital devices. The high-computational-efficiency state-space realization is synthesized by minimizing a pole sensitivity measure. In contrast to conventional optimal structures, which require (n+1)² multiplications per sample time, the proposed structure requires only 4n+1, where n is the order of the state-space system. Trained with the back-propagation learning algorithm, the proposed structure achieves performance similar to that of the conventional optimal structures, but with a significantly lower computational burden. Finally, numerical examples illustrate the effectiveness of the proposed approach.
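The multiplication counts quoted in the abstract can be checked with simple bookkeeping. The sketch below is not the thesis's actual realization; it assumes a dense state-space update x[k+1] = A·x[k] + b·u[k], y[k] = c·x[k] + d·u[k] for the conventional case, and, as an illustrative assumption, a sparse realization whose A matrix keeps only 2n nonzero entries (for example, block-diagonal with n/2 real 2×2 blocks, a common low-pole-sensitivity form), which yields the 4n+1 figure:

```python
# Per-sample multiplication counts for an n-th order SISO state-space filter.
# Dense (conventional optimal) structure:
#   A x : n*n, b u : n, c x : n, d u : 1  ->  (n+1)^2 total.
# Hypothetical sparse structure with 2n nonzeros in A (assumption, e.g.
# block-diagonal 2x2 blocks): A x : 2n, b u : n, c x : n, d u : 1 -> 4n+1.

def dense_multiplications(n: int) -> int:
    return n * n + n + n + 1  # equals (n + 1) ** 2

def sparse_multiplications(n: int) -> int:
    return 2 * n + n + n + 1  # equals 4 * n + 1

for n in (2, 4, 8, 16):
    d, s = dense_multiplications(n), sparse_multiplications(n)
    assert d == (n + 1) ** 2 and s == 4 * n + 1
    print(f"n={n}: dense={d}, sparse={s}")
```

For n = 8, this gives 81 versus 33 multiplications per sample, showing why the saving grows quickly with filter order.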