Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network
Master's thesis === National Changhua University of Education === Department of Electronic Engineering === 108 === In today's era of rapid progress in artificial intelligence, neural networks play a central role, but they require an enormous amount of computation. To accelerate these computations, neuromorphic resistive memory (RRAM) has been proposed...
Main Authors: | Chang, Yu-Teng (張宇騰) |
---|---|
Other Authors: | Tsung-Chu Huang (黃宗柱) |
Format: | Others |
Language: | zh-TW |
Published: | 2019 |
Online Access: | http://ndltd.ncl.edu.tw/handle/2vg947 |
id |
ndltd-TW-108NCUE5428001 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-108NCUE54280012019-11-06T03:33:28Z http://ndltd.ncl.edu.tw/handle/2vg947 Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network 神經網路中神經形態電阻式記憶體具預補償之類比伯格碼 Chang, Yu-Teng 張宇騰 Master's thesis === National Changhua University of Education === Department of Electronic Engineering === 108 Tsung-Chu Huang 黃宗柱 2019 學位論文 ; thesis 43 zh-TW |
collection |
NDLTD |
language |
zh-TW |
format |
Others |
sources |
NDLTD |
description |
Master's thesis === National Changhua University of Education === Department of Electronic Engineering === 108 === In today's era of rapid progress in artificial intelligence, neural networks play a central role, but they require an enormous amount of computation. To accelerate these computations, neuromorphic resistive memory (RRAM) has been proposed: the multiply-and-accumulate operations that would otherwise run on a graphics processing unit (GPU) are instead carried out directly in memory. However, resistive memory suffers from errors introduced during read, write, and transfer operations, as well as from process defects. This thesis therefore proposes error detection and precompensation for neuromorphic RRAM by combining an analog Berger code, the neural network itself, and an error-rebound algorithm, so that errors are bounced back and the faulty RRAM cells are corrected.
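As a rough illustration of the in-memory multiply-and-accumulate idea (a minimal sketch, not the thesis's implementation; the array size and conductance values are made up), each column current of an RRAM crossbar is the dot product of the input voltages with that column's conductances:

```python
import numpy as np

# Minimal sketch of in-memory multiply-and-accumulate on an RRAM crossbar
# (array size and conductance values are illustrative, not from the thesis).
rng = np.random.default_rng(0)

G = rng.uniform(0.1, 1.0, size=(4, 3))   # conductances: 4 word lines x 3 bit lines (the weights)
v = rng.uniform(0.0, 1.0, size=4)        # input voltages applied to the word lines

# Summing the per-cell currents along each bit line (Kirchhoff's current law)
# yields every weighted sum in a single read step.
i_out = v @ G
print(i_out)                              # one analog dot product per output column
```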
Analog codes are known in the literature to provide fault tolerance. This thesis converts the Berger code, which is normally used for error detection in systems with asymmetric errors, into an analog code, and further extends the analog Berger code into a two-dimensional architecture that can locate a single error. An error detected by the analog Berger code is first handled by the proposed error-rebound algorithm: the checksum read from the analog code is compared with the sum recomputed from the erroneous codeword, and the average discrepancy is subtracted back from each cell, bouncing the error back. The neural network is then reused: during forward propagation, the sum of the inputs multiplied by the weights serves as a read checksum for the average-gradient calculation, and during backward propagation the weights are initialized or updated accordingly, so that the network gains a self-healing ability that shortens training time and improves accuracy. The codewords are corrected first, and after learning and training with the proposed neural network the results are written back into the RRAM as corrections. The overall flow proposed in this thesis reduces the training time of the neural network and improves its learning accuracy.
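A minimal sketch of how the analog sum check and the described error-rebound step might be interpreted (the uniform redistribution of the checksum discrepancy and the function names are assumptions drawn from the abstract, not the thesis's actual code):

```python
import numpy as np

def analog_berger_check(data):
    # Analog check symbol: the sum of the data cells (a sum-based
    # generalization of the Berger code, as read from the abstract).
    return float(np.sum(data))

def error_rebound(read_block, stored_check):
    # Spread the checksum discrepancy evenly back over all cells,
    # which is how the abstract describes the error-rebound step.
    discrepancy = np.sum(read_block) - stored_check
    return read_block - discrepancy / read_block.size

# Hypothetical 4-cell block in which one cell drifts by +0.4.
original = np.array([0.2, 0.5, 0.3, 0.8])
check = analog_berger_check(original)
corrupted = original.copy()
corrupted[2] += 0.4

recovered = error_rebound(corrupted, check)
print(np.round(recovered, 3))   # the +0.4 drift is bounced back as -0.1 per cell
```

The residual spread left in each cell is what the network's own training would then absorb; in the two-dimensional extension, a row check and a column check that both flag a discrepancy would together point at the single faulty cell.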
The method is evaluated through block-error-rate simulations. All errors are divided into local errors and regional errors: local errors model defects in individual cells, while regional errors model the noise that the analog values experience in the channel. Local and regional errors are simulated both individually and in combination. According to the block-error-rate results, in an AWGN environment with mean 0 the mean time to failure (MTTF) is improved by roughly 100 times over the uncoded case, and with asymmetric AWGN of mean 0.125 the improvement exceeds roughly 150 times. When local errors occur, the MTTF is still improved by about 10 times. These results show that the proposed method can effectively compensate for the large error rates caused by degradation and can achieve self-repair through the neural network's own healing capability.
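The block-error-rate comparison can be sketched with a small Monte Carlo experiment (the block size, failure threshold, and noise deviation below are illustrative guesses, and only the regional AWGN case is shown; the thesis's simulation setup may differ):

```python
import numpy as np

rng = np.random.default_rng(1)
TRIALS, BLOCK, THRESH = 20_000, 16, 0.25   # trial count, cells per block, failure threshold (assumed)

def block_error_rate(noise_mean, rebound):
    failures = 0
    for _ in range(TRIALS):
        data = rng.uniform(0.0, 1.0, BLOCK)
        noisy = data + rng.normal(noise_mean, 0.1, BLOCK)   # regional error: (possibly asymmetric) AWGN
        if rebound:
            # error-rebound correction using the stored analog sum check
            noisy -= (noisy.sum() - data.sum()) / BLOCK
        if np.abs(noisy - data).max() > THRESH:             # block fails if any cell drifts too far
            failures += 1
    return failures / TRIALS

for mean in (0.0, 0.125):                                   # symmetric and asymmetric noise
    raw, cor = block_error_rate(mean, False), block_error_rate(mean, True)
    print(f"mean={mean}: uncoded BLER={raw:.4f}, precompensated BLER={cor:.4f}")
    # MTTF comparisons in the thesis scale roughly with the inverse of these rates
```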
|
author2 |
Tsung-Chu Huang |
author_facet |
Tsung-Chu Huang Chang,Yu-Teng 張宇騰 |
author |
Chang,Yu-Teng 張宇騰 |
spellingShingle |
Chang,Yu-Teng 張宇騰 Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network |
author_sort |
Chang,Yu-Teng |
title |
Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network |
title_short |
Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network |
title_full |
Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network |
title_fullStr |
Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network |
title_full_unstemmed |
Precompensated Analog Berger Codes for Neuromorphic RRAM in Neural Network |
title_sort |
precompensated analog berger codes for neuromorphic rram in neural network |
publishDate |
2019 |
url |
http://ndltd.ncl.edu.tw/handle/2vg947 |
work_keys_str_mv |
AT changyuteng precompensatedanalogbergercodesforneuromorphicrraminneuralnetwork AT zhāngyǔténg precompensatedanalogbergercodesforneuromorphicrraminneuralnetwork AT changyuteng shénjīngwǎnglùzhōngshénjīngxíngtàidiànzǔshìjìyìtǐjùyùbǔchángzhīlèibǐbógémǎ AT zhāngyǔténg shénjīngwǎnglùzhōngshénjīngxíngtàidiànzǔshìjìyìtǐjùyùbǔchángzhīlèibǐbógémǎ |
_version_ |
1719287498058760192 |