Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network
Master's === National Chung Cheng University === Graduate Institute of Electrical Engineering === 107 === Traditional circuit design must guarantee correct operation under all conditions, which leads to excessive margins because of worst-case design. Timing sensors and adaptive voltage scaling (AVS) are effective in reducing the supply voltage and this excess margin. This thesis applies a pre-error AVS control loop to a deep neural network implemented on a 28nm FPGA, using the delay distribution of 10,000 handwritten MNIST images, a 10 mV voltage step, and a Markov chain to estimate the AVS results. A deep neural network is fault-tolerant to computation errors: even when the data contains small errors, the system still operates correctly, providing more margin than a general system. However, because of the characteristics of the MNIST patterns, adaptive voltage scaling needs a very long time to adjust the voltage, so we propose a new improvement strategy that reduces the response time by a factor of 16.857 and supports aggressive voltage scaling.
Main Authors: | Wang,Ren-Sian, 王人賢 |
---|---|
Other Authors: | Wang,Jinn-Shyan 王進賢, Lin,Tay-Jyi 林泰吉 |
Format: | Others |
Language: | zh-TW |
Published: | 2019 |
Online Access: | http://ndltd.ncl.edu.tw/handle/qh6234 |
id |
ndltd-TW-107CCU00442018 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-107CCU004420182019-11-01T05:28:07Z http://ndltd.ncl.edu.tw/handle/qh6234 Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network 改善自適性電壓調控控制迴路用於高能效深層神經網路 Wang,Ren-Sian 王人賢 碩士 國立中正大學 電機工程研究所 107 Traditional circuit design must guarantee correct operation under all conditions, which leads to excessive margins because of worst-case design. Timing sensors and adaptive voltage scaling (AVS) are effective in reducing the supply voltage and this excess margin. This thesis applies a pre-error AVS control loop to a deep neural network implemented on a 28nm FPGA, using the delay distribution of 10,000 handwritten MNIST images, a 10 mV voltage step, and a Markov chain to estimate the AVS results. A deep neural network is fault-tolerant to computation errors: even when the data contains small errors, the system still operates correctly, providing more margin than a general system. However, because of the characteristics of the MNIST patterns, adaptive voltage scaling needs a very long time to adjust the voltage, so we propose a new improvement strategy that reduces the response time by a factor of 16.857 and supports aggressive voltage scaling. Wang,Jinn-Shyan Lin,Tay-Jyi 王進賢 林泰吉 2019 學位論文 ; thesis 36 zh-TW |
collection |
NDLTD |
language |
zh-TW |
format |
Others |
sources |
NDLTD |
description |
Master's === National Chung Cheng University === Graduate Institute of Electrical Engineering === 107 === Traditional circuit design must guarantee correct operation under all conditions, which leads to excessive margins because of worst-case design. Timing sensors and adaptive voltage scaling (AVS) are effective in reducing the supply voltage and this excess margin. This thesis applies a pre-error AVS control loop to a deep neural network implemented on a 28nm FPGA, using the delay distribution of 10,000 handwritten MNIST images, a 10 mV voltage step, and a Markov chain to estimate the AVS results. A deep neural network is fault-tolerant to computation errors: even when the data contains small errors, the system still operates correctly, providing more margin than a general system. However, because of the characteristics of the MNIST patterns, adaptive voltage scaling needs a very long time to adjust the voltage, so we propose a new improvement strategy that reduces the response time by a factor of 16.857 and supports aggressive voltage scaling. |
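The description above outlines a pre-error AVS control loop that steps the supply voltage in 10 mV increments based on the per-image delay distribution of a DNN workload. The sketch below is a minimal Python simulation of that idea, not the thesis's FPGA design: the delay model, the guard-band threshold, the settle count, and the Gaussian stand-in for the MNIST delay distribution are all illustrative assumptions.

```python
# Hedged conceptual sketch of a pre-error adaptive-voltage-scaling (AVS)
# control loop driven by a per-sample delay distribution, stepping the
# supply in 10 mV increments.  All constants and the delay model below are
# assumptions for illustration, not values from the thesis.
import random

V_NOM = 0.90          # nominal supply voltage (V) -- assumed
V_MIN = 0.60          # lowest allowed supply (V)  -- assumed
STEP = 0.010          # 10 mV per AVS step (from the abstract)
T_CLK = 1.00          # normalized clock period
GUARD = 0.95          # pre-error threshold: warn when delay > 95% of T_CLK

def path_delay(v, workload):
    """Toy delay model: delay grows as the supply drops, scaled by a
    per-sample workload factor standing in for the MNIST delay spread."""
    return workload * (V_NOM / v) ** 2 * 0.80

def avs_loop(samples, settle=8):
    """Lower the voltage one step after `settle` consecutive warning-free
    samples; raise it immediately on a pre-error warning or a violation."""
    v, quiet, trace = V_NOM, 0, []
    for w in samples:
        d = path_delay(v, w)
        if d > T_CLK:                 # real timing violation (late error)
            v = min(V_NOM, v + STEP)  # back off; the DNN tolerates the bad sample
            quiet = 0
        elif d > GUARD * T_CLK:       # pre-error warning, timing still met
            v = min(V_NOM, v + STEP)
            quiet = 0
        else:
            quiet += 1
            if quiet >= settle and v - STEP >= V_MIN:
                v -= STEP             # scale aggressively toward V_MIN
                quiet = 0
        trace.append(v)
    return trace

if __name__ == "__main__":
    random.seed(0)
    # stand-in for the delay spread of 10,000 MNIST inferences
    samples = [random.gauss(1.0, 0.05) for _ in range(10_000)]
    trace = avs_loop(samples)
    print(f"final supply: {trace[-1]:.3f} V, minimum reached: {min(trace):.3f} V")
```

Run as written, the supply steps down from the nominal 0.9 V until pre-error warnings begin to fire and then hovers a few 10 mV steps above the failing point, which is the qualitative behavior an AVS loop of this kind is meant to produce.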
author2 |
Wang,Jinn-Shyan |
author_facet |
Wang,Jinn-Shyan Wang,Ren-Sian 王人賢 |
author |
Wang,Ren-Sian 王人賢 |
spellingShingle |
Wang,Ren-Sian 王人賢 Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network |
author_sort |
Wang,Ren-Sian |
title |
Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network |
title_short |
Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network |
title_full |
Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network |
title_fullStr |
Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network |
title_full_unstemmed |
Improved Adaptive voltage scaling control loop for energy-efficient Deep Neural Network |
title_sort |
improved adaptive voltage scaling control loop for energy-efficient deep neural network |
publishDate |
2019 |
url |
http://ndltd.ncl.edu.tw/handle/qh6234 |
work_keys_str_mv |
AT wangrensian improvedadaptivevoltagescalingcontrolloopforenergyefficientdeepneuralnetwork AT wángrénxián improvedadaptivevoltagescalingcontrolloopforenergyefficientdeepneuralnetwork AT wangrensian gǎishànzìshìxìngdiànyādiàokòngkòngzhìhuílùyòngyúgāonéngxiàoshēncéngshénjīngwǎnglù AT wángrénxián gǎishànzìshìxìngdiànyādiàokòngkòngzhìhuílùyòngyúgāonéngxiàoshēncéngshénjīngwǎnglù |
_version_ |
1719285107775242240 |