BitFlow-Net: Toward Fully Binarized Convolutional Neural Networks
Binarization can greatly compress and accelerate deep convolutional neural networks (CNNs) for real-time industrial applications. However, existing binarized CNNs (BCNNs) rely on a scaling factor (SF) and batch normalization (BatchNorm), both of which still involve resource-consuming floating-point multiplication. To address this limitation, an improved BCNN named BitFlow-Net is proposed, which replaces the floating-point operations in the middle layers with integer addition. First, it is derived that the SF is effective only in the back-propagation process, whereas it is counteracted by BatchNorm during inference. Then, for the inference phase, the SF and BatchNorm are fused into a single integer addition, named BatchShift. Consequently, the data flow in the middle layers is fully binarized at inference time. To verify its potential in industrial applications with multiclass and binary classification tasks, BitFlow-Net is built on AlexNet and evaluated on two large image datasets, ImageNet and 11K Hands. Experimental results show that BitFlow-Net removes all floating-point operations from the middle layers of BCNNs and greatly reduces memory usage in both cases without affecting accuracy. In particular, BitFlow-Net achieves accuracy comparable to that of the full-precision AlexNet in the binary classification task.
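The fusion the abstract describes can be made concrete with a short sketch. The following is a minimal illustration, not the authors' implementation: it assumes a positive per-channel scaling factor alpha, standard BatchNorm parameters (gamma, beta) with running statistics (mu, var), and integer outputs from the preceding binary (XNOR/popcount) convolution; the function names `fuse_batchshift` and `batchshift_forward` are hypothetical. The key observation is that sign() is invariant under multiplication by any positive constant, so every floating-point multiplier cancels out of the binarized path, leaving only a per-channel integer addition (the abstract's BatchShift) plus a possible sign flip when gamma is negative.

```python
import numpy as np

def fuse_batchshift(alpha, gamma, beta, mu, var, eps=1e-5):
    """Fuse the scaling factor (alpha) and BatchNorm (gamma, beta, mu, var)
    into one per-channel integer shift, in the spirit of the abstract's
    BatchShift. Hypothetical helper; names and details are assumptions.

    Inference in a binarized block computes
        y = sign(gamma * (alpha * x - mu) / sigma + beta),  sigma = sqrt(var + eps).
    With alpha > 0 and sigma > 0, dividing them out leaves the sign unchanged:
        y = sign(gamma) * sign(x + (sigma * beta / gamma - mu) / alpha),
    so only an integer addition (plus a sign flip if gamma < 0) survives.
    """
    sigma = np.sqrt(var + eps)
    shift = np.round((sigma * beta / gamma - mu) / alpha).astype(np.int32)
    flip = np.where(gamma < 0, -1, 1).astype(np.int32)  # flip sign if gamma < 0
    return shift, flip

def batchshift_forward(x_int, shift, flip):
    """Binarized inference for one layer: integer add, then sign.
    x_int is the integer output of the binary (XNOR/popcount) convolution."""
    return np.where(flip * (x_int + shift) >= 0, 1, -1).astype(np.int8)

# Toy usage on a single channel with four activations:
x = np.array([3, -7, 0, 12], dtype=np.int32)          # popcount-style outputs
shift, flip = fuse_batchshift(alpha=0.05, gamma=1.2, beta=0.4,
                              mu=0.1, var=0.09)
print(batchshift_forward(x, shift, flip))             # -> [ 1 -1  1  1]
```

With `shift` and `flip` precomputed offline from the trained parameters, the layer's inference path touches no floating-point arithmetic at all, which is how the data flow in the middle layers stays fully binarized.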
Main Authors: | Lijun Wu, Peiqing Jiang, Zhicong Chen, Xu Lin, Yunfeng Lai, Peijie Lin, Shuying Cheng |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2019-01-01 |
Series: | IEEE Access |
Subjects: | Binarized convolutional neural networks; model acceleration and compression; BatchShift |
Online Access: | https://ieeexplore.ieee.org/document/8856200/ |
id |
doaj-407a872fe8ac4d568548ff409f87e1cf |
---|---|
spelling |
Lijun Wu, Peiqing Jiang, Zhicong Chen (ORCID: 0000-0002-3471-6395), Xu Lin, Yunfeng Lai, Peijie Lin, and Shuying Cheng (all: College of Physics and Information Engineering, Fuzhou University, Fuzhou, China), "BitFlow-Net: Toward Fully Binarized Convolutional Neural Networks," IEEE Access, vol. 7, pp. 154617-154626, 2019-01-01. ISSN 2169-3536. DOI: 10.1109/ACCESS.2019.2945488. Online: https://ieeexplore.ieee.org/document/8856200/ |