BitFlow-Net: Toward Fully Binarized Convolutional Neural Networks


Bibliographic Details
Main Authors: Lijun Wu, Peiqing Jiang, Zhicong Chen, Xu Lin, Yunfeng Lai, Peijie Lin, Shuying Cheng
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8856200/
Description
Summary: Binarization can greatly compress and accelerate deep convolutional neural networks (CNNs) for real-time industrial applications. However, existing binarized CNNs (BCNNs) rely on a scaling factor (SF) and batch normalization (BatchNorm), which still involve resource-consuming floating-point multiplication operations. To address this limitation, an improved BCNN named BitFlow-Net is proposed, which replaces floating-point operations with integer addition in the middle layers. First, it is derived that the SF is effective only in the back-propagation process, whereas it is counteracted by BatchNorm in the inference process. Then, in the model running phase, the SF and BatchNorm are fused into a single integer addition, named BatchShift. Consequently, the data flow in the middle layers is fully binarized during the model running phase. To verify its potential in industrial applications with multiclass and binary classification tasks, BitFlow-Net is built on AlexNet and evaluated on two large image datasets, i.e., ImageNet and 11K Hands. Experimental results show that BitFlow-Net removes all floating-point operations in the middle layers of BCNNs and greatly reduces memory in both cases without affecting accuracy. In particular, BitFlow-Net achieves accuracy comparable to that of the full-precision AlexNet in the binary classification task.
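
As a rough illustration of the BatchShift idea described in the summary, the sketch below (an assumption for illustration, not the authors' reference implementation; the names fuse_batchshift, batchshift_sign, alpha, gamma, beta, mu, and var are hypothetical) folds the scaling factor and the BatchNorm parameters into a single integer offset, so that inference on the integer outputs of a binary convolution needs only an addition followed by a sign comparison.

import numpy as np

def fuse_batchshift(alpha, gamma, beta, mu, var, eps=1e-5):
    # Fold the SF (alpha) and BatchNorm parameters (gamma, beta, mu, var)
    # into one integer offset b so that, for integer conv output x and
    # alpha, gamma > 0:
    #   sign(gamma * (alpha * x - mu) / sqrt(var + eps) + beta) == sign(x + b)
    std = np.sqrt(var + eps)
    threshold = mu / alpha - beta * std / (gamma * alpha)
    # Rounding b to an integer is an approximation made for this sketch.
    return int(-np.round(threshold))

def batchshift_sign(x_int, b):
    # x_int: integer outputs of a binary convolution (e.g. popcount results);
    # inference reduces to an integer addition plus a sign comparison.
    return np.where(x_int + b >= 0, 1, -1).astype(np.int8)

# Example with made-up per-channel parameters and a tiny integer feature map.
b = fuse_batchshift(alpha=0.5, gamma=1.2, beta=0.3, mu=2.0, var=2.5)
x_int = np.array([[3, -7], [12, 0]], dtype=np.int32)
print(batchshift_sign(x_int, b))  # values at or above the fused threshold map to +1, the rest to -1

The rounding step above only keeps this toy example integer-valued; consult the paper for the exact BatchShift derivation and how it interacts with the sign of the fused parameters.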
ISSN: 2169-3536