A Review of Binarized Neural Networks

In this work, we review Binarized Neural Networks (BNNs). BNNs are deep neural networks that use binary values for activations and weights instead of full-precision values. With binary values, BNNs can execute computations using bitwise operations, which reduces execution time. Model sizes of BNNs are much smaller than those of their full-precision counterparts. While the accuracy of a BNN is generally lower than that of a full-precision model, BNNs have been closing the accuracy gap and are becoming more accurate on larger datasets such as ImageNet. BNNs are also good candidates for deep learning implementations on FPGAs and ASICs due to their bitwise efficiency. We give a tutorial on the general BNN methodology and review various contributions, implementations, and applications of BNNs.
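
As a minimal illustration of the bitwise arithmetic the abstract refers to (a sketch, not taken from the paper itself): when weights and activations are constrained to {-1, +1} and encoded as bits, a dot product reduces to an XNOR followed by a popcount. The function names and bit encoding below are illustrative assumptions, written in Python/NumPy.

import numpy as np

def binarize(x):
    # Deterministic binarization: sign of x, mapping zero to +1.
    return np.where(x >= 0, 1, -1).astype(np.int8)

def pack_bits(v):
    # Encode +1 as bit 1 and -1 as bit 0, packed into a single Python int.
    bits = 0
    for i, val in enumerate(v):
        if val > 0:
            bits |= 1 << i
    return bits

def xnor_popcount_dot(a_bits, b_bits, n):
    # For vectors over {-1, +1}: matching bits (XNOR = 1) contribute +1,
    # mismatches contribute -1, so the dot product is 2 * popcount - n.
    mask = (1 << n) - 1
    xnor = ~(a_bits ^ b_bits) & mask
    return 2 * bin(xnor).count("1") - n

# The bitwise result matches an ordinary dot product of the binarized vectors.
rng = np.random.default_rng(0)
w, x = rng.standard_normal(16), rng.standard_normal(16)
wb, xb = binarize(w), binarize(x)
assert xnor_popcount_dot(pack_bits(wb), pack_bits(xb), 16) == int(np.dot(wb, xb))

In hardware, the packed words map directly onto wide XNOR gates and popcount units, which is one reason FPGAs and ASICs suit BNN inference.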


Bibliographic Details
Main Authors: Taylor Simons, Dah-Jye Lee
Affiliation: Electrical and Computer Engineering, Brigham Young University, Provo, UT 84602, USA
Format: Article
Language: English
Published: MDPI AG, 2019-06-01
Series: Electronics
ISSN: 2079-9292
DOI: 10.3390/electronics8060661
Subjects: Binarized Neural Networks; Deep Neural Networks; deep learning; FPGA; digital design; deep neural network compression
Online Access: https://www.mdpi.com/2079-9292/8/6/661