MB-CNN: Memristive Binary Convolutional Neural Networks for Embedded Mobile Devices

Applications of neural networks have gained significant importance in embedded mobile devices and Internet of Things (IoT) nodes. In particular, convolutional neural networks have emerged as one of the most powerful techniques in computer vision, speech recognition, and AI applications that can improve the mobile user experience. However, satisfying all power and performance requirements of such low-power devices is a significant challenge. Recent work has shown that binarizing a neural network can significantly reduce its memory requirements on mobile devices at the cost of a minor loss in accuracy. This paper proposes MB-CNN, a memristive accelerator for binary convolutional neural networks that performs XNOR convolution in-situ within novel 2R memristive data blocks to improve the power, performance, and memory requirements of embedded mobile devices. The proposed accelerator achieves at least 13.26×, 5.91×, and 3.18× improvements in system energy efficiency (measured as the energy × delay product) over state-of-the-art software, GPU, and PIM architectures, respectively. The solution architecture, which integrates CPU, GPU, and MB-CNN, outperforms every other configuration in terms of system energy and execution time.
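The binary convolution described above reduces each dot product to an XNOR of sign bits followed by a bit count (popcount), replacing the multiply-accumulate of a conventional convolution. The sketch below is only an illustration of that arithmetic identity, not the paper's memristive 2R implementation; the helper names binarize and xnor_popcount_dot are invented for this example.

```python
# Minimal sketch of the XNOR-popcount arithmetic used by binary CNNs.
# Not the authors' MB-CNN hardware design; helper names are hypothetical.
import numpy as np

def binarize(x):
    """Map real values to {-1, +1} by sign (zero maps to +1)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def xnor_popcount_dot(a_bits, w_bits):
    """Dot product of two {-1, +1} vectors via XNOR + popcount.

    Encoding: +1 -> bit 1, -1 -> bit 0. XNOR of the encodings is 1 exactly
    where the two signs agree, so dot = 2 * popcount(xnor) - n.
    """
    a = (a_bits > 0).astype(np.uint8)
    w = (w_bits > 0).astype(np.uint8)
    agree = np.logical_not(np.logical_xor(a, w))  # element-wise XNOR
    n = a.size
    return 2 * int(agree.sum()) - n

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    act = binarize(rng.standard_normal(1024))
    wgt = binarize(rng.standard_normal(1024))
    # The XNOR-popcount result matches the ordinary integer dot product.
    assert xnor_popcount_dot(act, wgt) == int(np.dot(act.astype(int), wgt.astype(int)))
    print(xnor_popcount_dot(act, wgt))
```

For {-1, +1} vectors the dot product equals the number of agreeing positions minus the number of disagreeing ones, which is exactly what the XNOR-popcount form computes; MB-CNN's contribution, per the abstract, is carrying this operation out in-situ inside memristive data blocks rather than on a CPU or GPU.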

Bibliographic Details
Main Authors: Arjun Pal Chowdhury (School of Computing, University of Utah), Pranav Kulkarni (Electrical & Computer Engineering, University of Utah), Mahdi Nazm Bojnordi (School of Computing, University of Utah)
Format: Article
Language: English
Published: MDPI AG, 2018-10-01
Series: Journal of Low Power Electronics and Applications
ISSN: 2079-9268
DOI: 10.3390/jlpea8040038
Subjects: convolutional neural networks; binary convolutions; in-situ processing; RRAM technology; computer architecture; embedded systems
Online Access:http://www.mdpi.com/2079-9268/8/4/38