Going Deeper in Spiking Neural Networks: VGG and Residual Architectures

Over the past few years, Spiking Neural Networks (SNNs) have become popular as a possible pathway to enable low-power event-driven neuromorphic hardware. However, their application in machine learning has largely been limited to very shallow neural network architectures for simple problems. In this paper, we propose a novel algorithmic technique for generating an SNN with a deep architecture, and demonstrate its effectiveness on complex visual recognition problems such as CIFAR-10 and ImageNet. Our technique applies to both VGG and Residual network architectures, with significantly better accuracy than the state of the art. Finally, we present an analysis of the sparse event-driven computations to demonstrate reduced hardware overhead when operating in the spiking domain.
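As a quick illustration of the rate-coding idea that ANN-to-SNN conversion approaches of this kind build on (a hedged sketch only, not the algorithm described in the article), the Python snippet below simulates a plain integrate-and-fire neuron and shows that its firing rate over a fixed time window approximates a ReLU activation. The threshold value and number of timesteps are arbitrary assumptions.

# Illustrative sketch only (not the method from the article): a plain
# integrate-and-fire (IF) neuron driven by a constant input. Its firing rate
# over a time window approximates a ReLU activation, which is the rate-coding
# intuition behind converting trained ANNs such as VGG/ResNet into SNNs.
# The threshold and number of timesteps below are arbitrary assumptions.

def if_firing_rate(input_current, threshold=1.0, timesteps=100):
    """Simulate an IF neuron and return its average spike rate."""
    v = 0.0        # membrane potential
    spikes = 0
    for _ in range(timesteps):
        v += input_current          # integrate the (constant) input
        if v >= threshold:          # emit a spike when the threshold is crossed
            spikes += 1
            v -= threshold          # reset by subtracting the threshold
    return spikes / timesteps

if __name__ == "__main__":
    for x in (-0.5, 0.0, 0.25, 0.5, 1.0):
        # The rate tracks max(0, x) / threshold, i.e. a scaled ReLU.
        print(f"input {x:+.2f} -> spike rate {if_firing_rate(x):.2f}, ReLU {max(0.0, x):.2f}")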

Bibliographic Details
Main Authors: Abhronil Sengupta (Department of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, United States), Yuting Ye (Facebook Reality Labs, Facebook Research, Redmond, WA, United States), Robert Wang (Facebook Reality Labs, Facebook Research, Redmond, WA, United States), Chiao Liu (Facebook Reality Labs, Facebook Research, Redmond, WA, United States), Kaushik Roy (Department of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, United States)
Format: Article
Language: English
Published: Frontiers Media S.A., 2019-03-01
Series: Frontiers in Neuroscience (ISSN 1662-453X)
Subjects: spiking neural networks; event-driven neural networks; sparsity; neuromorphic computing; visual recognition
DOI: 10.3389/fnins.2019.00095
Online Access: https://www.frontiersin.org/article/10.3389/fnins.2019.00095/full