Event-based backpropagation can compute exact gradients for spiking neural networks

Abstract: Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking networks was previously hindered by the existence of discrete spike events and discontinuities. For the first time, this work derives the backpropagation algorithm for a continuous-time spiking neural network and a general loss function by applying the adjoint method together with the proper partial derivative jumps, allowing for backpropagation through discrete spike events without approximations. This algorithm, EventProp, backpropagates errors at spike times in order to compute the exact gradient in an event-based, temporally and spatially sparse fashion. We use gradients computed via EventProp to train networks on the Yin-Yang and MNIST datasets using either a spike-time- or voltage-based loss function and report competitive performance. Our work supports the rigorous study of gradient-based learning algorithms in spiking neural networks and provides insights toward their implementation in novel brain-inspired hardware.
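The abstract's central technical claim rests on a simple observation: as long as no spike is created or deleted, the time of a threshold crossing varies smoothly with the synaptic weights, so its exact derivative follows from implicit differentiation of the crossing condition V(t*, w) = theta, namely dt*/dw = -(dV/dw) / (dV/dt) evaluated at t*. The Python sketch below illustrates this principle on a toy single-neuron case and checks it against finite differences; it is a minimal illustration of the underlying idea, not the paper's EventProp implementation, and the model and all parameter values are hypothetical.

# Minimal sketch (not the paper's EventProp code): exact differentiation of a
# spike time via the implicit function theorem. Assumed toy model: a
# current-based LIF neuron with equal membrane and synaptic time constants
# tau, driven by a single input spike of weight w at t = 0, which yields
# V(t) = (w * t / tau) * exp(-t / tau).
import numpy as np
from scipy.optimize import brentq

tau, theta = 10e-3, 1.0  # time constant (s) and firing threshold (a.u.)

def V(t, w):
    # Membrane potential after one input spike of weight w at t = 0.
    return (w * t / tau) * np.exp(-t / tau)

def dV_dt(t, w):
    return (w / tau) * np.exp(-t / tau) * (1.0 - t / tau)

def dV_dw(t, w):
    return (t / tau) * np.exp(-t / tau)

def spike_time(w):
    # First threshold crossing; V is increasing on (0, tau), so bracket there.
    return brentq(lambda t: V(t, w) - theta, 1e-9, tau)

w = 4.0
t_star = spike_time(w)

# Implicit function theorem at the crossing V(t*, w) = theta:
# dt*/dw = -(dV/dw) / (dV/dt).
grad_exact = -dV_dw(t_star, w) / dV_dt(t_star, w)

# Finite-difference check of the exact spike-time gradient.
eps = 1e-6
grad_fd = (spike_time(w + eps) - spike_time(w - eps)) / (2 * eps)
print(f"t* = {t_star:.6e} s, exact dt*/dw = {grad_exact:.6e}, FD = {grad_fd:.6e}")

Per the abstract, EventProp extends this single-crossing derivative to whole networks: an adjoint system is integrated backward in time and picks up the proper partial derivative jumps at each recorded spike, so the gradient computation remains event-based and temporally and spatially sparse.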


Bibliographic Details
Main Authors: Timo C. Wunderlich, Christian Pehle (both at the Kirchhoff-Institute for Physics, Heidelberg University)
Format: Article
Language: English
Published: Nature Publishing Group, 2021-06-01
Series: Scientific Reports
ISSN: 2045-2322
Online Access: https://doi.org/10.1038/s41598-021-91786-z
Source: DOAJ