Event-based backpropagation can compute exact gradients for spiking neural networks
Abstract: Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking netw...
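To illustrate the "analog computation with event-based communication" the abstract refers to, below is a minimal sketch (not taken from the paper) of a leaky integrate-and-fire neuron: the membrane potential is an analog state variable, while the neuron's output consists only of discrete spike events. The model choice, parameter values, and function name are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau_mem=20e-3, v_thresh=1.0, v_reset=0.0):
    """Minimal leaky integrate-and-fire neuron (illustrative only).

    The membrane potential evolves continuously (analog computation), but the
    neuron communicates only through discrete spike events emitted whenever
    the threshold is crossed. Parameter values are assumptions for the sketch.
    """
    v = v_reset
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Leaky integration of the input current (forward Euler step).
        v += dt / tau_mem * (-(v - v_reset) + i_in)
        if v >= v_thresh:
            # Event-based output: record the discrete spike time and reset.
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# Example: a constant supra-threshold input produces a regular spike train.
spikes = simulate_lif(np.full(1000, 1.5))
print(f"{len(spikes)} spikes, first at t = {spikes[0]:.3f} s" if spikes else "no spikes")
```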
| Main Authors: | Timo C. Wunderlich, Christian Pehle |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Publishing Group, 2021-06-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-021-91786-z |
Similar Items
- Training Deep Spiking Neural Networks using Backpropagation
  by: Jun Haeng Lee, et al.
  Published: (2016-11-01)
- Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures
  by: Chankyu Lee, et al.
  Published: (2020-02-01)
- Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks
  by: Yujie Wu, et al.
  Published: (2018-05-01)
- On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices
  by: Dongseok Kwon, et al.
  Published: (2020-07-01)
- Training a digital model of a deep spiking neural network using backpropagation
  by: Bondarev V
  Published: (2020-01-01)