LEADER 01842nam a2200205Ia 4500
001    10.3390-electronics11142114
008    220718s2022 CNT 000 0 und d
022    |a 2079-9292 (ISSN)
245 10 |a Exploring the Effects of Caputo Fractional Derivative in Spiking Neural Network Training
260  0 |b MDPI |c 2022
856    |z View Fulltext in Publisher |u https://doi.org/10.3390/electronics11142114
520 3  |a Fractional calculus is an emerging topic in artificial neural network training, especially when using gradient-based methods. This paper brings the idea of fractional derivatives to spiking neural network training using Caputo derivative-based gradient calculation. We focus on conducting an extensive investigation of performance improvements via a case study of small-scale networks using derivative orders in the unit interval. With particle swarm optimization, we provide an example of handling the derivative order as an optimizable hyperparameter to find viable values for it. Using multiple benchmark datasets, we empirically show that there is no single generally optimal derivative order; rather, this value is data-dependent. However, statistics show that a range of derivative orders can be determined where the Caputo derivative outperforms first-order gradient descent with high confidence. Improvements in convergence speed and training time are also examined and explained by the reformulation of Caputo derivative-based training as an adaptive weight normalization technique. © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
650 04 |a Caputo derivative
650 04 |a particle swarm optimization
650 04 |a spiking neural networks
650 04 |a tempotron
700 1  |a Botzheim, J. |e author
700 1  |a Erős, G. |e author
700 1  |a Gyöngyössy, N.M. |e author
773    |t Electronics (Switzerland)