On Practical Issues for Stochastic STDP Hardware With 1-bit Synaptic Weights

In computational neuroscience, synaptic plasticity learning rules are typically studied using the full 64-bit floating point precision computers provide. However, for dedicated hardware implementations, the precision used not only penalizes directly the required memory resources, but also the computing, communication, and energy resources. When it comes to hardware engineering, a key question is always to find the minimum number of necessary bits to keep the neurocomputational system working satisfactorily. Here we present some techniques and results obtained when limiting synaptic weights to 1-bit precision, applied to a Spike-Timing-Dependent-Plasticity (STDP) learning rule in Spiking Neural Networks (SNN). We first illustrate the 1-bit synapses STDP operation by replicating a classical biological experiment on visual orientation tuning, using a simple four neuron setup. After this, we apply 1-bit STDP learning to the hidden feature extraction layer of a 2-layer system, where for the second (and output) layer we use already reported SNN classifiers. The systems are tested on two spiking datasets: a Dynamic Vision Sensor (DVS) recorded poker card symbols dataset and a Poisson-distributed spike representation MNIST dataset version. Tests are performed using the in-house MegaSim event-driven behavioral simulator and by implementing the systems on FPGA (Field Programmable Gate Array) hardware.
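To make the idea in the abstract concrete, the sketch below shows one generic way a stochastic STDP update could act on 1-bit synaptic weights. It is an illustrative, hypothetical formulation only, not the rule used in the article: the probabilities (P_POT, P_DEP), the timing window, the LIF-free event handling, and all variable names are assumptions chosen for the example.

```python
"""
Illustrative sketch only: a generic stochastic STDP update on binary (1-bit)
synaptic weights. Parameter values and the potentiate/depress logic are
assumptions for illustration, not taken from Yousefzadeh et al. (2018).
"""
import numpy as np

rng = np.random.default_rng(0)

N_PRE = 64            # number of presynaptic inputs (assumed)
P_POT = 0.10          # probability of setting a weight to 1 on a causal pairing (assumed)
P_DEP = 0.05          # probability of clearing a weight on a non-causal input (assumed)
STDP_WINDOW = 20e-3   # seconds: how recent a pre-spike must be to count as causal (assumed)

# 1-bit synaptic weights: each synapse stores only 0 or 1.
weights = rng.integers(0, 2, size=N_PRE).astype(np.uint8)
last_pre_spike = np.full(N_PRE, -np.inf)  # time of most recent spike per input


def on_pre_spike(i, t):
    """Record the time of a presynaptic spike on input i."""
    last_pre_spike[i] = t


def on_post_spike(t):
    """Stochastic 1-bit STDP applied when the postsynaptic neuron fires at time t."""
    causal = (t - last_pre_spike) <= STDP_WINDOW      # inputs that spiked just before
    # Potentiation: causal synapses are set to 1 with probability P_POT.
    potentiate = causal & (rng.random(N_PRE) < P_POT)
    # Depression: non-causal synapses are cleared to 0 with probability P_DEP.
    depress = ~causal & (rng.random(N_PRE) < P_DEP)
    weights[potentiate] = 1
    weights[depress] = 0


# Example pairing: inputs 0-4 spike at t = 1.000 s, the neuron fires at t = 1.005 s.
for i in range(5):
    on_pre_spike(i, 1.000)
on_post_spike(1.005)
print(weights[:8])
```

With only one bit per synapse, the graded effect of a conventional STDP curve has to emerge statistically: over many pairings, the probability of flipping a bit plays the role that a small analog weight change plays in full-precision implementations.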

Bibliographic Details
Main Authors: Amirreza Yousefzadeh, Evangelos Stromatias, Miguel Soto, Teresa Serrano-Gotarredona, Bernabé Linares-Barranco
Format: Article
Language: English
Published: Frontiers Media S.A. 2018-10-01
Series: Frontiers in Neuroscience
Subjects: spiking neural networks, spike timing dependent plasticity, stochastic learning, feature extraction, neuromorphic systems
Online Access: https://www.frontiersin.org/article/10.3389/fnins.2018.00665/full
id doaj-8479b8313ba34374bf75ce4c97979f2b
record_format Article
collection DOAJ
language English
format Article
sources DOAJ
author Amirreza Yousefzadeh
Evangelos Stromatias
Miguel Soto
Teresa Serrano-Gotarredona
Bernabé Linares-Barranco
publisher Frontiers Media S.A.
series Frontiers in Neuroscience
issn 1662-453X
publishDate 2018-10-01
description In computational neuroscience, synaptic plasticity learning rules are typically studied using the full 64-bit floating point precision computers provide. However, for dedicated hardware implementations, the precision used not only penalizes directly the required memory resources, but also the computing, communication, and energy resources. When it comes to hardware engineering, a key question is always to find the minimum number of necessary bits to keep the neurocomputational system working satisfactorily. Here we present some techniques and results obtained when limiting synaptic weights to 1-bit precision, applied to a Spike-Timing-Dependent-Plasticity (STDP) learning rule in Spiking Neural Networks (SNN). We first illustrate the 1-bit synapses STDP operation by replicating a classical biological experiment on visual orientation tuning, using a simple four neuron setup. After this, we apply 1-bit STDP learning to the hidden feature extraction layer of a 2-layer system, where for the second (and output) layer we use already reported SNN classifiers. The systems are tested on two spiking datasets: a Dynamic Vision Sensor (DVS) recorded poker card symbols dataset and a Poisson-distributed spike representation MNIST dataset version. Tests are performed using the in-house MegaSim event-driven behavioral simulator and by implementing the systems on FPGA (Field Programmable Gate Array) hardware.
topic spiking neural networks
spike timing dependent plasticity
stochastic learning
feature extraction
neuromorphic systems
url https://www.frontiersin.org/article/10.3389/fnins.2018.00665/full