Sparse spiking gradient descent
Advances in Neural Information Processing Systems 34 (2021)
Abstract
There is an increasing interest in emulating Spiking Neural Networks (SNNs) on neuromorphic
computing devices due to their low energy consumption. Recent advances have allowed training
SNNs to a point where they start to compete with traditional Artificial Neural Networks (ANNs)
in terms of accuracy, while at the same time being energy efficient when run on neuromorphic
hardware. However, the process of training SNNs is still based on dense tensor operations
originally developed for ANNs, which do not leverage the spatiotemporally sparse nature of SNNs.
We present here the first sparse SNN backpropagation algorithm, which achieves the same or better
accuracy as current state-of-the-art methods while being significantly faster and more memory
efficient. We show the effectiveness of our method on real datasets of varying complexity
(Fashion-MNIST, Neuromorphic-MNIST, and Spiking Heidelberg Digits), achieving a speedup in the
backward pass of up to 150x while being 85% more memory efficient, without losing accuracy.
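To make the idea concrete, the following is a minimal NumPy sketch of backpropagating only through the (time, neuron) pairs that actually fired, rather than over the full dense tensor. This is not the authors' implementation: the LIF dynamics, the constant surrogate derivative at spike times, and the names lif_forward and sparse_backward are all illustrative assumptions; the sketch only shows why the backward cost can scale with the number of emitted spikes instead of the full tensor size.

```python
import numpy as np

def lif_forward(x, w, beta=0.9, threshold=1.0):
    """Forward pass of a leaky integrate-and-fire (LIF) layer.

    x: (T, n_in) binary input spikes; w: (n_in, n_out) weights.
    Returns the output spike train and the set of active
    (time, neuron) pairs that the sparse backward pass reuses.
    """
    T = x.shape[0]
    n_out = w.shape[1]
    v = np.zeros(n_out)                  # membrane potentials
    spikes = np.zeros((T, n_out))
    for t in range(T):
        v = beta * v + x[t] @ w          # leak, then integrate input current
        spikes[t] = (v >= threshold).astype(float)
        v *= 1.0 - spikes[t]             # reset neurons that fired
    active = np.argwhere(spikes > 0)     # (time, neuron) indices of spikes
    return spikes, active

def sparse_backward(x, w, grad_out, active):
    """Weight gradient restricted to active (time, neuron) pairs.

    grad_out: (T, n_out) upstream gradient w.r.t. output spikes.
    A dense backward pass would touch every (t, neuron) entry;
    here the cost scales with the number of emitted spikes instead.
    """
    grad_w = np.zeros_like(w)
    for t, j in active:
        # Surrogate derivative taken as 1 at spike times (an
        # illustrative simplification, not the paper's exact choice).
        grad_w[:, j] += grad_out[t, j] * x[t]
    return grad_w

# Hypothetical usage: 5% input spike density over 100 time steps.
rng = np.random.default_rng(0)
x = (rng.random((100, 64)) < 0.05).astype(float)
w = rng.normal(0.0, 0.5, size=(64, 32))
spikes, active = lif_forward(x, w)
grad_w = sparse_backward(x, w, np.ones_like(spikes), active)
print(f"{len(active)} active pairs out of {spikes.size} dense entries")
```

Because spiking activity is typically sparse, the loop in sparse_backward visits only a small fraction of the entries a dense gradient computation would touch, which is the source of the speed and memory savings the abstract reports.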