New preprint for #neuromorphic and #SpikingNeuralNetwork folk (with @pengfei-sun.bsky.social). arxiv.org/abs/2507.16043 Surrogate gradients are popular for training SNNs, but some worry whether they really learn complex temporal spike codes. TLDR: we tested this, and yes they can! 🧵👇 🤖🧠🧪
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.058Z
First of all, using synthetic datasets we find that surrogate gradient training can extract information encoded in interspike intervals and patterns of coincidence. No problem, and performance degrades gracefully as we start disrupting the temporal information.
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.059Z
How about in more realistic datasets? We looked at the popular SHD spiking speech recognition dataset from @fzenke.bsky.social but the problem was that it has too much spike rate information. You can get around 50% accuracy using a multilayer perceptron on spike counts alone.
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.060Z
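To make the rate-information point concrete, here is a minimal pure-Python sketch of the input such a count-only baseline sees: each trial is collapsed to per-neuron spike counts, discarding all timing. The helper name `spike_count_features` is hypothetical, not from the paper's released code.

```python
from collections import Counter

def spike_count_features(units, n_units):
    """Collapse one trial's spike train to a per-neuron spike-count vector.

    `units` is the list of neuron indices that fired (one entry per spike).
    All timing information is thrown away, so any classifier accuracy above
    chance on these features comes from rate coding alone.
    """
    counts = Counter(units)
    return [counts.get(i, 0) for i in range(n_units)]
```

Feeding vectors like this to an MLP is what gives the ~50% accuracy on SHD mentioned above, which is why spike counts had to be neutralised in the modified dataset.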
So we designed a modified version of this dataset: we chose a subset of around 200 of the 700 neurons, and then randomly selected a fixed number of spikes on each trial. This keeps the spike timing meaningful but discards all spike count information.
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.061Z
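The construction described above can be sketched in a few lines of pure Python. This is an illustrative reimplementation under stated assumptions (spikes given as parallel `times`/`units` lists; sampling with replacement when a trial has too few spikes), not the authors' released code.

```python
import random

def timing_only_trial(times, units, keep_units, n_spikes, rng):
    """Restrict a trial to a fixed neuron subset, then draw exactly
    n_spikes spikes so that spike counts carry no class information."""
    # Keep only spikes emitted by the chosen subset of input neurons
    kept = [(t, u) for t, u in zip(times, units) if u in keep_units]
    # Every trial ends up with the same number of spikes; fall back to
    # sampling with replacement if the trial has fewer than n_spikes
    if len(kept) >= n_spikes:
        sampled = rng.sample(kept, n_spikes)
    else:
        sampled = rng.choices(kept, k=n_spikes)
    sampled.sort()  # restore temporal order
    return [t for t, _ in sampled], [u for _, u in sampled]
```

Because every trial has the identical spike count, a count-based classifier is at chance by construction, and any remaining accuracy must come from spike timing.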
This dataset is harder but can still be solved to an accuracy of around 50% with an SNN+delays, and as you perturb spike timing, or use an MLP on spike counts, accuracy drops to chance.
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.062Z
We think this is a good new dataset for testing the spike timing abilities of spike-based learning algorithms and models. We've released the code and dataset in the same format as SHD, so it should be easy to start using this: github.com/neural-recko... zenodo.org/records/1615...
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.063Z
We tried one last test: testing on data where we reverse time. Accuracy drops a small amount for SNNs without delays, but a large amount for SNNs trained with delays. This better matches humans (we struggle to identify reversed speech), suggesting that delay-based models might be a better fit.
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.064Z
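The time-reversal perturbation itself is simple: a spike at time t maps to duration − t, with spikes re-sorted into temporal order. A minimal sketch, assuming the same parallel `times`/`units` representation as above (a hypothetical helper, not the paper's code):

```python
def reverse_trial(times, units, duration):
    """Play a spike train backwards: a spike at t becomes duration - t.

    Spike counts and interspike intervals are preserved; only the
    direction of time changes, so any accuracy drop isolates
    sensitivity to temporal order.
    """
    pairs = sorted((duration - t, u) for t, u in zip(times, units))
    return [t for t, _ in pairs], [u for _, u in pairs]
```

Note that per-neuron counts and the set of interspike intervals are unchanged, which is what makes this a clean probe of order sensitivity.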
We hope to add some more results to this before we send it to a journal for review, so please do give us your feedback! And many thanks to the first author, incredible MSc student Ziqiao Yu, and postdoc co-author @pengfei-sun.bsky.social.
— Dan Goodman (@neuralreckoning.bsky.social) 2025-07-24T17:03:42.065Z
Beyond Rate Coding: Surrogate Gradients Enable Spike Timing Learning in Spiking Neural Networks