
#1 Parallel Training Time-to-First-Spike Spiking Neural Networks

Authors: Kaiwei Che, Wei Fang, Peng Xue, Yifan Huang, Zhengyu Ma, Yonghong Tian

Spiking Neural Networks (SNNs) offer a promising energy-efficient computing paradigm owing to their event-driven properties and biologically inspired dynamics. Among various encoding schemes, Time-to-First-Spike (TTFS) is particularly notable for its extreme sparsity, utilizing a single spike per neuron to maximize energy efficiency. However, two significant challenges persist: effectively leveraging TTFS sparsity to minimize training costs on Graphics Processing Units (GPUs), and bridging the performance gap between TTFS-based SNNs and their rate-based counterparts. To address these issues, we propose a parallel training algorithm for accelerated execution and a novel decoding strategy for enhanced performance. Specifically, we derive both forward and backward propagation equations for parallelized TTFS SNNs, enabling precise calculation of first-spike timings and gradients. Furthermore, we analyze the limitations of existing output decoders and introduce a membrane potential–based decoder, complemented by an incremental time-step training strategy, to improve accuracy. Our approach achieves state-of-the-art accuracy for TTFS SNNs on several benchmarks, including MNIST (99.51%), Fashion-MNIST (93.14%), CIFAR-10 (95.06%), and CIFAR-100 (74.07%).
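The core idea of parallelizing TTFS computation can be illustrated with a minimal sketch. The abstract does not give the authors' exact formulation, so the following assumes the simplest case: a non-leaky integrate-and-fire neuron with no reset before its first spike, whose membrane potential is a cumulative sum of input currents. Under that assumption, all time steps can be evaluated at once with a prefix sum instead of a sequential loop, and the first-spike time is the first threshold crossing. The function name `first_spike_times` and the threshold parameter `v_th` are illustrative, not from the paper.

```python
import numpy as np

def first_spike_times(currents, v_th=1.0):
    """Compute first-spike times for N integrate-and-fire neurons in parallel.

    currents: array of shape (T, N), input current per time step per neuron.
    Returns an array of shape (N,) with the index of each neuron's first
    spike; neurons that never cross threshold are assigned T.
    """
    T = currents.shape[0]
    # Membrane potential as a cumulative sum over time (no leak, no reset
    # before the first spike), so all T steps are computed at once rather
    # than step by step.
    v = np.cumsum(currents, axis=0)          # (T, N) potentials
    fired = v >= v_th                        # boolean threshold crossings
    # argmax along time returns the index of the first True per neuron;
    # neurons with no crossing fall back to T.
    return np.where(fired.any(axis=0), fired.argmax(axis=0), T)
```

For example, a neuron receiving a constant current of 0.4 per step crosses a threshold of 1.0 at step index 2 (potential 1.2), while a neuron receiving 0.1 per step over three steps never fires and is assigned T = 3.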

Subject: AAAI.2026 - Cognitive Modeling and Cognitive Systems