YTbLri0siT@OpenReview

Total: 1

#1 Spike-timing-dependent Hebbian learning as noisy gradient descent

Authors: Niklas Dexheimer, Sascha Gaudlitz, Johannes Schmidt-Hieber

Hebbian learning is a key principle underlying learning in biological neural networks. We relate a Hebbian spike-timing-dependent plasticity rule to noisy gradient descent with respect to a non-convex loss function on the probability simplex. Despite the constant injection of noise and the non-convexity of the underlying optimization problem, one can rigorously prove that the considered Hebbian learning dynamic identifies the presynaptic neuron with the highest activity, and that convergence is exponentially fast in the number of iterations. This is non-standard and surprising: typically, noisy gradient descent with a fixed noise level only converges to a stationary regime in which the noise causes the dynamic to fluctuate around a minimiser.
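The concentration phenomenon described in the abstract can be illustrated with a minimal sketch. The dynamic below is an assumption for illustration, not the authors' exact STDP rule: a multiplicative-weights update on the probability simplex, where each presynaptic neuron spikes randomly according to a (hypothetical) firing rate, spiking synapses are potentiated, and the weight vector is renormalised. Despite the per-step spike noise, the weights concentrate on the most active neuron.

```python
import numpy as np

rng = np.random.default_rng(0)
rate = np.array([0.2, 0.5, 0.9])   # hypothetical presynaptic firing rates
w = np.full(3, 1.0 / 3.0)          # synaptic weights, start uniform on simplex
eta = 0.1                          # learning rate

for _ in range(5000):
    spikes = rng.random(3) < rate  # noisy presynaptic activity (Bernoulli spikes)
    w = w * np.exp(eta * spikes)   # Hebbian-style potentiation of spiking synapses
    w = w / w.sum()                # renormalise back onto the probability simplex

print(w.argmax())  # prints 2: the weight mass ends up on the most active neuron
```

Because the update is multiplicative, the log-weight gap between the most active neuron and the others grows linearly in the number of iterations, so the weight mass on that neuron approaches one at an exponential rate, mirroring the exponentially fast convergence claimed in the abstract (under this sketch's assumed dynamic).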

Subject: NeurIPS.2025 - Poster