YVZbaVikBp@OpenReview

#1 Generalization Error Analysis for Selective State-Space Models Through the Lens of Attention

Authors: Arya Honarpisheh, Mustafa Bozdag, Octavia Camps, Mario Sznaier

State-space models (SSMs) have recently emerged as a compelling alternative to Transformers for sequence modeling tasks. This paper presents a theoretical generalization analysis of selective SSMs, the core architectural component behind the Mamba model. We derive a novel covering number-based generalization bound for selective SSMs, building upon recent theoretical advances in the analysis of Transformer models. Using this result, we analyze how the spectral abscissa of the continuous-time state matrix influences the model’s stability during training and its ability to generalize across sequence lengths. We empirically validate our findings on a synthetic majority task, the IMDb sentiment classification benchmark, and the ListOps task, demonstrating how our theoretical insights translate into practical model behavior.
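The spectral abscissa mentioned in the abstract is the largest real part among the eigenvalues of the continuous-time state matrix A: for the dynamics dx/dt = Ax, trajectories stay bounded over long horizons exactly when it is non-positive, which is why it matters for training stability and length generalization. A minimal sketch of computing it is below; the diagonal matrices are purely illustrative examples, not taken from the paper.

```python
import numpy as np

def spectral_abscissa(A: np.ndarray) -> float:
    """Largest real part of the eigenvalues of A.

    For dx/dt = A x, states remain bounded as t grows
    iff this value is <= 0 (with suitable multiplicity caveats).
    """
    return float(np.max(np.real(np.linalg.eigvals(A))))

# Illustrative diagonal state matrices, in the spirit of S4/Mamba-style
# parameterizations that keep the real parts of A's eigenvalues negative.
A_stable = np.diag([-1.0, -0.5, -0.25])
A_unstable = np.diag([-1.0, 0.1, -0.25])

print(spectral_abscissa(A_stable))    # -0.25: all modes decay
print(spectral_abscissa(A_unstable))  #  0.1: one growing mode
```

For a diagonal matrix the eigenvalues are just the diagonal entries, so the two calls return -0.25 and 0.1 respectively; a single eigenvalue with positive real part is enough to make long sequences blow up.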

Subject: NeurIPS.2025 - Poster