2025.emnlp-main.950@ACL

On Pruning State-Space LLMs

Authors: Tamer Ghattas, Michael Hassid, Roy Schwartz

Recent work has proposed state-space models (SSMs) as an efficient alternative to transformer-based LLMs. Can these models be pruned to further reduce their computational costs? We adapt several pruning methods to the SSM structure and apply them to four SSM-based LLMs across multiple tasks. We find that such models are quite robust to some pruning methods (e.g., WANDA), while other methods lead to rapid performance degradation.
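The abstract names WANDA as a pruning criterion that SSMs tolerate well. As a rough illustration of the general WANDA scoring rule (weight magnitude scaled by the input activation norm), and not of the paper's SSM-specific adaptation, here is a minimal PyTorch sketch; the function name `wanda_prune` and the precomputed `act_norm` calibration statistic are illustrative assumptions.

```python
import torch

def wanda_prune(weight: torch.Tensor, act_norm: torch.Tensor,
                sparsity: float = 0.5) -> torch.Tensor:
    """Zero out weights with the lowest WANDA importance scores.

    WANDA scores each weight as |W_ij| * ||X_j||_2: weight magnitude
    scaled by the L2 norm of the corresponding input activation channel.
    `weight` has shape (out_features, in_features); `act_norm` has shape
    (in_features,), holding per-channel L2 norms from calibration data.
    """
    # Importance score per weight, broadcast over output rows.
    score = weight.abs() * act_norm.unsqueeze(0)
    # Prune per output row: drop the `sparsity` fraction with lowest scores.
    k = int(weight.shape[1] * sparsity)
    _, prune_idx = torch.topk(score, k, dim=1, largest=False)
    mask = torch.ones_like(weight, dtype=torch.bool)
    mask.scatter_(1, prune_idx, False)
    return weight * mask
```

In a typical pipeline, `act_norm` would be accumulated from a small calibration set, after which each linear layer is pruned in place, e.g. `layer.weight.data = wanda_prune(layer.weight.data, act_norm)`.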

Subject: EMNLP.2025 - Main