
Total: 1

#1 Sparsity Outperforms Low-Rank Projections in Few-Shot Adaptation

Authors: Nairouz Mrabah, Nicolas Richet, Ismail Ben Ayed, Eric Granger

Adapting Vision-Language Models (VLMs) to new domains with few labeled samples remains a significant challenge due to severe overfitting and computational constraints. State-of-the-art solutions, such as low-rank reparameterization, mitigate these issues but often struggle with generalization and require extensive hyperparameter tuning. In this paper, we propose a novel Sparse Optimization (SO) framework. Unlike low-rank approaches, which typically constrain updates to a fixed subspace, SO leverages high sparsity to dynamically adjust very few parameters. We introduce two key paradigms. First, we advocate for local sparsity and global density: updating a minimal subset of parameters per iteration while preserving overall model expressiveness. Second, we advocate for local randomness and global importance: sparsifying the gradient via random selection while pruning the first moment based on importance. This combination significantly mitigates overfitting and ensures stable adaptation in low-data regimes. Extensive experiments on 11 diverse datasets show that SO achieves state-of-the-art few-shot adaptation performance while reducing memory overhead.
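The two paradigms can be illustrated in optimizer code. Below is a minimal, hypothetical PyTorch sketch, not the authors' released implementation: the function name `sparse_step`, the density values, and the Adam-style first moment are assumptions made for illustration, since the abstract does not specify these details.

```python
import torch

def sparse_step(param, exp_avg, lr=1e-3, beta=0.9,
                grad_density=0.01, moment_density=0.1):
    """Hypothetical sketch of one SO-style update (names/densities assumed).

    Local randomness: a fresh random mask keeps few gradient entries per step.
    Global importance: the first moment is pruned to its largest entries.
    Local sparsity, global density: each step touches few parameters, but the
    mask changes every iteration, so all parameters remain trainable overall.
    """
    grad = param.grad
    if grad is None:
        return

    # Local randomness: randomly keep a small fraction of gradient entries.
    rand_mask = torch.rand_like(grad) < grad_density
    sparse_grad = grad * rand_mask

    # Exponential moving average (first moment) over the sparsified gradient.
    exp_avg.mul_(beta).add_(sparse_grad, alpha=1 - beta)

    # Global importance: zero out all but the k largest-magnitude moment entries.
    n = exp_avg.numel()
    k = max(1, int(moment_density * n))
    thresh = exp_avg.abs().flatten().kthvalue(n - k + 1).values
    exp_avg.mul_((exp_avg.abs() >= thresh).to(exp_avg.dtype))

    # Apply the sparse update.
    param.data.add_(exp_avg, alpha=-lr)

# Illustrative usage on a single weight tensor.
w = torch.nn.Parameter(torch.randn(256, 256))
m = torch.zeros_like(w)
loss = (w ** 2).sum()
loss.backward()
sparse_step(w, m)
```

Because the random mask is resampled each iteration, sparsity is local to a single step, while over many steps the full parameter set remains updatable and the model stays globally dense.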

Subject: ICCV.2025 - Poster