Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning

Authors: Yan Wang, Da-Wei Zhou, Han-Jia Ye

Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Existing pre-trained model-based CIL methods often freeze the pre-trained network and adapt to incremental tasks using additional lightweight modules such as adapters. However, incorrect module selection during inference hurts performance, and task-specific modules often overlook shared general knowledge, leading to errors when distinguishing similar classes across tasks. To address these challenges, we propose integrating Task-Specific and Universal Adapters (TUNA). Specifically, we train task-specific adapters to capture the features most crucial to their respective tasks and introduce an entropy-based selection mechanism to choose the most suitable adapter. Furthermore, we leverage an adapter fusion strategy to construct a universal adapter, which encodes the most discriminative features shared across tasks. We combine the predictions of the task-specific and universal adapters to harness both specialized and general knowledge during inference. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of our approach. Code is available at https://github.com/LAMDA-CL/ICCV2025-TUNA
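
The abstract only sketches the inference mechanism, so the following is a minimal illustrative sketch, not the authors' implementation (see their repository for that). It assumes each adapter can be treated as a callable producing class logits for a single pre-extracted feature vector; the names `task_adapters`, `universal_adapter`, and the mixing weight `alpha` are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F

def prediction_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Shannon entropy of the softmax distribution; lower means more confident."""
    probs = F.softmax(logits, dim=-1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

def tuna_inference(x, task_adapters, universal_adapter, alpha=0.5):
    """Sketch of entropy-based selection plus universal fusion for one input:
    pick the task-specific adapter whose prediction has the lowest entropy,
    then mix its logits with the universal adapter's logits."""
    task_logits = [adapter(x) for adapter in task_adapters]   # one logit vector per adapter
    entropies = torch.stack([prediction_entropy(l) for l in task_logits])
    best = int(entropies.argmin())                            # most confident adapter
    specific = task_logits[best]
    universal = universal_adapter(x)
    return alpha * specific + (1 - alpha) * universal         # fused prediction

# Toy usage with random linear "adapters" over a pre-extracted feature vector.
feat = torch.randn(768)
adapters = [torch.nn.Linear(768, 10) for _ in range(3)]
universal = torch.nn.Linear(768, 10)
fused_logits = tuna_inference(feat, adapters, universal)
```

The entropy criterion selects the adapter most confident on the given input, a common proxy for task identity when the task label is unavailable at inference; the final mix reflects the paper's stated goal of combining specialized and general knowledge.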

Subject: ICCV.2025 - Poster