FedAGC: Federated Continual Learning with Asymmetric Gradient Correction

Authors: Chengchao Zhang, Fanhua Shang, Hongying Liu, Liang Wan, Wei Feng

Federated Continual Learning (FCL) has emerged as a prominent distributed learning paradigm that aims to address model learning challenges in both federated and continual learning settings. Efficient personalization in FCL remains a major challenge: it must handle not only conflicts between old and new knowledge within parallel task streams but also heterogeneous knowledge conflicts across clients. Recent approaches attempt to mitigate these issues through gradient correction, but they often overlook the combined impact of gradient magnitude and direction, leading to unsatisfactory gradient solutions. To address these issues, we propose FedAGC, a novel federated continual learning method with asymmetric gradient correction, which performs memory rehearsal using representative samples selected from historical tasks via a centroid-based approach. By formulating gradient correction as a multi-objective optimization problem, FedAGC derives more effective gradients while incorporating group-level personalization to integrate useful knowledge and isolate irrelevant knowledge, effectively mitigating both temporal and spatial catastrophic forgetting. Extensive experiments confirm the effectiveness of FedAGC.
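The abstract does not spell out FedAGC's exact update rule. As a rough illustration of the two ingredients it names, the sketch below pairs centroid-based exemplar selection for memory rehearsal (keep the samples whose features lie closest to each class mean) with a simple two-objective gradient combination in the MGDA style (the min-norm convex combination of the old-task and new-task gradients, which depends on both their directions and magnitudes). The function names and the closed-form weight `alpha` are illustrative assumptions, not FedAGC's actual algorithm.

```python
import torch

def select_exemplars(features: torch.Tensor, labels: torch.Tensor, m: int) -> torch.Tensor:
    """Centroid-based rehearsal memory: for each class, keep the m samples
    whose feature vectors are nearest to the class centroid.
    Returns indices into `features`. (Illustrative, not FedAGC's selector.)"""
    keep = []
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        centroid = features[idx].mean(dim=0, keepdim=True)
        dist = (features[idx] - centroid).norm(dim=1)
        keep.append(idx[dist.argsort()[:m]])
    return torch.cat(keep)

def combine_gradients(g_old: torch.Tensor, g_new: torch.Tensor) -> torch.Tensor:
    """Two-objective min-norm combination: choose alpha in [0, 1] minimizing
    ||alpha * g_old + (1 - alpha) * g_new||^2 (closed form for two tasks).
    Both the angle between the gradients and their norms enter through the
    inner products, so direction and magnitude are handled jointly."""
    diff = g_old - g_new
    denom = diff.dot(diff).clamp_min(1e-12)          # ||g_old - g_new||^2
    alpha = ((g_new - g_old).dot(g_new) / denom).clamp(0.0, 1.0)
    return alpha * g_old + (1.0 - alpha) * g_new

# Toy usage: conflicting 2-D gradients yield a compromise descent direction.
g_old = torch.tensor([1.0, 0.0])
g_new = torch.tensor([-0.5, 1.0])
g = combine_gradients(g_old, g_new)
```

In a rehearsal-based FCL loop, `g_old` would come from a batch of stored exemplars and `g_new` from the current task's batch; the combined gradient is then applied as the local client update before federated aggregation.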

Subject: ICCV.2025 - Poster