Predicting the Susceptibility of Examples to Catastrophic Forgetting

Authors: Guy Hacohen, Tinne Tuytelaars

Catastrophic forgetting -- the tendency of neural networks to forget previously learned data when learning new information -- remains a central challenge in continual learning. In this work, we adopt a behavioral approach, observing a connection between learning speed and forgetting: examples learned more quickly are less prone to forgetting. Focusing on replay-based continual learning, we show that the composition of the replay buffer -- specifically, whether it contains quickly or slowly learned examples -- has a significant effect on forgetting. Motivated by this insight, we introduce Speed-Based Sampling (SBS), a simple yet general strategy that selects replay examples based on their learning speed. SBS integrates easily into existing buffer-based methods and improves performance across a wide range of competitive continual learning benchmarks, advancing the state of the art. Our findings underscore the value of accounting for forgetting dynamics when designing continual learning algorithms.
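
The abstract does not spell out how SBS measures learning speed or which end of the speed spectrum it favors, so the following is only a minimal sketch under stated assumptions: learning speed is approximated by how consistently an example is classified correctly across training epochs, and the buffer is filled by sampling in proportion to a speed-derived score. The function and parameter names (`learning_speed`, `build_replay_buffer`, `prefer_fast`) are hypothetical, not from the paper.

```python
import numpy as np

def learning_speed(correct_history: np.ndarray) -> np.ndarray:
    """Proxy for per-example learning speed (an assumption, not the paper's
    exact definition): the mean per-epoch correctness of each example.
    Examples that become correct early and stay correct score near 1;
    slowly learned examples score near 0.

    correct_history: (num_epochs, num_examples) boolean array, where
    entry [e, i] is True if example i was classified correctly after epoch e.
    """
    return correct_history.astype(float).mean(axis=0)

def build_replay_buffer(example_ids, speeds, buffer_size, prefer_fast=True, rng=None):
    """Fill a replay buffer by sampling ids with probability proportional
    to a speed-derived score (hypothetical rule; the paper's selection
    criterion may differ)."""
    rng = rng if rng is not None else np.random.default_rng()
    scores = speeds if prefer_fast else 1.0 - speeds
    scores = scores + 1e-8            # guard against an all-zero distribution
    probs = scores / scores.sum()
    return rng.choice(example_ids, size=buffer_size, replace=False, p=probs)

# Usage: log per-example correctness each epoch while training on a task,
# then select the buffer contents when the task ends.
history = np.random.rand(10, 1000) > 0.5   # stand-in for logged correctness
speeds = learning_speed(history)
buffer_ids = build_replay_buffer(np.arange(1000), speeds, buffer_size=200)
```

Because the abstract only reports that buffer composition matters, the `prefer_fast` flag is left configurable: flipping it reproduces the comparison between buffers of quickly versus slowly learned examples that the paper investigates.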

Subject: ICML.2025 - Poster