EARL: Embracing amnesic replay for learning with noisy labels / Millunzi, Monica; Bonicelli, Lorenzo; Porrello, Angelo; Credi, Jacopo; Kolm, Petter N.; Calderara, Simone. - In: PATTERN RECOGNITION. - ISSN 0031-3203. - 179:(2026), pp. 113514-113514. [10.1016/j.patcog.2026.113514]
EARL: Embracing amnesic replay for learning with noisy labels
Monica Millunzi; Lorenzo Bonicelli; Angelo Porrello; Jacopo Credi; Simone Calderara
2026
Abstract
Modern Deep Neural Networks struggle to retain knowledge in streaming data environments, often leading to forgetting during incremental training. Most Continual Learning (CL) approaches address this issue by rehearsing past data – stored in a replay buffer – while acquiring new knowledge. However, in practical scenarios, noisy labels can contaminate the replay buffer, undermining performance. This work builds upon the previous work “May the Forgetting Be with You”, which was designed to tackle Continual Learning with Noisy Labels (CLN). By leveraging the distinct learning dynamics of correctly and incorrectly labeled examples, the method induces targeted forgetting to identify and filter out noisy labels. We propose EARL, which improves on its predecessor by introducing i) a detailed analysis of the learning dynamics occurring in the presence of noise, ii) a robust analysis under more realistic noise conditions, iii) an evaluation of performance using pre-trained backbones and modern prompt-based CL baselines, iv) a detailed study on the influence of different sampling strategies, and v) experiments on Natural Language Processing (NLP) benchmarks. This work unravels the motivations and findings of the previous research, shedding light on the effectiveness of its components in achieving high performance and minimizing forgetting.

| File | Size | Format |
|---|---|---|
| 1-s2.0-S0031320326004802-main.pdf (Open access; Type: VOR, version published by the publisher; License: Creative Commons) | 2.02 MB | Adobe PDF |
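The abstract's core idea – that correctly and incorrectly labeled examples exhibit different learning dynamics, which can be used to filter noise out of the replay buffer – can be illustrated with a minimal sketch. The snippet below is not EARL's actual algorithm; it is a simplified small-loss-style heuristic under the assumption that mislabeled samples remain persistently hard to fit, with all function and variable names being hypothetical.

```python
import numpy as np

def flag_noisy_by_loss_dynamics(loss_history, noise_fraction):
    """Flag the samples with the highest mean loss across epochs as
    likely mislabeled (illustrative small-loss heuristic, not EARL).

    loss_history: array of shape (num_epochs, num_samples)
    noise_fraction: assumed fraction of noisy labels in the buffer
    Returns a boolean mask that is True for samples kept as clean.
    """
    mean_loss = loss_history.mean(axis=0)  # per-sample loss averaged over epochs
    num_noisy = int(round(noise_fraction * mean_loss.size))
    clean_mask = np.ones(mean_loss.size, dtype=bool)
    if num_noisy > 0:
        # Samples the network consistently fails to fit are flagged as noisy.
        clean_mask[np.argsort(mean_loss)[-num_noisy:]] = False
    return clean_mask

# Toy example: 8 samples over 5 epochs; the last two keep a high loss.
rng = np.random.default_rng(0)
losses = rng.uniform(0.0, 0.3, size=(5, 8))
losses[:, 6:] += 2.0  # persistently high loss -> likely mislabeled
mask = flag_noisy_by_loss_dynamics(losses, noise_fraction=0.25)
print(mask)  # last two samples flagged as noisy
```

In a rehearsal setting, such a mask would decide which buffer examples are replayed; the paper's contribution lies in exploiting richer learning (and forgetting) dynamics than this single averaged-loss statistic.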