Catastrophic forgetting is the central challenge in Exemplar-Free Class Incremental Learning (EFCIL), where replaying old data from previous tasks is prohibited by factors such as user privacy and device capacity limitations. In this paper, we propose a Comprehensive Ensemble Framework for exemplar-free Class Incremental Learning (CEFCIL), which comprises an ensemble Nearest Class Mean (NCM) classifier based on the Mahalanobis metric over a given number of diversified base networks, a cached root model consisting of initialized base networks for root knowledge distillation, a dual knowledge distillation strategy, and a dimensional collapse prevention strategy. Across diverse experimental conditions, CEFCIL achieves superior EFCIL performance and exhibits robust cross-domain capability.
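To make the classifier component concrete, the following is a minimal sketch of a Nearest Class Mean classifier under the Mahalanobis metric, the building block the ensemble above is based on. The class name, the shared (pooled) covariance estimate, and the regularization constant are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class MahalanobisNCM:
    """Sketch of an NCM classifier using Mahalanobis distance.

    Assumes a single covariance matrix pooled across classes; the
    regularizer 1e-3 is an arbitrary choice for numerical stability.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Per-class mean (prototype) vectors.
        self.means_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        # Pooled covariance of features around their class means.
        centered = X - self.means_[np.searchsorted(self.classes_, y)]
        cov = centered.T @ centered / len(X) + 1e-3 * np.eye(X.shape[1])
        self.prec_ = np.linalg.inv(cov)  # precision (inverse covariance)
        return self

    def predict(self, X):
        # Squared Mahalanobis distance from each sample to each class mean.
        d = X[:, None, :] - self.means_[None, :, :]            # (n, k, f)
        dist = np.einsum('nkf,fg,nkg->nk', d, self.prec_, d)   # (n, k)
        return self.classes_[dist.argmin(axis=1)]              # nearest mean
```

In an ensemble setting, one such classifier would operate on the feature space of each diversified base network, with their decisions combined downstream.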
The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).