TY - GEN
T1 - Revealing the real-world applicable setting of online continual learning
AU - Xu, Zhenbo
AU - Hu, Haimiao
AU - Liu, Liu
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - The motivation of online continual learning (CL) is to train agents to learn from an infinite stream of data and quickly accommodate changes in the data distribution. However, current online CL datasets are synthesized from common classification datasets by splitting all classes into disjoint tasks, where the disjoint task streams have little temporal relation, resulting in a CL setting far from realistic. In this paper, we ask two questions: (i) What are the characteristics of real-world CL scenarios? (ii) How do existing methods perform in real-world CL scenarios? To answer the first question, we propose the first realistic CL setting, coined instance-based continual learning (IBCL). IBCL has no task or class boundaries and requires algorithms to predict and learn from instance streams simultaneously. The life cycles of classes under IBCL are dynamic, and instances belonging to the same class might evolve over time. For each sequentially arriving instance, algorithms are required to give the recognition result and then update based on its label. No additional training resources are available except for the instance stream during evaluation. To answer the second question, on CORe50 and mini-ImageNet, we compare current online CL methods under the IBCL setting with both the traditional ResNet18 backbone and the recent transformer-based backbone ViT. Three aspects of current methods are analyzed: recognition performance, latency, and memory usage. Experimental results show that current online CL methods perform poorly in real CL scenarios, and that methods using the transformer-based backbone outperform their CNN-based counterparts.
AB - The motivation of online continual learning (CL) is to train agents to learn from an infinite stream of data and quickly accommodate changes in the data distribution. However, current online CL datasets are synthesized from common classification datasets by splitting all classes into disjoint tasks, where the disjoint task streams have little temporal relation, resulting in a CL setting far from realistic. In this paper, we ask two questions: (i) What are the characteristics of real-world CL scenarios? (ii) How do existing methods perform in real-world CL scenarios? To answer the first question, we propose the first realistic CL setting, coined instance-based continual learning (IBCL). IBCL has no task or class boundaries and requires algorithms to predict and learn from instance streams simultaneously. The life cycles of classes under IBCL are dynamic, and instances belonging to the same class might evolve over time. For each sequentially arriving instance, algorithms are required to give the recognition result and then update based on its label. No additional training resources are available except for the instance stream during evaluation. To answer the second question, on CORe50 and mini-ImageNet, we compare current online CL methods under the IBCL setting with both the traditional ResNet18 backbone and the recent transformer-based backbone ViT. Three aspects of current methods are analyzed: recognition performance, latency, and memory usage. Experimental results show that current online CL methods perform poorly in real CL scenarios, and that methods using the transformer-based backbone outperform their CNN-based counterparts.
KW - Online Continual Learning
KW - Recognition and Classification
UR - https://www.scopus.com/pages/publications/85143611837
U2 - 10.1109/MMSP55362.2022.9948735
DO - 10.1109/MMSP55362.2022.9948735
M3 - Conference contribution
AN - SCOPUS:85143611837
T3 - 2022 IEEE 24th International Workshop on Multimedia Signal Processing, MMSP 2022
BT - 2022 IEEE 24th International Workshop on Multimedia Signal Processing, MMSP 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 24th IEEE International Workshop on Multimedia Signal Processing, MMSP 2022
Y2 - 26 September 2022 through 28 September 2022
ER -