TY - GEN
T1 - Evolutionary NAS in Light of Model Stability for Accurate Continual Learning
AU - Du, Xiaocong
AU - Li, Zheng
AU - Sun, Jingbo
AU - Liu, Frank
AU - Cao, Yu
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - Continual learning, the capability to learn new knowledge from streaming data without forgetting previous knowledge, is a critical requirement for dynamic learning systems, especially for emerging edge devices such as self-driving cars and drones. However, continual learning still faces the catastrophic forgetting problem. Previous work illustrates that model performance in continual learning depends not only on the learning algorithm but also strongly on the inherited model, i.e., the model from which continual learning starts. The better the stability of the inherited model, the less catastrophic forgetting occurs; thus, the inherited model should be carefully selected. Inspired by this finding, we develop an evolutionary neural architecture search (ENAS) algorithm that emphasizes the Stability of the inherited model, namely ENAS-S. ENAS-S aims to find optimal architectures for accurate continual learning on edge devices. On CIFAR-10 and CIFAR-100, we show that ENAS-S discovers competitive architectures with less catastrophic forgetting and smaller model size when learning from a data stream, as compared with handcrafted DNNs.
AB - Continual learning, the capability to learn new knowledge from streaming data without forgetting previous knowledge, is a critical requirement for dynamic learning systems, especially for emerging edge devices such as self-driving cars and drones. However, continual learning still faces the catastrophic forgetting problem. Previous work illustrates that model performance in continual learning depends not only on the learning algorithm but also strongly on the inherited model, i.e., the model from which continual learning starts. The better the stability of the inherited model, the less catastrophic forgetting occurs; thus, the inherited model should be carefully selected. Inspired by this finding, we develop an evolutionary neural architecture search (ENAS) algorithm that emphasizes the Stability of the inherited model, namely ENAS-S. ENAS-S aims to find optimal architectures for accurate continual learning on edge devices. On CIFAR-10 and CIFAR-100, we show that ENAS-S discovers competitive architectures with less catastrophic forgetting and smaller model size when learning from a data stream, as compared with handcrafted DNNs.
KW - continual learning
KW - deep neural network
KW - neural architecture search
KW - online learning
UR - http://www.scopus.com/inward/record.url?scp=85116468042&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85116468042&partnerID=8YFLogxK
U2 - 10.1109/IJCNN52387.2021.9534079
DO - 10.1109/IJCNN52387.2021.9534079
M3 - Conference contribution
AN - SCOPUS:85116468042
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
Y2 - 18 July 2021 through 22 July 2021
ER -