This paper presents a learning model for multitask pattern recognition (MTPR) that consists of several neural classifiers, long-term memories, and a detector of task changes. In the MTPR problem, several multi-class classification tasks are given to the learning model sequentially, without notification of their task identities. The model must therefore detect task changes on its own and utilize knowledge of previous tasks when learning new ones. In addition, the model must acquire knowledge of multiple tasks incrementally, without catastrophic forgetting, under the condition that not only tasks but also training samples arrive sequentially. The proposed model is evaluated on two artificial MTPR problems. The experiments verify that the proposed model acquires and accumulates task knowledge stably and that knowledge transfer accelerates the learning of new tasks.
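The sequential setting described above can be sketched in code. The following is an illustrative toy, not the paper's actual method: the nearest-centroid classifier, the accuracy-window change detector, and every name here (`NearestCentroid`, `TaskChangeDetector`, `run`, the `window` and `threshold` parameters) are assumptions introduced for this sketch. It only shows the structure of the problem: samples stream in one at a time, task boundaries are unlabeled, a detector flags a change when recent accuracy collapses, and the current classifier is then consolidated into long-term memory.

```python
from collections import deque

class NearestCentroid:
    """Toy incremental classifier: one running centroid per class label."""
    def __init__(self):
        self.centroids = {}  # label -> (sum_vector, count)

    def predict(self, x):
        if not self.centroids:
            return None
        def dist(item):
            s, n = item[1]
            c = [v / n for v in s]
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids.items(), key=dist)[0]

    def learn(self, x, y):
        s, n = self.centroids.get(y, ([0.0] * len(x), 0))
        self.centroids[y] = ([a + b for a, b in zip(s, x)], n + 1)

class TaskChangeDetector:
    """Signals a task change when windowed accuracy drops below a threshold."""
    def __init__(self, window=20, threshold=0.4):
        self.recent = deque(maxlen=window)
        self.threshold = threshold

    def update(self, correct):
        self.recent.append(1.0 if correct else 0.0)
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough evidence yet
        return sum(self.recent) / len(self.recent) < self.threshold

def run(stream):
    """Sequential loop: predict, detect a task change, consolidate to memory."""
    memory = []                      # long-term memory of past-task classifiers
    clf = NearestCentroid()
    detector = TaskChangeDetector()
    changes = 0
    for x, y in stream:
        pred = clf.predict(x)
        if detector.update(pred == y):
            memory.append(clf)       # store knowledge of the finished task
            clf = NearestCentroid()  # start fresh for the new task
            detector = TaskChangeDetector()
            changes += 1
        clf.learn(x, y)
    return changes

# Two "tasks": the second reverses the labels of the first, so the old
# classifier suddenly fails and the detector should fire exactly once.
task_a = [((0.0, 0.0), 0), ((1.0, 1.0), 1)] * 50
task_b = [((0.0, 0.0), 1), ((1.0, 1.0), 0)] * 50
print(run(task_a + task_b))  # → 1
```

A real MTPR model would also recall a stored classifier when a previously seen task returns and transfer its knowledge into the new classifier; the sketch omits both to stay minimal.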