Authors: Li, Ziwei; Xu, Jiayi; Chao, Wei-Lun; Shen, Han-Wei
Editors: Bujack, Roxana; Archambault, Daniel; Schreck, Tobias
Date: 2023-06-10
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.14842
URL: https://diglib.eg.org:443/handle/10.1111/cgf14842

Abstract: Task-incremental learning (Task-IL) aims to enable an intelligent agent to continuously accumulate knowledge from new learning tasks without catastrophically forgetting what it has learned in the past. It has drawn increasing attention in recent years, and many algorithms have been proposed to mitigate forgetting in neural networks, yet none of the existing strategies completely eliminates the problem. Moreover, explaining and fully understanding what knowledge is forgotten, and how, during incremental learning remains under-explored. In this paper, we propose KnowledgeDrift, a visual analytics framework for interpreting network forgetting, with three objectives: (1) to identify when the network fails to memorize past knowledge, (2) to visualize what information has been forgotten, and (3) to diagnose how knowledge attained in the new model interferes with knowledge learned in the past. Our analytical framework first identifies the occurrence of forgetting by tracking task performance over the incremental learning process, and then provides in-depth inspection of drifted information at various levels of data granularity. KnowledgeDrift allows analysts and model developers to deepen their understanding of network forgetting and to compare the performance of different incremental learning algorithms.
Three case studies are conducted in the paper to further provide insights and guidance for users to effectively diagnose catastrophic forgetting over time.

Title: Visual Analytics on Network Forgetting for Task-Incremental Learning
License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Computing methodologies -> Visual analytics; Theory of computation -> Continual learning
DOI: 10.1111/cgf.14842
Pages: 437-448 (12 pages)
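The abstract's first analysis step, flagging when forgetting occurs by tracking per-task performance across incremental training steps, can be sketched as follows. This is a minimal illustration only: the accuracy matrix layout, the drop threshold, and the function name are assumptions for the sketch, not the paper's implementation.

```python
# Sketch of forgetting detection via per-task accuracy tracking.
# All names and the 0.1 threshold are illustrative assumptions,
# not the KnowledgeDrift implementation.

def detect_forgetting(acc_history, threshold=0.1):
    """acc_history[s][t] = accuracy on task t after training step s
    (only tasks t <= s have been seen). Returns (step, task, drop)
    triples where accuracy fell more than `threshold` below its peak."""
    events = []
    peaks = {}  # best accuracy observed so far for each task
    for step, accs in enumerate(acc_history):
        for task, acc in enumerate(accs):
            peaks[task] = max(peaks.get(task, acc), acc)
            drop = peaks[task] - acc
            if drop > threshold:
                events.append((step, task, round(drop, 3)))
    return events

# Example: task 0 degrades as later tasks are learned.
history = [
    [0.95],              # after training on task 0
    [0.90, 0.93],        # after task 1: slight drift on task 0
    [0.60, 0.91, 0.94],  # after task 2: large drop on task 0
]
print(detect_forgetting(history))  # [(2, 0, 0.35)]
```

Flagged (step, task) pairs are exactly the points an analyst would then inspect at finer data granularity.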