%0 Conference Proceedings
%T ScrollNet: Dynamic Weight Importance for Continual Learning
%A Fei Yang
%A Kai Wang
%A Joost van de Weijer
%B Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops
%D 2023
%F Fei Yang2023
%O LAMP
%O exported from refbase (http://158.109.8.37/show.php?record=3945), last updated on Thu, 25 Jan 2024 18:40:20 +0100
%X The principle underlying most existing continual learning (CL) methods is to prioritize stability by penalizing changes in parameters crucial to old tasks, while allowing for plasticity in other parameters. The importance of weights for each task can be determined either explicitly through learning a task-specific mask during training (e.g., parameter isolation-based approaches) or implicitly by introducing a regularization term (e.g., regularization-based approaches). However, all these methods assume that the importance of weights for each task is unknown prior to data exposure. In this paper, we propose ScrollNet as a scrolling neural network for continual learning. ScrollNet can be seen as a dynamic network that assigns the ranking of weight importance for each task before data exposure, thus achieving a more favorable stability-plasticity tradeoff during sequential task learning by reassigning this ranking for different tasks. Additionally, we demonstrate that ScrollNet can be combined with various CL methods, including regularization-based and replay-based approaches. Experimental results on CIFAR100 and TinyImageNet datasets show the effectiveness of our proposed method.
%U https://openaccess.thecvf.com/content/ICCV2023W/VCL/html/Yang_ScrollNet_DynamicWeight_Importance_for_Continual_Learning_ICCVW_2023_paper.html
%U http://158.109.8.37/files/WWW2023.pdf
%P 3345-3355