@Misc{MertKilickaya2023,
  author   = "Mert Kilickaya and Joost van de Weijer and Yuki M. Asano",
  title    = "Towards Label-Efficient Incremental Learning: A Survey",
  year     = "2023",
  abstract = "The current dominant paradigm when building a machine learning model is to iterate over a dataset over and over until convergence. Such an approach is non-incremental, as it assumes access to all images of all categories at once. However, for many applications, non-incremental learning is unrealistic. To that end, researchers study incremental learning, where a learner is required to adapt to an incoming stream of data with a varying distribution while preventing forgetting of past knowledge. Significant progress has been made; however, the vast majority of works focus on the fully supervised setting, making these algorithms label-hungry and thus limiting their real-life deployment. In this paper, we make the first attempt to survey the recently growing interest in label-efficient incremental learning. We identify three subdivisions, namely semi-supervised, few-shot, and self-supervised learning, to reduce labeling efforts. Finally, we identify novel directions that can further enhance label-efficiency and improve incremental learning scalability. Project website: this https URL.",
  optnote  = "LAMP",
  opturl   = "https://arxiv.org/abs/2302.00353",
  file     = ":http://158.109.8.37/files/KWA2023.pdf:PDF"
}