%0 Conference Proceedings
%T Class-conditional Importance Weighting for Deep Learning with Noisy Labels
%A Bhalaji Nagarajan
%A Ricardo Marques
%A Marcos Mejia
%A Petia Radeva
%B 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications
%D 2022
%V 5
%F Bhalaji Nagarajan2022
%O MILAB; not mentioned
%O Exported from refbase (http://158.109.8.37/show.php?record=3798), last updated on Mon, 24 Apr 2023 15:11:06 +0200
%X Large-scale, accurately labelled data is essential for training Deep Neural Networks and ensuring high performance. However, creating a clean dataset is expensive, since it usually relies on human annotation. For this reason, the labelling process is often made cheaper at the cost of introducing noisy labels. Learning with Noisy Labels is an active and very challenging area of research. Recent advances in self-supervised learning and robust loss functions have helped to advance noisy-label research. In this paper, we propose a loss correction method that relies on dynamic weights computed during model training. We extend the existing Contrast to Divide algorithm, coupled with DivideMix, using a new class-conditional weighting scheme. We validate the method on standard noise experiments and achieve encouraging results.
%K Noisy Labeling
%K Loss Correction
%K Class-conditional Importance Weighting
%K Learning with Noisy Labels
%U https://www.scitepress.org/Link.aspx?doi=10.5220/0010996400003124
%U http://dx.doi.org/10.5220/0010996400003124
%P 679-686