December 3, 2021 09:40

Abstract

Title: Label-noise Learning Beyond Class-conditional Noise

Abstract:
The label-noise problem belongs to inaccurate supervision, one of the three typical types of weak supervision. Label noise may exist in many real-world applications where budgets for labeling raw data are limited. However, the famous class-conditional noise (CCN) model, which assumes that the label corruption process (namely, the class-label flipping probability that corrupts the class-posterior probability) is instance-independent and only class-dependent, is not expressive enough to model real-world label noise, and thus we need to go beyond it. This talk will introduce our recent advances in robust learning against label noise when the noise is significantly harder than CCN. Specifically, two general noise models and the corresponding learning methods will be covered in the talk. The first is called instance-dependent noise (IDN), where the label flipping probability is conditioned not only on the true label but also on the instance itself; IDN is non-identifiable and needs to be approximated with additional assumptions and/or information. The second is called mutually contaminated distributions (MCD), where what has been corrupted is the class-conditional density for sampling instances rather than the class-posterior probability for labeling instances. The learning methods for handling IDN and MCD show that label-noise learning beyond CCN is at least possible, and hopefully new methods will make it more and more practical.
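
As a rough sketch of how the three noise models differ in the binary case (the transition matrix T and the mixing proportions \alpha, \beta below are assumed notation for illustration, not taken from the talk):

\[
\text{CCN: } \Pr(\tilde{Y}=j \mid Y=i,\, X=x) = T_{ij}
\qquad
\text{IDN: } \Pr(\tilde{Y}=j \mid Y=i,\, X=x) = T_{ij}(x)
\]
\[
\text{MCD: }\;
\tilde{p}(x \mid \tilde{Y}=+1) = (1-\alpha)\, p(x \mid Y=+1) + \alpha\, p(x \mid Y=-1),
\quad
\tilde{p}(x \mid \tilde{Y}=-1) = \beta\, p(x \mid Y=+1) + (1-\beta)\, p(x \mid Y=-1)
\]

In CCN and IDN the corruption acts on the labeling (class-posterior) side, with IDN letting the flip probability vary with the instance x, whereas in MCD the observed class-conditional densities themselves are mixtures of the clean ones, which is why the two settings call for different learning methods.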

Bio:
Gang Niu is currently a research scientist (indefinite-term) at the RIKEN Center for Advanced Intelligence Project.
https://niug1984.github.io/

More Information

Date January 11, 2022 (Tue) 16:00 - 17:00
URL https://c5dc59ed978213830355fc8978.doorkeeper.jp/events/130696