pith. machine review for the scientific record.

arxiv: 1712.05055 · v2 · submitted 2017-12-14 · 💻 cs.CV

Recognition: unknown

MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels

Authors on Pith: no claims yet
classification 💻 cs.CV
keywords: mentornet, curriculum, deep, labels, networks, corrupted, studentnet, training
original abstract

Recent deep networks are capable of memorizing the entire data even when the labels are completely random. To overcome overfitting on corrupted labels, we propose a novel technique of learning another neural network, called MentorNet, to supervise the training of the base deep network, namely, StudentNet. During training, MentorNet provides a curriculum (sample weighting scheme) for StudentNet to focus on samples whose labels are probably correct. Unlike existing curricula, which are usually predefined by human experts, MentorNet learns a data-driven curriculum dynamically with StudentNet. Experimental results demonstrate that our approach can significantly improve the generalization performance of deep networks trained on corrupted training data. Notably, to the best of our knowledge, we achieve the best-published result on WebVision, a large benchmark containing 2.2 million images with real-world noisy labels. The code is at https://github.com/google/mentornet
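The core idea in the abstract — a mentor that re-weights samples so that probably-clean labels dominate the student's gradient update — can be sketched in a few lines. The sketch below is a toy illustration, not the paper's learned MentorNet: the hard loss threshold is the self-paced baseline that MentorNet generalizes with a trained network, and the function names and threshold value are illustrative assumptions.

```python
def mentor_weights(losses, threshold):
    """Toy curriculum: keep samples whose loss is at or below a
    threshold, on the assumption that high-loss samples are more
    likely to carry corrupted labels. (Self-paced baseline; the
    paper replaces this rule with a learned network.)"""
    return [1.0 if l <= threshold else 0.0 for l in losses]

def weighted_loss(losses, weights):
    """Weighted mean loss the StudentNet would minimize."""
    total = sum(w * l for w, l in zip(weights, losses))
    return total / max(sum(weights), 1.0)

# Example: three low-loss (likely clean) samples, one high-loss
# (likely mislabeled) sample.
losses = [0.2, 0.3, 0.25, 4.0]
w = mentor_weights(losses, threshold=1.0)
# The noisy sample gets weight 0, so it does not dominate the update.
```

In the paper, the mentor's weights are produced by a small network conditioned on training features (e.g., the sample's loss history and training progress), and that network is learned jointly with the student rather than fixed in advance.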

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Sharpness-Aware Minimization for Efficiently Improving Generalization

    cs.LG 2020-10 conditional novelty 6.0

    SAM solves a min-max problem to locate flat low-loss regions, improving generalization on CIFAR, ImageNet and label-noise tasks.
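The min-max step mentioned in this summary can be sketched directly: ascend to the worst-case weights within an L2 ball of radius rho, then descend using the gradient taken at that perturbed point. This is a minimal sketch of one SAM update under assumed hyperparameters (`lr`, `rho`) and a toy quadratic loss, not the cited paper's implementation.

```python
import math

def sam_step(w, grad_fn, lr=0.1, rho=0.05):
    """One Sharpness-Aware Minimization step (sketch):
    1) move to the (approximate) worst-case point within an L2 ball
       of radius rho around w, following the gradient direction;
    2) take the descent step at w using the gradient computed at
       that perturbed point."""
    g = grad_fn(w)
    norm = math.sqrt(sum(x * x for x in g)) + 1e-12
    w_adv = [wi + rho * gi / norm for wi, gi in zip(w, g)]
    g_sharp = grad_fn(w_adv)  # gradient at the perturbed point
    return [wi - lr * gi for wi, gi in zip(w, g_sharp)]

# Toy quadratic loss L(w) = 0.5 * ||w||^2, so grad(w) = w.
w = [1.0, -2.0]
for _ in range(100):
    w = sam_step(w, lambda v: v)
# w converges to a neighborhood of the minimum at the origin.
```

Minimizing the loss at the worst-case nearby point, rather than at the current weights, is what biases SAM toward flat low-loss regions.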