Awesome-Noisy-Labels
A curated companion list for the survey "Learning from Noisy Labels with Deep Neural Networks: A Survey".
https://github.com/songhwanjun/Awesome-Noisy-Labels

List of Papers with Categorization
A. [Robust Architecture](#content)
- Deep learning from noisy image labels with quality embedding
- Webly supervised learning of convolutional networks
- Training convolutional networks with noisy labels
- Learning deep networks from noisy labels with dropout regularization
- Training deep neural-networks based on unreliable labels
- Training deep neural-networks using a noise adaptation layer
- Learning from massive noisy labeled data for image classification
- Masking: A new perspective of noisy supervision
- Robust inference via generative classifiers for handling noisy labels
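
Most entries in this family make the architecture itself noise-aware, most directly by stacking a label-transition ("noise adaptation") layer on top of a base classifier, so training fits the noisy labels while the base network models the clean distribution. Below is a minimal PyTorch sketch of that idea; the class name, initialization scale, and training recipe are illustrative assumptions, not code from any listed paper.

```python
# Illustrative sketch of a noise adaptation layer (not official code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoiseAdaptationClassifier(nn.Module):
    """Wraps a base classifier with a learnable label-transition layer."""

    def __init__(self, base_model: nn.Module, num_classes: int):
        super().__init__()
        self.base_model = base_model
        # Log-space transition matrix, initialized near the identity so the
        # noise model starts from "labels are clean" (scale 5.0 is an assumption).
        self.transition_logits = nn.Parameter(5.0 * torch.eye(num_classes))

    def forward(self, x):
        clean_probs = F.softmax(self.base_model(x), dim=1)     # p(clean | x)
        transition = F.softmax(self.transition_logits, dim=1)  # row i: p(noisy | clean=i)
        noisy_probs = clean_probs @ transition                 # p(noisy | x)
        return clean_probs, noisy_probs
```

Training would minimize the negative log-likelihood of `noisy_probs` against the observed noisy labels, while inference reads off `clean_probs` from the base network.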

D. [Loss Adjustment](#content)
- Multiclass learning with partially corrupted labels
- Making deep neural networks robust to label noise: A loss correction approach
- Dual T: Reducing estimation error for transition matrix in label-noise learning
- Tackling Instance-Dependent Label Noise via a Universal Probabilistic Model
- Active Bias: Training more accurate neural networks by emphasizing high variance samples
- Dimensionality-driven learning with noisy labels
- Unsupervised label noise modeling and loss correction
- Self-adaptive training: beyond empirical risk minimization
- Error-bounded correction of noisy labels
- Beyond class-conditional assumption: A primary attempt to combat instance-dependent label noise
- Learning to learn from weak supervision by full supervision
- Learning from noisy labels with distillation
- Learning to reweight examples for robust deep learning
- Meta-Weight-Net: Learning an explicit mapping for sample weighting
- Distilling effective supervision from severe label noise
- Meta label correction for noisy label learning
- Using trusted data to train deep networks on labels corrupted by severe noise
- Are anchor points really indispensable in label-noise learning?
- Training deep neural networks on noisy labels with bootstrapping
- Adaptive Label Noise Cleaning with Meta-Supervision for Deep Face Recognition
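
A recurring loss-adjustment recipe in this list is forward loss correction: push the model's clean-label predictions through a known or estimated transition matrix, then compute the loss against the noisy labels. A minimal PyTorch sketch follows; the symmetric-noise matrix at the end is an illustrative assumption, not an estimate from any paper.

```python
# Illustrative sketch of forward loss correction (not official code).
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, noisy_targets, T):
    """Cross-entropy after pushing predictions through the noise model.

    logits: (N, C) logits for the clean-label distribution.
    T:      (C, C) transition matrix, T[i, j] = p(noisy = j | clean = i).
    """
    clean_probs = F.softmax(logits, dim=1)
    noisy_probs = clean_probs @ T          # predicted noisy-label distribution
    return F.nll_loss(noisy_probs.clamp_min(1e-8).log(), noisy_targets)

# Illustrative T for 40% symmetric noise over 10 classes (assumed values):
num_classes, noise_rate = 10, 0.4
T = torch.full((num_classes, num_classes), noise_rate / (num_classes - 1))
T.fill_diagonal_(1.0 - noise_rate)
```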

E. [Sample Selection](#content)
- O2U-Net: A simple noisy label detection approach for deep neural networks
- FINE Samples for Learning with Noisy Labels
- Sample Selection with Uncertainty of Losses for Learning with Noisy Labels
- MentorNet: Learning data-driven curriculum for very deep neural networks on corrupted labels
- Co-teaching: Robust training of deep neural networks with extremely noisy labels
- How does disagreement help generalization against label corruption?
- Jo-SRC: A Contrastive Approach for Combating Noisy Labels
- Iterative learning with open-set noisy labels
- Learning with bad training data via iterative trimmed loss minimization
- Understanding and utilizing deep neural networks trained with noisy labels
- How does early stopping help generalization against label noise?
- A topological filter for learning with label noise
- Learning with Instance-Dependent Label Noise: A Sample Sieve Approach
- SELFIE: Refurbishing unclean samples for robust deep learning
- SELF: Learning to filter noisy labels with self-ensembling
- DivideMix: Learning with noisy labels as semi-supervised learning
- Robust curriculum learning: from clean label detection to noisy label self-correction
- Understanding and Improving Early Stopping for Learning with Noisy Labels
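
Many of these selection methods build on the small-loss trick: because deep networks tend to fit clean labels before memorizing noisy ones, the samples with the smallest current loss are the most likely to be clean. A minimal sketch follows, with a Co-teaching-style pairing in the comments; the keep-ratio schedule is an assumption and is typically annealed over training.

```python
# Illustrative sketch of small-loss sample selection (not official code).
import torch
import torch.nn.functional as F

def select_small_loss(logits, targets, keep_ratio):
    """Return indices of the `keep_ratio` fraction of lowest-loss samples."""
    losses = F.cross_entropy(logits, targets, reduction="none")
    num_keep = max(1, int(keep_ratio * losses.numel()))
    return torch.argsort(losses)[:num_keep]   # smallest-loss samples first

# Co-teaching-style step (sketch): each network trains on the small-loss
# samples selected by its peer.
# idx_a = select_small_loss(net_a(x), y, keep_ratio)
# idx_b = select_small_loss(net_b(x), y, keep_ratio)
# loss_a = F.cross_entropy(net_a(x[idx_b]), y[idx_b])
# loss_b = F.cross_entropy(net_b(x[idx_a]), y[idx_a])
```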

B. [Robust Regularization](#content)
- Deep bilevel learning
- Learning from noisy labels by regularized estimation of annotator confusion
- Using pre-training can improve model robustness and uncertainty
- Can gradient clipping mitigate label noise?
- Wasserstein adversarial regularization (WAR) on label noise
- Robust early-learning: Hindering the memorization of noisy labels
- When Optimizing f-Divergence is Robust with Label Noise
- Learning with Noisy Labels via Sparse Regularization
- Open-set Label Noise Can Improve Robustness Against Inherent Label Noise
- Explaining and harnessing adversarial examples
- Regularizing neural networks by penalizing confident output distributions
- Mixup: Beyond empirical risk minimization
- Augmentation Strategies for Learning with Noisy Labels
- AutoDO: Robust AutoAugment for Biased Data With Label Noise via Scalable Probabilistic Implicit Differentiation
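
Among these regularizers, mixup is the one most widely reused in noisy-label pipelines: training on convex combinations of input pairs and their one-hot labels discourages memorization of individual corrupted examples. A minimal sketch follows; `alpha = 0.2` is an illustrative default, not a value prescribed by any listed paper.

```python
# Illustrative sketch of mixup regularization (not official code).
import torch
import torch.nn.functional as F

def mixup_batch(x, y, num_classes, alpha=0.2):
    """Return convex combinations of a batch with a shuffled copy of itself."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    y_soft = F.one_hot(y, num_classes).float()
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_mixed = lam * y_soft + (1.0 - lam) * y_soft[perm]
    return x_mixed, y_mixed

# Soft-target cross-entropy for the mixed batch:
# logits = model(x_mixed)
# loss = -(y_mixed * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```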

C. [Robust Loss Function](#content)
- Robust loss functions under label noise for deep neural networks
- Symmetric cross entropy for robust learning with noisy labels
- Generalized cross entropy loss for training deep neural networks with noisy labels
- An Information Fusion Approach to Learning with Instance-Dependent Label Noise
- Curriculum loss: Robust learning and generalization against label corruption
- Normalized loss functions for deep learning with noisy labels
- Peer loss functions: Learning from noisy labels without knowing noise rates
- Learning Cross-Modal Retrieval with Noisy Labels
- A Second-Order Approach to Learning With Instance-Dependent Label Noise
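
A representative loss from this family is the generalized cross entropy (GCE) loss, L_q(p_y) = (1 - p_y^q) / q, which interpolates between standard cross entropy (as q approaches 0) and the noise-tolerant MAE (at q = 1). A minimal sketch follows; q = 0.7 is the commonly reported default and should be treated as an assumption.

```python
# Illustrative sketch of the generalized cross entropy loss (not official code).
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, targets, q=0.7):
    """L_q loss: (1 - p_y^q) / q, recovering CE as q -> 0 and MAE at q = 1."""
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1).clamp_min(1e-8)
    return ((1.0 - p_y.pow(q)) / q).mean()
```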

Learning from Noisy Labels towards Realistic Scenarios

[Theoretical or Empirical Understanding](#content)
- Online Continual Learning on a Contaminated Data Stream with Blurry Task Boundaries
- Learning from Multiple Annotator Noisy Labels via Sample-wise Label Fusion
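
For orientation, the multi-annotator setting in the last entry can be contrasted with its simplest baseline: fusing each sample's annotator votes into a soft target. The sketch below is this generic baseline, not the paper's fusion method.

```python
# Illustrative majority/soft-vote baseline for multi-annotator labels
# (not the method of any listed paper).
import torch
import torch.nn.functional as F

def fuse_annotator_labels(annotator_labels, num_classes):
    """annotator_labels: (N, A) integer labels from A annotators per sample."""
    votes = F.one_hot(annotator_labels, num_classes).float()  # (N, A, C)
    soft_targets = votes.mean(dim=1)                          # normalized vote shares
    hard_targets = soft_targets.argmax(dim=1)                 # majority vote
    return soft_targets, hard_targets
```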