coteaching#

Implements the co-teaching algorithm for training neural networks on noisily-labeled data (Han et al., 2018). This module requires PyTorch (https://pytorch.org/get-started/locally/). An example that uses this algorithm with cleanlab to achieve state-of-the-art performance on CIFAR-10 for learning with noisy labels is provided at: https://github.com/cleanlab/examples/

cifar_cnn.py provides an example model that can be trained via this algorithm.

Functions:

loss_coteaching(y_1, y_2, t, forget_rate[, ...])
    Co-Teaching Loss function.

initialize_lr_scheduler([lr, epochs, ...])
    Initializes per-epoch learning-rate and beta schedules for the Adam optimizer.

adjust_learning_rate(optimizer, epoch, ...)
    Applies the scheduled learning rate and betas to the Adam optimizer at the given epoch.

forget_rate_scheduler(epochs, forget_rate, ...)
    Tells Co-Teaching what fraction of examples to forget at each epoch.

train(train_loader, epoch, model1, ...)
    PyTorch training function.

evaluate(test_loader, model1, model2)
    Evaluates both models on the test data.
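
These functions compose into a standard co-teaching training loop. A minimal sketch follows; model1, model2, the data loaders, the extra args fields (forget_rate, num_gradual, exponent), and the return values of initialize_lr_scheduler and evaluate are assumptions for illustration, not guarantees of this API:

    # Hypothetical end-to-end loop composing this module's functions.
    # Assumed (not guaranteed by these docs): initialize_lr_scheduler returns
    # the (alpha_plan, beta1_plan) pair that adjust_learning_rate consumes,
    # and evaluate returns one test accuracy per model.
    import torch
    from cleanlab.experimental.coteaching import (
        initialize_lr_scheduler,
        adjust_learning_rate,
        forget_rate_scheduler,
        train,
        evaluate,
    )

    # model1 / model2: any nn.Module pair defining __init__ and forward(self, x)
    optimizer1 = torch.optim.Adam(model1.parameters(), lr=args.lr)
    optimizer2 = torch.optim.Adam(model2.parameters(), lr=args.lr)

    alpha_plan, beta1_plan = initialize_lr_scheduler(
        lr=args.lr, epochs=args.epochs, epoch_decay_start=80
    )
    forget_rate_schedule = forget_rate_scheduler(
        args.epochs, args.forget_rate, args.num_gradual, args.exponent
    )

    for epoch in range(args.epochs):
        # Set this epoch's scheduled learning rate and betas on both optimizers.
        adjust_learning_rate(optimizer1, epoch, alpha_plan, beta1_plan)
        adjust_learning_rate(optimizer2, epoch, alpha_plan, beta1_plan)
        train(
            train_loader, epoch, model1, optimizer1, model2, optimizer2,
            args, forget_rate_schedule, class_weights=None, accuracy=accuracy,
        )
        acc1, acc2 = evaluate(test_loader, model1, model2)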

cleanlab.experimental.coteaching.loss_coteaching(y_1, y_2, t, forget_rate, class_weights=None)[source]#

Co-Teaching Loss function.

Parameters:
  • y_1 (torch.Tensor) – Output logits from model 1

  • y_2 (torch.Tensor) – Output logits from model 2

  • t (np.ndarray) – Array of noisy labels (t means targets)

  • forget_rate (float) – Decimal between 0 and 1 specifying what fraction of examples the models forget (exclude from the loss); controls how quickly the models forget what they learn. Pass rate_schedule[epoch] for this value (see the sketch after this list).

  • class_weights (torch.Tensor, shape (num_classes x 1), Default: None) – A torch.Tensor of length num_classes containing a weight for each class
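
A hypothetical call inside one training step is sketched below. It assumes, following Han et al. (2018), that the function returns one loss per model, each computed only on the small-loss examples selected by the peer network; logits1, logits2, noisy_labels, rate_schedule, and the optimizers are illustrative names:

    # Hypothetical batch step; assumes loss_coteaching returns one loss per
    # model, computed on the small-loss examples its peer network selected.
    logits1 = model1(images)  # shape (batch_size, num_classes)
    logits2 = model2(images)
    loss_1, loss_2 = loss_coteaching(
        logits1, logits2, noisy_labels, forget_rate=rate_schedule[epoch]
    )
    optimizer1.zero_grad()
    loss_1.backward()
    optimizer1.step()
    optimizer2.zero_grad()
    loss_2.backward()
    optimizer2.step()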

cleanlab.experimental.coteaching.initialize_lr_scheduler(lr=0.001, epochs=250, epoch_decay_start=80)[source]#

Initializes per-epoch learning-rate and beta schedules for the Adam optimizer.
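
A short sketch, under the assumption that the function returns the per-epoch alpha_plan (learning rates) and beta1_plan (first Adam betas) consumed by adjust_learning_rate below:

    # Assumed return value: per-epoch schedules for Adam's lr and beta1.
    alpha_plan, beta1_plan = initialize_lr_scheduler(
        lr=0.001, epochs=250, epoch_decay_start=80
    )
    # Presumably the learning rate is held at `lr` until epoch_decay_start
    # and decayed afterwards.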

cleanlab.experimental.coteaching.adjust_learning_rate(optimizer, epoch, alpha_plan, beta1_plan)[source]#

Applies the scheduled learning rate and betas to the Adam optimizer at the given epoch.
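
Typically called once per optimizer at the start of each epoch; a hypothetical usage, assuming the update is applied in place to the optimizer's parameter groups:

    # Apply this epoch's scheduled lr/betas in place on an Adam optimizer.
    adjust_learning_rate(optimizer1, epoch, alpha_plan, beta1_plan)
    # Expected effect: optimizer1.param_groups[0]["lr"] == alpha_plan[epoch]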

cleanlab.experimental.coteaching.forget_rate_scheduler(epochs, forget_rate, num_gradual, exponent)[source]#

Tells Co-Teaching what fraction of examples to forget at each epoch.
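
For example, assuming (per Han et al., 2018) that the schedule ramps from 0 up to forget_rate over the first num_gradual epochs and then stays constant:

    # Assumed behavior: ramp from 0 to forget_rate over num_gradual epochs,
    # then hold forget_rate for the remaining epochs.
    rate_schedule = forget_rate_scheduler(
        epochs=250, forget_rate=0.2, num_gradual=10, exponent=1
    )
    # rate_schedule[epoch] is the value loss_coteaching expects as forget_rate.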

cleanlab.experimental.coteaching.train(train_loader, epoch, model1, optimizer1, model2, optimizer2, args, forget_rate_schedule, class_weights, accuracy)[source]#

PyTorch training function.

Parameters:
  • train_loader (torch.utils.data.DataLoader) –

  • epoch (int) –

  • model1 (PyTorch class inheriting nn.Module) – Must define __init__ and forward(self, x)

  • optimizer1 (torch.optim.Adam) –

  • model2 (PyTorch class inheriting nn.Module) – Must define __init__ and forward(self, x)

  • optimizer2 (torch.optim.Adam) –

  • args (argparse.Namespace, as returned by parser.parse_args()) – Must contain num_iter_per_epoch, print_freq, and epochs

  • forget_rate_schedule (np.ndarray of length epochs) – Tells the Co-Teaching loss what fraction of examples to forget at each epoch.

  • class_weights (torch.Tensor, shape (num_classes x 1), Default: None) – A torch.Tensor of length num_classes containing a weight for each class

  • accuracy (function) – A function of the form accuracy(output, target, topk=(1,)) for computing top-1 and top-5 accuracy given outputs and true targets. A minimal sketch follows this list.
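
This module does not provide the accuracy function itself; one common implementation matching the required signature, adapted from the standard PyTorch ImageNet example, is sketched here:

    import torch

    def accuracy(output, target, topk=(1,)):
        """Compute top-k accuracies (in percent) of logits vs. integer targets."""
        maxk = max(topk)
        batch_size = target.size(0)
        # Indices of the k highest-scoring classes per example: (batch, maxk)
        _, pred = output.topk(maxk, dim=1, largest=True, sorted=True)
        pred = pred.t()  # (maxk, batch)
        correct = pred.eq(target.view(1, -1).expand_as(pred))
        res = []
        for k in topk:
            correct_k = correct[:k].reshape(-1).float().sum(0)
            res.append(correct_k.mul_(100.0 / batch_size))
        return res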

cleanlab.experimental.coteaching.evaluate(test_loader, model1, model2)[source]#
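
Evaluates both models on the test data. A hypothetical usage, assuming the function returns one test-set accuracy per model:

    # Assumed return value: one test accuracy (percent) per model.
    acc1, acc2 = evaluate(test_loader, model1, model2)
    print(f"model1: {acc1:.2f}%  model2: {acc2:.2f}%")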