Class-balanced-loss-pytorch and class-balanced-loss
These are independent implementations of the same academic paper, with no dependencies between them, making them direct alternatives for the same use case: addressing class imbalance when training image classifiers.
About Class-balanced-loss-pytorch
vandit15/Class-balanced-loss-pytorch
PyTorch implementation of the paper "Class-Balanced Loss Based on Effective Number of Samples"
This helps deep learning engineers train image classification models more effectively when some categories have far fewer examples than others. It takes your model's predictions and the true labels, then applies a loss function whose per-class weights are derived from each class's "effective number of samples" rather than its raw count. The result is a more balanced model that performs better across all classes, especially the rare ones. It's aimed at machine learning practitioners building and training computer vision models.
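To make the weighting scheme concrete, here is a minimal sketch of the paper's class-balanced weights in plain Python. The function name `class_balanced_weights` is hypothetical (it is not the API of either repository); it only illustrates the formula, where the effective number of samples for a class with n examples is (1 - beta**n) / (1 - beta) and each class is weighted by its inverse.

```python
def class_balanced_weights(samples_per_class, beta=0.9999):
    """Illustrative helper (not either repo's API): one weight per class,
    inversely proportional to the effective number of samples,
    normalized so the weights sum to the number of classes."""
    # Effective number of samples per class: (1 - beta^n) / (1 - beta)
    effective_num = [(1.0 - beta ** n) / (1.0 - beta) for n in samples_per_class]
    # Class weights are the inverse of the effective number
    weights = [1.0 / e for e in effective_num]
    # Normalize so the weights sum to the class count, as in the paper
    num_classes = len(samples_per_class)
    total = sum(weights)
    return [w * num_classes / total for w in weights]

# A head class with 1000 images vs. a tail class with 10:
weights = class_balanced_weights([1000, 10], beta=0.999)
# The tail class receives a much larger weight than the head class.
```

In a PyTorch training loop, such weights would typically be passed to a weighted loss (e.g. the `weight` argument of `torch.nn.CrossEntropyLoss`) so that gradients from rare classes are amplified.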
About class-balanced-loss
richardaecn/class-balanced-loss
Class-Balanced Loss Based on Effective Number of Samples. CVPR 2019
When training image classification models, this project helps scientists and machine learning practitioners overcome the common problem of imbalanced datasets, where some categories have many more examples than others. It takes a dataset such as CIFAR or iNaturalist and applies the paper's 'class-balanced' weighting to the training loss. The outcome is a more accurate classification model, especially for the rare categories.