kkirchheim/pytorch-ood

👽 Out-of-Distribution Detection with PyTorch

Score: 60 / 100 — Established

This tool helps machine learning engineers and researchers validate the robustness of their deep neural networks. It takes a trained PyTorch model and new, potentially unfamiliar data, and identifies inputs that are out-of-distribution, i.e. unlike anything the model was trained on. The output is a score for each input indicating how likely it is to be an outlier, which helps you build more reliable AI systems.
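To make the per-input outlier score concrete, here is a minimal stdlib-only sketch of the Maximum Softmax Probability (MSP) baseline, one of the simplest detectors of the kind pytorch-ood implements. This illustrates the scoring idea only; it is not pytorch-ood's actual API, and the example logits are made up.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def msp_outlier_score(logits):
    # MSP baseline: a confident prediction yields a high maximum
    # softmax probability, so its negative serves as an outlier
    # score (higher score = more likely out-of-distribution).
    return -max(softmax(logits))

confident = [9.0, 0.5, 0.2]   # peaked logits: looks in-distribution
uncertain = [1.1, 1.0, 0.9]   # flat logits: possibly out-of-distribution

assert msp_outlier_score(uncertain) > msp_outlier_score(confident)
```

In practice you would threshold such scores (calibrated on held-out in-distribution data) to decide which inputs to flag or reject.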

335 stars. Available on PyPI.

Use this if you need your deep learning models to reliably flag inputs that fall outside their training distribution, rather than make confident but incorrect predictions on novel data.

Not ideal if you are looking for a general-purpose anomaly detection tool for non-image or non-deep learning data types, or if you don't work with PyTorch.

deep-learning-reliability model-validation anomaly-detection open-set-recognition machine-learning-engineering
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 15 / 25


Stars: 335
Forks: 32
Language: Python
License: Apache-2.0
Last pushed: Jan 19, 2026
Commits (30d): 0
Dependencies: 5

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kkirchheim/pytorch-ood"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.