aws-samples/amazon-sagemaker-ab-testing-pipeline

Amazon SageMaker MLOps deployment pipeline for A/B Testing of machine learning models.

Quality score: 43 / 100 (Emerging)

This project helps data scientists and machine learning engineers set up a robust system for A/B testing different machine learning models in a live environment. It takes trained machine learning models as input and provides an organized way to deploy, manage, and compare their performance in production to determine which model is most effective. This allows users to confidently roll out new or improved models, minimizing risk and maximizing impact.

No commits in the last 6 months.

Use this if you need to reliably test and compare multiple machine learning models in a production environment to decide which one performs best before full deployment.

Not ideal if you are looking for a simple, one-off model deployment without the need for comparative testing or complex MLOps infrastructure.

Tags: A/B testing, machine learning operations, model deployment, data science workflow, model experimentation
Badges: Stale (6m), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 19 / 25


Stars: 45
Forks: 17
Language: Python
License: MIT-0
Last pushed: Jun 07, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mlops/aws-samples/amazon-sagemaker-ab-testing-pipeline"

The API is open to everyone at 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
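As a minimal sketch, the same endpoint can also be queried from Python using only the standard library. The URL layout (category/owner/repo path segments) is inferred from the curl example above, and the JSON response shape is an assumption, not documented here:

```python
import json
import urllib.request

# Base URL taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository
    (path layout inferred from the example URL)."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report.
    The response schema is an assumption; inspect it before relying on fields."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("mlops", "aws-samples", "amazon-sagemaker-ab-testing-pipeline")
```

Calling `fetch_quality(...)` performs a live request against the public, unkeyed tier, so keep the 100 requests/day limit in mind.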