aws-samples/amazon-sagemaker-ab-testing-pipeline
Amazon SageMaker MLOps deployment pipeline for A/B Testing of machine learning models.
This project helps data scientists and machine learning engineers set up a robust system for A/B testing machine learning models in a live environment. It takes trained models as input and provides an organized way to deploy, manage, and compare them in production, so you can determine which model performs best and roll out new or improved models with minimal risk.
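On SageMaker, A/B testing of this kind is typically done by hosting two "production variants" behind a single endpoint and splitting live traffic between them by weight. As a minimal sketch (the model names, instance type, and weights below are illustrative assumptions, not values from this pipeline's code):

```python
# Hypothetical sketch: split live endpoint traffic between an existing
# (champion) model and a new (challenger) model via variant weights.

def make_endpoint_config(champion_model, challenger_model, challenger_share=0.1):
    """Build a SageMaker endpoint-config payload that routes a fraction
    of traffic (challenger_share) to the new model."""
    return {
        "EndpointConfigName": "ab-test-config",
        "ProductionVariants": [
            {
                "VariantName": "Champion",
                "ModelName": champion_model,
                "InstanceType": "ml.m5.large",   # illustrative
                "InitialInstanceCount": 1,
                "InitialVariantWeight": 1.0 - challenger_share,
            },
            {
                "VariantName": "Challenger",
                "ModelName": challenger_model,
                "InstanceType": "ml.m5.large",
                "InitialInstanceCount": 1,
                "InitialVariantWeight": challenger_share,
            },
        ],
    }

config = make_endpoint_config("model-a", "model-b", challenger_share=0.2)
# The payload could then be passed to boto3's
# sagemaker_client.create_endpoint_config(**config).
```

Keeping the challenger's weight small limits the blast radius of a bad model while still collecting enough production traffic to compare the two.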
No commits in the last 6 months.
Use this if you need to reliably test and compare multiple machine learning models in a production environment to decide which one performs best before full deployment.
Not ideal if you are looking for a simple, one-off model deployment without the need for comparative testing or complex MLOps infrastructure.
Stars: 45
Forks: 17
Language: Python
License: MIT-0
Category:
Last pushed: Jun 07, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/aws-samples/amazon-sagemaker-ab-testing-pipeline"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
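The same endpoint can be queried from Python with the standard library. The URL path follows the curl command above; the shape of the JSON response is not documented here, so the fetch helper simply decodes whatever comes back:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, timeout=10):
    """Fetch and decode the JSON quality record for one repository."""
    url = quality_url(category, owner, repo)
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

# Example: the repository described on this page.
url = quality_url("mlops", "aws-samples", "amazon-sagemaker-ab-testing-pipeline")
```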
Higher-rated alternatives
- aws-controllers-k8s/sagemaker-controller: ACK service controller for Amazon SageMaker
- SuperCowPowers/workbench: Workbench, an easy-to-use Python API for creating and deploying AWS SageMaker models
- aws/aws-step-functions-data-science-sdk-python: Step Functions Data Science SDK for building machine learning (ML) workflows and pipelines on AWS
- aws-samples/amazon-sagemaker-mlops-workshop: MLOps workshop with Amazon SageMaker
- aws/sagemaker-sparkml-serving-container: This code is used to build & run a Docker container for performing predictions against a Spark...