aporia-ai/inferencedb

🚀 Stream inferences of real-time ML models in production to any data lake (Experimental)

Quality score: 31 / 100 (Emerging)

InferenceDB helps machine learning engineers and MLOps teams automatically stream the inputs and predictions of real-time ML models in production into a data lake. It consumes model inference data from Kafka and writes it in formats such as Parquet on S3, enabling tasks like model retraining, drift monitoring, and performance tracking.
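The core pattern — buffer incoming inference records, then flush them in batches to durable files — can be sketched in a few lines. This is an illustrative sketch only, not InferenceDB's actual API; a real deployment would consume from Kafka and write Parquet (e.g. via pyarrow) to S3, whereas this example batches to local JSONL files:

```python
import json
from pathlib import Path


class InferenceLogger:
    """Illustrative batching logger (hypothetical; not InferenceDB's API).

    Buffers (inputs, prediction) records and flushes them to one
    JSONL file per batch, mimicking the Kafka -> data lake flow.
    """

    def __init__(self, out_dir, batch_size=100):
        self.out_dir = Path(out_dir)
        self.out_dir.mkdir(parents=True, exist_ok=True)
        self.batch_size = batch_size
        self.buffer = []
        self.flushed = 0  # number of batch files written so far

    def log(self, inputs, prediction):
        """Record one inference; flush automatically when the batch fills."""
        self.buffer.append({"inputs": inputs, "prediction": prediction})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Write any buffered records to a new batch file."""
        if not self.buffer:
            return
        path = self.out_dir / f"batch-{self.flushed:05d}.jsonl"
        with path.open("w") as f:
            for record in self.buffer:
                f.write(json.dumps(record) + "\n")
        self.flushed += 1
        self.buffer.clear()
```

Calling `log()` from the model's prediction path and `flush()` on shutdown guarantees no records are lost between batches; swapping the JSONL writer for a Parquet writer and the local directory for an S3 prefix yields the setup the project describes.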

No commits in the last 6 months.

Use this if you need to reliably capture your live models' inputs and outputs for later analysis, auditing, or model improvement cycles.

Not ideal if your models are not real-time, you don't use Kafka for data streams, or you're looking for an all-in-one MLOps platform rather than a specialized inference logging tool.

Tags: MLOps, Machine Learning Engineering, Data Lakes, Model Monitoring, Real-time Analytics
Badges: Stale (6m), No Package, No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 6 / 25


Stars: 81
Forks: 3
Language: Python
License:
Last pushed: Jun 10, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mlops/aporia-ai/inferencedb"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.