simple_tensorflow_serving and tfserve

| Metric        | simple_tensorflow_serving            | tfserve       |
| ------------- | ------------------------------------ | ------------- |
| Score         | 51 (Established)                     | 49 (Emerging) |
| Maintenance   | 0/25                                 | 0/25          |
| Adoption      | 10/25                                | 7/25          |
| Maturity      | 16/25                                | 25/25         |
| Community     | 25/25                                | 17/25         |
| Stars         | 758                                  | 36            |
| Forks         | 186                                  | 10            |
| Downloads     |                                      |               |
| Commits (30d) | 0                                    | 0             |
| Language      | JavaScript                           | Python        |
| License       | Apache-2.0                           | MIT           |
| Flags         | Stale 6m, No Package, No Dependents  | Stale 6m      |

About simple_tensorflow_serving

tobegit3hub/simple_tensorflow_serving

Generic and easy-to-use serving service for machine learning models

This project helps machine learning engineers and MLOps professionals deploy their trained machine learning models, regardless of the framework used, into a live environment. You input your pre-trained model files, and it creates a web service (API endpoint) that other applications can call to get predictions. This allows your models to be integrated into larger systems, providing real-time intelligence or automated decision-making.
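As a concrete sketch of the prediction-API workflow described above, the snippet below builds a JSON request body of the general shape simple_tensorflow_serving's examples use (a model name, a version, and a dict of named input tensors). The field names and example values here are assumptions to verify against the project's README; a real client would POST this body to the running service and parse the JSON predictions it returns.

```python
import json

# Assumed request shape for a generic model-serving REST endpoint:
# which model to use, which version, and the named input data.
# Verify the exact field names against simple_tensorflow_serving's docs.
payload = {
    "model_name": "default",
    "model_version": 1,
    "data": {"keys": [[1.0], [2.0]]},
}

# Serialize to the JSON body a client would POST (e.g. with
# urllib.request or the requests library) to the serving endpoint.
body = json.dumps(payload)
```

The service decouples the model from its callers: any application that can issue an HTTP request with this payload can consume predictions, regardless of the framework the model was trained in.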

MLOps Model Deployment Real-time Inference Machine Learning Engineering API Development

About tfserve

iitzco/tfserve

Serve TF models simply and easily as an HTTP API

This tool helps machine learning engineers and data scientists deploy TensorFlow models as simple HTTP APIs. You provide a trained TensorFlow model (a .pb file or checkpoint directory) and specify the input/output tensor names. It then handles incoming data, passes it through your model, and returns the predictions as an HTTP response.
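To make the decode-predict-encode cycle described above concrete, here is a minimal, self-contained sketch of that flow. The function `fake_model` stands in for running the TensorFlow graph between the named input and output tensors, and the tensor names `image_tensor:0`/`prediction:0` are hypothetical; this is not tfserve's actual API, only the shape of the work it does per request.

```python
import json

def fake_model(inputs):
    # Hypothetical stand-in for the TensorFlow model: tfserve would feed
    # the named input tensors to the graph and fetch the named outputs.
    return {"prediction:0": [sum(row) for row in inputs["image_tensor:0"]]}

def handle_request(body: bytes) -> bytes:
    """One serve cycle: decode the HTTP body into named input tensors,
    run the model, and encode the predictions as a JSON response."""
    inputs = json.loads(body)
    outputs = fake_model(inputs)
    return json.dumps(outputs).encode()

resp = handle_request(b'{"image_tensor:0": [[1, 2], [3, 4]]}')
```

Keeping the tensor names explicit is what lets a generic server like this wrap an arbitrary frozen graph: the HTTP layer never needs to know what the model computes, only which tensors to feed and fetch.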

Machine Learning Deployment Model Serving Real-time Inference Deep Learning Operations

Scores updated daily from GitHub, PyPI, and npm data.