Cyclenerd/google-cloud-litellm-proxy

🚅 LiteLLM Proxy for Google Cloud Generative AI

Score: 48 / 100 (Emerging)

This project helps developers integrate Google Cloud Vertex AI Large Language Models (LLMs) such as Gemini, Claude, Llama 3, and Mistral Large into their applications. It accepts requests formatted for the OpenAI API, routes them to the specified LLM on Google Cloud, and returns the model's response. It is aimed at software developers and MLOps engineers who want to simplify LLM integration.

Use this if you want to use Google Cloud's Vertex AI LLMs but prefer to interact with them using the familiar OpenAI API format.

Not ideal if you prefer to use Google Cloud's native SDKs for LLM interactions or do not use Google Cloud for your AI workloads.
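Because the proxy speaks the OpenAI chat-completions format, any OpenAI-compatible client can talk to it. The sketch below builds such a request body; the proxy URL, port, and model ID are assumptions for illustration (LiteLLM proxies commonly listen on port 4000, and the actual model names depend on how the proxy is configured).

```python
import json

# Hypothetical endpoint -- replace with your deployed proxy's URL.
PROXY_URL = "http://localhost:4000/v1/chat/completions"

# An OpenAI-format chat completion request. The proxy routes it to the
# Vertex AI model named in "model"; "gemini-1.5-pro" is an example ID
# and must match a model defined in your proxy configuration.
payload = {
    "model": "gemini-1.5-pro",
    "messages": [
        {"role": "user", "content": "Say hello in one sentence."}
    ],
}

body = json.dumps(payload)
print(body)

# To send it, any OpenAI-compatible client or plain HTTP works, e.g.:
#   curl $PROXY_URL -H "Content-Type: application/json" -d "$body"
```

The point of the proxy is that nothing in this request is Google-specific: swapping the `model` value is all it takes to target a different Vertex AI LLM.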

LLM-integration cloud-AI API-proxy model-deployment developer-tooling
No package · No dependents
Maintenance 10 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 14 / 25


Stars: 61
Forks: 8
Language: Dockerfile
License: Apache-2.0
Category: llm-api-gateways
Last pushed: Mar 01, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Cyclenerd/google-cloud-litellm-proxy"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.