wujianguo/openai-proxy

Proxy for the OpenAI API using Python Flask, with support for SSE streaming.

Score: 38 / 100 (Emerging)

This proxy allows you to route requests to the OpenAI API through your own server. You provide your OpenAI API key and the chat messages, and it returns the AI's responses, including real-time streaming answers. This is for developers and system administrators who want more control over their OpenAI API traffic.

No commits in the last 6 months.

Use this if you need to manage or monitor OpenAI API calls centrally, or want to provide a consistent endpoint for your applications.

Not ideal if you are simply looking for a direct way to use the OpenAI API without any intermediate layers.
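The streamed responses arrive as OpenAI-style server-sent events: each event is a `data: {...}` line carrying a JSON chunk with a `delta`, and the stream ends with `data: [DONE]`. A minimal sketch of how a client might reassemble the assistant's text from such a stream (the helper name and the exact event shape are assumptions based on the OpenAI chat-completions format, not code from this repo):

```python
import json

def read_sse_deltas(lines):
    """Reassemble assistant text from OpenAI-style SSE event lines.

    Hypothetical helper: assumes each event is 'data: {...}' with a
    'choices[0].delta' object, and that 'data: [DONE]' ends the stream.
    """
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alives and comment lines
        payload = line[len("data: "):].strip()
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))  # first delta has only a role
    return "".join(parts)

# Example stream as a proxy would forward it from OpenAI:
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print(read_sse_deltas(sample))  # -> Hello
```

In practice you would feed this from `iter_lines()` on a streamed HTTP response to the proxy rather than from a hard-coded list.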

Tags: API management, developer tools, system administration, AI integration, network proxy
Badges: Stale (6m), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 18
Forks: 7
Language: Python
License: MIT
Last pushed: Apr 26, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/wujianguo/openai-proxy"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.