teticio/openai-proxy
OpenAI API proxy for fine-grained cost tracking & control and caching of responses
This project helps engineering and product teams manage their OpenAI API expenses more effectively. It acts as a middleman, taking your API requests and passing them to OpenAI, while tracking costs by user, project, and the specific AI model used. This allows team leads and CTOs to gain visibility and control over how much is spent on different initiatives and by whom.
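The per-user, per-project, per-model attribution described above can be sketched as a small bookkeeping routine. This is a minimal illustration of the idea, not the project's actual implementation; the price table and function names are invented for the example, and the rates are illustrative rather than current OpenAI pricing.

```python
# Illustrative per-1K-token prices (input, output) in USD. These are
# example values, not real OpenAI rates.
PRICES_PER_1K = {
    "gpt-4o-mini": (0.00015, 0.0006),
}

def call_cost(model, prompt_tokens, completion_tokens):
    """Cost of a single API call, given token counts from the response."""
    inp, out = PRICES_PER_1K[model]
    return prompt_tokens / 1000 * inp + completion_tokens / 1000 * out

# Ledger keyed by (user, project), the granularity the proxy tracks.
ledger = {}

def record(user, project, model, prompt_tokens, completion_tokens):
    key = (user, project)
    ledger[key] = ledger.get(key, 0.0) + call_cost(
        model, prompt_tokens, completion_tokens
    )

record("alice", "chatbot", "gpt-4o-mini", 1000, 500)
print(round(ledger[("alice", "chatbot")], 6))  # 0.00045
```

With a ledger like this, enforcing a spending limit is a lookup before forwarding each request.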
No commits in the last 6 months.
Use this if you need to track and limit OpenAI API costs across different projects, users, or development stages within your organization, or if you want to cache responses to save on repeated calls.
Not ideal if you are an individual user with simple API needs and don't require detailed cost tracking or management for multiple projects/users.
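The caching mentioned above can be sketched as a lookup keyed on a canonical hash of the request payload, so that byte-identical requests are only paid for once. This is an assumption about how such a proxy deduplicates calls, not a description of this project's internals; all names here are hypothetical.

```python
import hashlib
import json

cache = {}

def cache_key(payload: dict) -> str:
    # Canonical JSON (sorted keys) so identical requests hash identically.
    blob = json.dumps(payload, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def cached_call(payload, send):
    """Return a cached response, forwarding upstream only on a miss."""
    key = cache_key(payload)
    if key not in cache:
        cache[key] = send(payload)  # only the first identical call costs money
    return cache[key]

# Demo with a stand-in for the upstream API call.
calls = []
def fake_send(payload):
    calls.append(payload)
    return {"answer": "hi"}

payload = {"model": "gpt-4o-mini",
           "messages": [{"role": "user", "content": "Hello"}]}
cached_call(payload, fake_send)
cached_call(payload, fake_send)
print(len(calls))  # 1: the second call was served from cache
```

Note that caching is only safe for deterministic or repeat-tolerant workloads; a request with nonzero temperature cached this way will always return the same completion.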
Stars
17
Forks
2
Language
Python
License
BSD-3-Clause
Category
Last pushed
Mar 11, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/teticio/openai-proxy"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xllm-go/bypass
Integrates reverse-engineered chat APIs for multiple AI services (openai-api, coze, deepseek, cursor, windsurf, qodo, blackbox, you, grok, Bing image generation), adapting them to...
orkunkinay/openai_cost_calculator
Calculate exact USD cost of each OpenAI API call — no guesswork.
Mon-ius/Docker-Warp-Socks
Connect to Cloudflare WARP, exposing a `socks5` proxy.
x-dr/chatgptProxyAPI
🔥 Build a free OpenAI API proxy on Cloudflare to work around network access restrictions. Supports streaming output
qingchencloud/cj2api
A Cloudflare Worker that converts ChatJimmy into an OpenAI-compatible API | Zero-cost deployment, streaming output, built-in test page