GewoonJaap/codex-openai-wrapper

Wrap OpenAI's Codex in an OpenAI-compatible endpoint using Cloudflare Workers

Score: 43 / 100 (Emerging)

This project lets developers integrate OpenAI's Codex models into their applications or services without modifying existing code. It acts as a translation layer: it accepts standard OpenAI API requests, forwards them to the Codex models, and returns the results in the standard OpenAI response format. Developers and system administrators can use it to bring advanced AI reasoning capabilities into current systems, including those designed for local or third-party AI models.

Use this if you want to access OpenAI's Codex models with advanced reasoning features through an OpenAI-compatible API, allowing for drop-in replacement in existing AI-powered applications or local AI model workflows.

Not ideal if you prefer direct API integration without an intermediary layer or if you only need basic OpenAI API access without specific Codex reasoning capabilities.

AI-integration API-proxy AI-model-deployment developer-workflow backend-engineering
No license · No package · No dependents
Maintenance: 10 / 25
Adoption: 9 / 25
Maturity: 7 / 25
Community: 17 / 25


Stars: 70
Forks: 13
Language: TypeScript
License: none
Last pushed: Feb 16, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/GewoonJaap/codex-openai-wrapper"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
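The same endpoint can be queried programmatically. A sketch that builds the API URL for an arbitrary repository; the path shape is taken from the curl example above, and the response fields are not documented here, so parsing them is left out:

```typescript
// Sketch: constructing the quality-API URL for a given owner/repo pair.
// Base path copied from the curl example in this card.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

function qualityUrl(owner: string, repo: string): string {
  // Encode each path segment defensively in case of unusual names.
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

console.log(qualityUrl("GewoonJaap", "codex-openai-wrapper"));
// → https://pt-edge.onrender.com/api/v1/quality/llm-tools/GewoonJaap/codex-openai-wrapper
```

A `fetch(qualityUrl(owner, repo))` call would then retrieve the same data the curl command returns, subject to the rate limits stated above.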