teabranch/open-responses-server
Wraps any OpenAI API interface as the Responses API with MCP support, so it works with Codex, and adds any missing stateful features. Ollama- and vLLM-compliant.
This project lets developers or researchers use OpenAI's coding assistant (Codex) and other OpenAI API clients with their own self-hosted language models like Ollama or vLLM. It takes requests designed for the OpenAI API and routes them to your chosen AI backend, while also managing features like stateful chat and tool calls. This is useful for those who want to experiment with different LLMs but maintain compatibility with existing OpenAI-dependent applications.
Available on PyPI.
Use this if you want to run OpenAI's Coding Assistant or other OpenAI API-compatible tools against your own local or private large language models.
Not ideal if you solely use OpenAI's official APIs and do not need to integrate with self-hosted or alternative AI backends.
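Since the server translates Responses-API-style requests for a self-hosted backend, a client mostly needs to point at the proxy's endpoint instead of OpenAI's. Below is a minimal sketch of building such a request payload; the `localhost:8080` base URL and the `llama3` model tag are assumptions for illustration, and the actual host, port, and model names depend on your deployment.

```python
import json

# Hypothetical address where a local open-responses-server instance listens;
# substitute the host/port from your own deployment.
BASE_URL = "http://localhost:8080/v1"

def build_responses_request(model: str, prompt: str) -> dict:
    """Build a Responses-API-style payload that the proxy would translate
    for a backend such as Ollama or vLLM."""
    return {
        "model": model,   # backend model name, e.g. an Ollama model tag
        "input": prompt,  # Responses API uses "input" rather than "messages"
    }

payload = build_responses_request("llama3", "Write a haiku about proxies.")
# An HTTP client would POST this JSON to f"{BASE_URL}/responses".
print(json.dumps(payload))
```

Existing OpenAI SDK clients can typically be redirected the same way by overriding their base URL, which is what keeps OpenAI-dependent tools working against local models.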
Stars: 152
Forks: 20
Language: Python
License: MIT
Category:
Last pushed: Nov 05, 2025
Commits (30d): 0
Dependencies: 7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/teabranch/open-responses-server"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related servers
QuantGeekDev/mcp-framework
A framework for writing MCP (Model Context Protocol) servers in TypeScript
MCPJam/inspector
Test & Debug MCP servers, ChatGPT apps, and MCP Apps (ext-apps)
arabold/docs-mcp-server
Grounded Docs MCP Server: Open-Source Alternative to Context7, Nia, and Ref.Tools
rohitg00/awesome-devops-mcp-servers
A curated list of awesome MCP servers focused on DevOps tools and capabilities.
stabgan/openrouter-mcp-multimodal
MCP server for OpenRouter providing text chat and image analysis tools