modpotato/copilot-proxy
Copilot proxy for the GLM Coding Plan; may support more backends in the future.
This tool allows developers to use GLM coding models directly within GitHub Copilot Chat in VS Code. It acts as a bridge, letting Copilot Chat (which normally expects the Ollama API) communicate with a Z.AI Coding Plan backend. The input is your natural-language coding requests in Copilot Chat, and the output is code suggestions and completions generated by GLM models.
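A bridge like this mainly translates request shapes: Copilot Chat sends Ollama-style chat requests, while Z.AI exposes an OpenAI-compatible endpoint. A minimal sketch of that translation, assuming typical Ollama and OpenAI-style field names (the project's actual mapping and model names may differ):

```python
def ollama_to_openai(ollama_req: dict, default_model: str = "glm-4.6") -> dict:
    """Translate an Ollama /api/chat request body into an
    OpenAI-style /chat/completions payload.

    Note: "glm-4.6" and the option names below are illustrative
    assumptions, not taken from the copilot-proxy source.
    """
    payload = {
        "model": ollama_req.get("model", default_model),
        "messages": ollama_req.get("messages", []),
        "stream": ollama_req.get("stream", True),
    }
    # Ollama nests sampling parameters under "options"; OpenAI-style
    # APIs expect them at the top level under different names.
    options = ollama_req.get("options", {})
    if "temperature" in options:
        payload["temperature"] = options["temperature"]
    if "num_predict" in options:
        payload["max_tokens"] = options["num_predict"]
    return payload
```

The proxy would apply a mapping like this on the way in, forward the payload to the Z.AI endpoint with the user's API key, and convert the response (or stream chunks) back into the Ollama format Copilot Chat expects.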
Available on PyPI.
Use this if you are a developer using GitHub Copilot Chat and want to leverage the advanced capabilities of Z.AI's GLM coding models instead of, or in addition to, other models.
Not ideal if you do not use GitHub Copilot Chat, are not a software developer, or do not have access to a Z.AI Coding Plan API key.
Stars
32
Forks
4
Language
Python
License
MIT
Category
Last pushed
Feb 11, 2026
Commits (30d)
0
Dependencies
4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/modpotato/copilot-proxy"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
logancyang/obsidian-copilot
THE Copilot in Obsidian
devoxx/DevoxxGenieIDEAPlugin
DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs (Ollama, LMStudio, GPT4All, Jan...
bhaskarblur/NeoBaseAI-Copilot-for-database
Your AI Data Copilot - Interact with your data sources and analyse your data, operations...
rksm/org-ai
Emacs as your personal AI assistant. Use LLMs such as ChatGPT or LLaMA for text generation or...
MatthewZMD/aidermacs
AI Pair Programming in Emacs with Aider