modpotato/copilot-proxy

Copilot proxy for the GLM Coding Plan; may support more backends in the future (maybe?)

Quality score: 52/100 (Established)

This tool allows developers to use various GLM coding models directly within GitHub Copilot Chat in VS Code. It acts as a bridge, letting Copilot Chat (which normally expects the Ollama API) communicate with a Z.AI Coding Plan backend. The input is your natural language coding requests in Copilot Chat, and the output is code suggestions and completions generated by powerful GLM models.
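To make the "bridge" idea concrete, here is a minimal sketch of the kind of translation such a proxy performs: an Ollama-style `/api/chat` request body rewritten into an OpenAI-compatible chat-completions body that a Z.AI-like backend could accept. The field mapping and the model name `glm-4.6` are illustrative assumptions, not the tool's actual code.

```python
def ollama_to_openai(ollama_payload: dict) -> dict:
    """Map an Ollama-style chat request body to an OpenAI-style
    chat-completions body (hypothetical mapping, for illustration)."""
    return {
        "model": ollama_payload["model"],        # model name passed through unchanged
        "messages": ollama_payload["messages"],  # both APIs use role/content message lists
        "stream": ollama_payload.get("stream", True),  # Ollama streams by default
    }

# Example Ollama-shaped request as Copilot Chat might send it
request = {
    "model": "glm-4.6",  # hypothetical GLM model name
    "messages": [{"role": "user", "content": "Write a binary search in Python."}],
    "stream": False,
}
print(ollama_to_openai(request))
```

In practice the proxy would also forward the Z.AI API key and translate the streaming response format back into what Copilot Chat expects; this sketch covers only the request-body direction.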

Available on PyPI.

Use this if you are a developer using GitHub Copilot Chat and want to leverage the advanced capabilities of Z.AI's GLM coding models instead of, or in addition to, other models.

Not ideal if you do not use GitHub Copilot Chat, are not a software developer, or do not have access to a Z.AI Coding Plan API key.

Tags: AI-assisted coding, software development, developer tools, VS Code extensions, large language models
Maintenance: 10/25
Adoption: 7/25
Maturity: 24/25
Community: 11/25


Stars: 32
Forks: 4
Language: Python
License: MIT
Last pushed: Feb 11, 2026
Commits (30d): 0
Dependencies: 4

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/modpotato/copilot-proxy"

Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
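The same endpoint can be called from code. Below is a small helper that builds the GET request for a repo's quality data using only the standard library; the `Authorization: Bearer` header for keyed access is an assumption (check the API's docs for its actual auth scheme).

```python
import urllib.request
from typing import Optional

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def build_request(owner: str, repo: str,
                  api_key: Optional[str] = None) -> urllib.request.Request:
    """Build the GET request for a repo's quality data.
    The bearer-token header is a guess at the keyed-access scheme."""
    url = f"{BASE}/{owner}/{repo}"
    headers = {"Accept": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"  # hypothetical header
    return urllib.request.Request(url, headers=headers)

req = build_request("modpotato", "copilot-proxy")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) counts against the 100/day anonymous quota, so cache responses where possible.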