xNul/codestral-mamba-for-vscode
Use Codestral Mamba with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
This project is for developers who want local AI code assistance in Visual Studio Code. It lets you run Codestral Mamba, Mistral AI's code generation model, directly on your machine: the Continue extension supplies your code context, and the model suggests completions and improvements without sending your data to external services.
No commits in the last 6 months.
Use this if you are a developer using VSCode and want a privacy-focused, offline-capable AI coding assistant powered by Codestral Mamba.
Not ideal if you prefer cloud-based AI code assistants, do not use VSCode, or do not wish to set up local inference environments.
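The repo connects a locally served Codestral Mamba to VSCode through the Continue extension. A minimal Continue `config.json` sketch, assuming the model is exposed through an OpenAI-compatible local server on port 8000 (the provider, port, model name, and `apiKey` value here are illustrative assumptions, not taken from the repo itself):

```json
{
  "models": [
    {
      "title": "Codestral Mamba (local)",
      "provider": "openai",
      "model": "codestral-mamba",
      "apiBase": "http://localhost:8000/v1",
      "apiKey": "EMPTY"
    }
  ]
}
```

With an entry like this, the model appears in Continue's model picker inside VSCode; requests never leave your machine.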
Stars: 29
Forks: 2
Language: Python
License: MIT
Category: —
Last pushed: Jul 18, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/xNul/codestral-mamba-for-vscode"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
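The same endpoint shown in the curl command can be called from Python. A small sketch using only the standard library; the response is parsed as JSON, but its field names are not documented here, so treat the returned dict's schema as unknown:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/ai-coding"

def build_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_stats(owner: str, repo: str) -> dict:
    """Fetch repo quality data and parse it as JSON.

    Field names in the returned dict depend on the API and are not
    specified on this page.
    """
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)

# Example: the URL for this repository
print(build_url("xNul", "codestral-mamba-for-vscode"))
```

Unauthenticated calls are limited to 100 requests/day; with a free key (sent however the API expects, e.g. a header, which is not specified here) the limit rises to 1,000/day.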
Higher-rated alternatives
suhaibbinyounis/github-copilot-api-vscode
Unlock GitHub Copilot as a local API Gateway. Use Copilot with Cursor, LangChain, and any...
Inc44/CoFlu
CoFlu is a powerful text manipulation, generation, and comparison tool. It's designed for tasks...
quack-ai/companion
VSCode coding companion for software teams 🦆 Turn your team insights into a portable...
ErikBjare/are-copilots-local-yet
Are Copilots Local Yet? The frontier of local LLM Copilots for code completion, project...
balisujohn/localpilot
codellama on CPU without Docker