xNul/code-llama-for-vscode
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
This project integrates Code Llama, Meta's code-generating LLM, directly into Visual Studio Code without relying on external services. It exposes a locally running Code Llama model to the Continue VSCode extension, so developers can use Code Llama for code completion and generation inside their familiar IDE.
569 stars. No commits in the last 6 months.
Use this if you want to run Code Llama locally in VSCode for code assistance without an API key or internet connection.
Not ideal if you prefer cloud-based AI coding assistants or do not work in Visual Studio Code.
Stars
569
Forks
33
Language
Python
License
MIT
Category
Last pushed
Jul 31, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/xNul/code-llama-for-vscode"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
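The same endpoint can be called from Python instead of curl. A minimal sketch using only the standard library, assuming the anonymous access described above (the JSON response shape is not documented here, so the example only fetches and decodes it):

```python
import json
from urllib.request import urlopen

def quality_url(owner: str, repo: str) -> str:
    # Endpoint shape taken from the curl example above.
    return f"https://pt-edge.onrender.com/api/v1/quality/transformers/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Anonymous access is rate-limited to 100 requests/day.
    with urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("xNul", "code-llama-for-vscode"))
```

With a free key (1,000 requests/day), you would attach it to the request; how the key is passed (header vs. query parameter) is not specified here, so check the API docs before adding it.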
Higher-rated alternatives
ggml-org/llama.vim
Vim plugin for LLM-assisted code/text completion
ggml-org/llama.vscode
VS Code extension for LLM-assisted code/text completion
DmitryNekrasov/ai-code-completion-idea-plugin
Implementation of IntelliJ IDEA code completion plugin using a local LLM.
10Nates/ollama-autocoder
A simple to use Ollama autocompletion engine with options exposed and streaming functionality
Wells-the-Doctor/leaxer
🌟 Build and deploy local AI models with Leaxer for real-time interaction, streamlined document...