xNul/code-llama-for-vscode

Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.

Quality score: 40 / 100 (Emerging)

This project integrates Code Llama, Meta's code-generation model, directly into Visual Studio Code without relying on external services. It exposes a locally running Code Llama model to the Continue VSCode extension, so developers can use it for code completion and generation inside their familiar IDE.

569 stars. No commits in the last 6 months.

Use this if you are a developer who wants to use Code Llama directly in VSCode for local code assistance without needing an API key or internet connection.

Not ideal if you prefer using cloud-based AI coding assistants or are not working within the Visual Studio Code environment.

developer-tools code-generation integrated-development-environment local-ai software-development
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 14 / 25


Stars: 569
Forks: 33
Language: Python
License: MIT
Last pushed: Jul 31, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/xNul/code-llama-for-vscode"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
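The curl example above can be wrapped in a small Python helper. This is a sketch under the assumption that the endpoint path is `/<ecosystem>/<owner>/<repo>`, as in the example URL; the JSON response schema is not documented on this page, so the helper returns the parsed payload as-is.

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    # Path segments mirror the curl example: /<ecosystem>/<owner>/<repo>
    return f"{BASE}/{quote(ecosystem)}/{quote(repo, safe='/')}"

def fetch_quality(ecosystem: str, repo: str) -> dict:
    # Performs the same GET request as the curl example; the response
    # schema is undocumented here, so the raw JSON dict is returned.
    with urlopen(quality_url(ecosystem, repo)) as resp:
        return json.load(resp)

print(quality_url("transformers", "xNul/code-llama-for-vscode"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/xNul/code-llama-for-vscode
```

With no API key, calls count against the shared 100 requests/day quota, so cache responses rather than fetching on every run.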