ggml-org/llama.vim
Vim plugin for LLM-assisted code/text completion
This project helps software developers using Vim or Neovim get real-time, AI-powered code and text suggestions directly within their editor. It takes your current code or text as input and provides intelligent completions or edits, helping you write faster and more efficiently. This tool is for programmers, scripters, and anyone who spends significant time writing code or technical text in a Vim-based environment.
1,913 stars. Actively maintained with 3 commits in the last 30 days.
Use this if you are a developer who uses Vim or Neovim extensively and wants to leverage local AI models for faster code completion and editing without relying on cloud-based services.
Not ideal if you use a different code editor like VS Code or Emacs, or if you prefer not to run large language models locally on your machine.
Stars: 1,913
Forks: 95
Language: Vim Script
License: MIT
Category: (none listed)
Last pushed: Jan 31, 2026
Commits (30d): 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ggml-org/llama.vim"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
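The curl command above can also be scripted. A minimal Python sketch, assuming the endpoint returns JSON; the "stars" field name in the usage example is a guess, not confirmed by the source:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def repo_quality_url(owner: str, repo: str) -> str:
    # Build the endpoint URL for a given GitHub repository.
    return f"{API_BASE}/{owner}/{repo}"


def fetch_repo_quality(owner: str, repo: str) -> dict:
    # Fetch and decode the response (assumes the API returns JSON).
    with urllib.request.urlopen(repo_quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_repo_quality("ggml-org", "llama.vim")
    # "stars" is a hypothetical field name for illustration only.
    print(data.get("stars"))
```

Unauthenticated calls count against the 100 requests/day limit; how an API key is supplied (header vs. query parameter) is not documented here, so the sketch omits it.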
Related projects
ggml-org/llama.vscode
VS Code extension for LLM-assisted code/text completion
DmitryNekrasov/ai-code-completion-idea-plugin
Implementation of IntelliJ IDEA code completion plugin using a local LLM.
10Nates/ollama-autocoder
A simple to use Ollama autocompletion engine with options exposed and streaming functionality
xNul/code-llama-for-vscode
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to...
Wells-the-Doctor/leaxer
🌟 Build and deploy local AI models with Leaxer for real-time interaction, streamlined document...