llama.vscode and ollama-autocoder
Both are VS Code extensions that enable local, LLM-based code completion. They are direct competitors, offering similar functionality through different underlying inference engines: llama.cpp for llama.vscode and Ollama for ollama-autocoder.
About llama.vscode
ggml-org/llama.vscode
VS Code extension for LLM-assisted code/text completion
This extension helps software developers write code faster by providing intelligent, AI-powered code and text completions directly within their VS Code editor. It takes your existing code and comments as input and offers suggestions to complete lines or generate entire code blocks. This tool is designed for programmers, software engineers, and anyone who regularly writes code or text in VS Code and wants to boost their productivity.
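Under the hood, llama.vscode talks to a locally running llama.cpp server and requests fill-in-the-middle completions from the code before and after the cursor. The following is a minimal sketch of that kind of request, not the extension's actual code; the `/infill` endpoint, the `input_prefix`/`input_suffix` field names, and the default port 8080 are based on llama-server's HTTP API, and the helper names are illustrative:

```python
import json
import urllib.request

def build_infill_request(prefix: str, suffix: str, n_predict: int = 64) -> dict:
    """Build a fill-in-the-middle request body: the text before the cursor
    goes in input_prefix, the text after it in input_suffix."""
    return {
        "input_prefix": prefix,
        "input_suffix": suffix,
        "n_predict": n_predict,  # cap on how many tokens to generate
    }

def complete(prefix: str, suffix: str,
             url: str = "http://127.0.0.1:8080/infill") -> str:
    """POST the request to a locally running llama-server and return
    the generated text from the JSON reply's 'content' field."""
    body = json.dumps(build_infill_request(prefix, suffix)).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]
```

The server responds with a JSON object whose `content` field holds the suggested insertion, which the extension then surfaces as an inline completion.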
About ollama-autocoder
10Nates/ollama-autocoder
A simple-to-use Ollama autocompletion engine with exposed options and streaming functionality
This tool helps developers quickly write code and text by suggesting completions as they type. It takes your partially written code or natural language in the editor and provides relevant suggestions inline. Software developers and programmers who spend a lot of time coding will find this useful for speeding up their workflow.
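Because ollama-autocoder streams completions, the client consumes Ollama's newline-delimited JSON reply token by token rather than waiting for the full response. A minimal sketch of that flow, assuming Ollama's `/api/generate` endpoint on its default port 11434; the model name is a placeholder and the helper functions are illustrative, not the extension's code:

```python
import json

def build_generate_request(prompt: str, model: str = "codellama") -> dict:
    """Request body for Ollama's /api/generate endpoint with streaming on.
    'codellama' is a placeholder model name, not a project default."""
    return {"model": model, "prompt": prompt, "stream": True}

def accumulate_stream(ndjson_lines) -> str:
    """Join the 'response' tokens from a streamed NDJSON reply.
    Each line is one JSON chunk; the last chunk sets done=true."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```

Streaming is what lets the suggestion appear progressively in the editor: each chunk's `response` fragment can be rendered as soon as it arrives instead of after the whole completion finishes.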