yfedoseev/llmkit
Production-grade LLM client for Rust, Python, and TypeScript. 100+ providers, 11,000+ models.
A client library for developers building applications on Large Language Models (LLMs). It connects to over 100 LLM providers and 11,000+ models through a single, unified API, and supports features such as prompt caching, tool calling, and streaming, so you can integrate LLMs into your applications reliably and efficiently.
Use this if you are a software developer building production-grade applications that leverage various LLMs and need a robust, high-performance, and feature-rich client to manage your interactions.
Not ideal if you are an end user looking for a no-code way to use LLMs without programming.
Stars: 12
Forks: —
Language: Rust
License: Apache-2.0
Category: —
Last pushed: Mar 02, 2026
Monthly downloads: 28
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yfedoseev/llmkit"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
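The curl command above can also be scripted. A minimal Python sketch using only the standard library; the endpoint URL is taken from the page, but the shape of the JSON response is an assumption, since the API is shown here only through the curl example:

```python
# Sketch: fetch a repo's quality record from the pt-edge API.
# URL pattern comes from the curl example above; the response is
# assumed to be JSON (schema not documented on this page).
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_endpoint(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """GET the quality record; no API key needed up to 100 requests/day."""
    with urllib.request.urlopen(quality_endpoint(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same request as the curl example, pretty-printed.
    print(json.dumps(fetch_quality("yfedoseev", "llmkit"), indent=2))
```

For the higher 1,000 requests/day tier, you would attach the free key to the request (header or query parameter, depending on how the service expects it; that detail is not specified on this page).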
Higher-rated alternatives
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.