uni-token/core
LLM Token Solution for Local AI Agents
This project helps developers integrate large language model (LLM) tokens into their local AI agents without running proxy servers or forcing users through complex configuration. It provides a lightweight SDK through which your users can authorize existing tokens, or purchase new ones directly from LLM providers, via a simple interface. The intended audience is any developer building local AI agents who wants to streamline token access for their users.
Use this if you are developing local AI agents and want an easy, user-friendly way for your users to manage and provide their own LLM API tokens.
Not ideal if you are building an AI agent that runs entirely server-side or if you prefer to bundle LLM token costs directly into your subscription model.
Stars: 7
Forks: 3
Language: Vue
License: —
Category:
Last pushed: Nov 28, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/uni-token/core"
Open to everyone: 100 requests/day with no key required. Get a free key to raise the limit to 1,000/day.
Higher-rated alternatives
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
envoyproxy/ai-gateway
Manages Unified Access to Generative AI Services built on Envoy Gateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...