coder/ai-tokenizer

A tokenizer faster than tiktoken, with first-class support for Vercel's AI SDK.

Score: 34 / 100 (Emerging)

This tool helps AI application developers accurately estimate the token count of user inputs and system responses when working with large language models, especially those supported by Vercel's AI SDK. It takes in chat messages, tool definitions, and schemas, and outputs a detailed token count, which is crucial for managing API costs and staying within model input limits. Developers building AI-powered features benefit most, since accurate pre-computed counts keep their applications cost-efficient and performant.
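To illustrate why pre-computing token counts matters for cost control, here is a minimal sketch. Note the assumptions: the ~4 characters-per-token ratio and the per-million-token price are illustrative placeholders, not values from ai-tokenizer or any provider; real counts come from a model-specific tokenizer.

```python
# Illustrative only: a crude token estimate plus a cost projection.
# Neither the 4-chars-per-token heuristic nor the price is from ai-tokenizer.

def estimate_tokens(text: str) -> int:
    """Crude heuristic: roughly 4 characters per token for English text."""
    return max(1, len(text) // 4)

def estimate_cost_usd(tokens: int, usd_per_million_tokens: float) -> float:
    """Convert a token count into an approximate request cost in USD."""
    return tokens / 1_000_000 * usd_per_million_tokens

prompt = "Summarize the following support ticket in two sentences." * 50
tokens = estimate_tokens(prompt)
print(tokens, round(estimate_cost_usd(tokens, 3.0), 6))
```

A dedicated tokenizer replaces the heuristic above with exact, model-specific counts, which is what makes budget checks and input-limit guards reliable.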

Use this if you are a developer building AI applications and need a fast, accurate way to pre-calculate token usage for various LLMs, especially within the Vercel AI SDK ecosystem.

Not ideal if you are not a developer or if your primary need is basic text tokenization without specific support for AI SDK message and tool schemas.

Tags: AI development, LLM cost optimization, prompt engineering, AI application performance, token management
No package published. No dependents.
Maintenance: 6 / 25
Adoption: 7 / 25
Maturity: 15 / 25
Community: 6 / 25


Stars: 39
Forks: 2
Language: TypeScript
License: MIT
Last pushed: Dec 01, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/coder/ai-tokenizer"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.