coder/ai-tokenizer
A tokenizer faster than tiktoken, with first-class support for Vercel's AI SDK.
This library lets AI application developers accurately estimate token counts for user inputs and model responses when working with large language models, especially those supported by Vercel's AI SDK. It takes chat messages, tool definitions, and schemas as input and produces a detailed token count, which is essential for managing API costs and staying within model input limits. Developers building AI-powered features benefit most, keeping their applications cost-efficient and performant.
Use this if you are a developer building AI applications and need a fast, accurate way to pre-calculate token usage for various LLMs, especially within the Vercel AI SDK ecosystem.
Not ideal if you are not a developer or if your primary need is basic text tokenization without specific support for AI SDK message and tool schemas.
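To show why pre-calculating token usage matters, here is a minimal sketch of checking chat messages against a model's context limit before sending a request. This uses a rough characters-per-token heuristic for illustration only; it is not ai-tokenizer's API, and the names (`ChatMessage`, `estimateTokens`) are hypothetical. A real BPE tokenizer such as ai-tokenizer or tiktoken gives exact, per-model counts.

```typescript
// Hypothetical message shape, loosely modeled on AI SDK chat messages.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Rule-of-thumb estimator: ~4 characters per token for English text,
// plus a small per-message overhead for chat formatting tokens.
// A proper BPE tokenizer replaces this heuristic with an exact count.
function estimateTokens(messages: ChatMessage[], perMessageOverhead = 4): number {
  return messages.reduce(
    (total, m) => total + Math.ceil(m.content.length / 4) + perMessageOverhead,
    0
  );
}

const messages: ChatMessage[] = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Summarize this document." },
];

const estimate = estimateTokens(messages);
const contextLimit = 8192; // example limit; varies by model

// Reject (or truncate) before paying for a request that would fail.
if (estimate > contextLimit) {
  throw new Error(`Input (~${estimate} tokens) exceeds the ${contextLimit}-token limit`);
}
```

The same check, backed by an exact tokenizer, is what keeps an application from silently truncating context or overspending on API calls.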
Stars
39
Forks
2
Language
TypeScript
License
MIT
Category
Last pushed
Dec 01, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/coder/ai-tokenizer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
aiqinxuancai/TiktokenSharp
Token calculation for OpenAI models, using the `o200k_base`, `cl100k_base`, and `p50k_base` encodings.
dqbd/tiktokenizer
Online playground for OpenAI tokenizers
pkoukk/tiktoken-go
Go version of tiktoken
microsoft/Tokenizer
TypeScript and .NET implementation of a BPE tokenizer for OpenAI LLMs.
lenML/tokenizers
A lightweight, dependency-free fork of transformers.js (tokenizers only)