dqbd/tiktokenizer
Online playground for OpenAI tokenizers
This tool helps AI application developers quickly determine the exact token count of text prompts intended for OpenAI's large language models. Paste any text and it shows how many tokens that text consumes under each OpenAI model's encoding, which helps you keep prompts within API limits and manage costs.
1,526 stars. No commits in the last 6 months.
Use this if you are building applications with OpenAI models and need to precisely calculate prompt token counts to stay within budget and API constraints.
Not ideal if you're not working with OpenAI's tokenization or only need a rough estimate of text length.
Stars
1,526
Forks
165
Language
TypeScript
License
MIT
Category
Last pushed
Apr 24, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/dqbd/tiktokenizer"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
aiqinxuancai/TiktokenSharp
Token calculation for OpenAI models, using the `o200k_base`, `cl100k_base`, and `p50k_base` encodings.
pkoukk/tiktoken-go
Go version of tiktoken.
microsoft/Tokenizer
TypeScript and .NET implementation of a BPE tokenizer for OpenAI LLMs.
lenML/tokenizers
A lightweight, no-dependency fork of transformers.js (tokenizers only).
tryAGI/Tiktoken
This project implements token calculation for OpenAI's gpt-4 and gpt-3.5-turbo model,...