valmat/gpt-tokenator
GPT-3 token counter
When integrating Large Language Models (LLMs) into applications, knowing the token count of a piece of text is crucial for managing API costs and staying within context-length limits. This tool takes text as input and outputs the number of tokens it would consume when sent to the OpenAI GPT-3 or GPT-4 API. It is aimed at software developers building applications that interact with OpenAI's LLMs.
No commits in the last 6 months.
Use this if you are a software developer building applications with GPT-3 or GPT-4 and need to accurately pre-calculate token usage for cost management or input validation.
Not ideal if you are an end-user of an LLM application or a developer working with LLMs other than OpenAI's GPT-3/GPT-4.
Stars
10
Forks
1
Language
C++
License
MIT
Category
Last pushed
Dec 11, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/valmat/gpt-tokenator"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
aiqinxuancai/TiktokenSharp
Token calculation for OpenAI models, using the `o200k_base`, `cl100k_base`, and `p50k_base` encodings.
pkoukk/tiktoken-go
Go version of tiktoken
dqbd/tiktokenizer
Online playground for OpenAI tokenizers
microsoft/Tokenizer
Typescript and .NET implementation of BPE tokenizer for OpenAI LLMs.
lenML/tokenizers
A lightweight, no-dependency fork of transformers.js (tokenizers only)