geckse/n8n-nodes-gpt-tokenizer
An n8n node for working with BPE tokens, with GPT models in mind.
This tool helps automate text processing for large language models within n8n workflows. It takes text as input, converts it into BPE tokens (the units GPT models understand), and outputs token counts, cost estimates, or sliced text segments. AI developers and anyone building automated text workflows with OpenAI's GPT models will find this useful for managing token limits and costs.
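To illustrate the kind of arithmetic such a node performs, here is a minimal TypeScript sketch of counting tokens and estimating cost. The whitespace "tokenizer" and the per-1K-token price below are placeholder assumptions for illustration only, not this node's actual API; a real BPE encoder (such as tiktoken's `cl100k_base`) produces different counts.

```typescript
// Stand-in for a real BPE encoder: splits on whitespace.
// Real BPE token counts will differ from this approximation.
function encodeApprox(text: string): string[] {
  return text.split(/\s+/).filter((t) => t.length > 0);
}

// Hypothetical price per 1,000 tokens; check your model's current pricing.
const USD_PER_1K_TOKENS = 0.0015;

// Returns the approximate token count and a cost estimate for one request.
function estimateCostUSD(text: string): { tokens: number; cost: number } {
  const tokens = encodeApprox(text).length;
  return { tokens, cost: (tokens / 1000) * USD_PER_1K_TOKENS };
}

const { tokens, cost } = estimateCostUSD(
  "n8n workflows can count tokens before calling GPT"
);
console.log(tokens, cost.toFixed(6));
```

The same count-then-compute pattern underlies budget checks in a workflow: count first, then decide whether to call the model.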
No commits in the last 6 months. Available on npm.
Use this if you are building automated workflows with n8n and need to precisely control text input for OpenAI's GPT models, ensuring you stay within token limits and manage costs.
Not ideal if you are not using n8n for workflow automation or if your primary need isn't related to managing BPE tokens for GPT models.
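Staying within a token limit usually means truncating at a token boundary rather than a character boundary. A sketch of that idea, again using a whitespace stand-in for a real BPE encoder (an assumption, not this node's implementation):

```typescript
// Keep only the first maxTokens tokens and join them back to text.
// A real implementation would encode to BPE token IDs, slice the ID
// array, and decode, so no token is cut mid-sequence.
function truncateToTokenLimit(text: string, maxTokens: number): string {
  const tokens = text.split(/\s+/).filter((t) => t.length > 0);
  return tokens.slice(0, maxTokens).join(" ");
}

console.log(truncateToTokenLimit("one two three four five", 3)); // "one two three"
```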
Stars: 8
Forks: 2
Language: TypeScript
License: MIT
Last pushed: Aug 01, 2023
Commits (30d): 0
Dependencies: 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/geckse/n8n-nodes-gpt-tokenizer"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000/day.
Higher-rated alternatives
aiqinxuancai/TiktokenSharp
Token calculation for OpenAI models, using the `o200k_base`, `cl100k_base`, and `p50k_base` encodings.
pkoukk/tiktoken-go
Go port of tiktoken.
dqbd/tiktokenizer
Online playground for OpenAI tokenizers.
microsoft/Tokenizer
TypeScript and .NET implementations of the BPE tokenizer for OpenAI LLMs.
lenML/tokenizers
A lightweight, dependency-free fork of transformers.js containing only the tokenizers.