belladoreai/llama3-tokenizer-js

JS tokenizer for LLaMA 3 and LLaMA 3.1

Score: 46 / 100 (Emerging)

This tool counts tokens in text for the LLaMA 3 and LLaMA 3.1 language models. Given plain text, it returns the corresponding token count, letting applications manage text length precisely for features like input limits or cost estimation. It is designed for developers building web or Node.js applications that interact with LLaMA 3 models.

117 stars. No commits in the last 6 months. Available on npm.

Use this if you are a web or Node.js developer building an application that needs to accurately count tokens for LLaMA 3 or LLaMA 3.1 models directly in the browser or server-side JavaScript.

Not ideal if you are working with LLaMA 1, LLaMA 2, OpenAI, or Mistral models, or if you need to train a tokenizer.

Tags: Large Language Models, LLM application development, web development, Node.js development, tokenization
Stale (6 months) · No Dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 9 / 25


Stars: 117
Forks: 6
Language: JavaScript
License: MIT
Last pushed: Jul 28, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/belladoreai/llama3-tokenizer-js"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.