yaohui-wyh/ctoc
Count Tokens of Code (forked from gocloc)
When working with large language models (LLMs), knowing the token count of your code is crucial for managing API costs and fitting code into context windows. This tool takes a codebase as input and quickly reports how many tokens each file, and the project as a whole, contains. It's aimed at developers, prompt engineers, and anyone building LLM-powered applications who needs to analyze a codebase's token footprint.
No commits in the last 6 months.
Use this if you need to quickly estimate the token cost of your code when interacting with LLMs or if you're optimizing code for LLM context windows.
Not ideal if you are looking for traditional code metrics like cyclomatic complexity or code quality scores, as it focuses specifically on LLM token counts.
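To illustrate why token counts matter, a common rule of thumb is roughly four characters per token for English-like text and code. This is only a ballpark heuristic, not ctoc's method (ctoc runs a real tokenizer over each file):

```python
# Rough token estimate: ~4 characters per token is a widely used
# heuristic for English text and code. Real tools like ctoc use an
# actual tokenizer, so treat this only as a ballpark figure.

def estimate_tokens(source: str) -> int:
    """Approximate the LLM token count of a piece of code."""
    return max(1, len(source) // 4)

snippet = 'func main() {\n\tfmt.Println("hello")\n}\n'
print(estimate_tokens(snippet))  # ~9 tokens for this Go snippet
```

Estimates like this diverge from real tokenizer output on dense punctuation or non-English identifiers, which is exactly where a dedicated counter earns its keep.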
Stars
44
Forks
3
Language
Go
License
MIT
Last pushed
Aug 19, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/yaohui-wyh/ctoc"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
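For scripting, the same endpoint can be fetched with only the Python standard library. This is a minimal sketch; the response schema isn't documented here, so the live call is left commented out:

```python
import urllib.request

# Endpoint shown in the curl example above.
url = "https://pt-edge.onrender.com/api/v1/quality/ai-coding/yaohui-wyh/ctoc"
req = urllib.request.Request(url, headers={"Accept": "application/json"})

# Uncomment to hit the live API (subject to the 100 requests/day limit):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())

print(req.full_url)
```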
Higher-rated alternatives
lguibr/CodeConcat
Prepare your code for AI. Concatenates your project into a clean, markdown-formatted file with...
uxname/kodu
High-performance CLI to prepare codebase for LLMs, automate reviews, and draft commits.
Water-Run/pack-my-code
pmc (pack-my-code): a tiny binary tool that "packages" your project code, suitable for sending to...
Crystainexhaustible329/pack-my-code
Package code context into clean markdown for LLM prompts using a minimalist, lightweight tool...
s1korrrr/codebase-combiner
Combine a workspace or folder into a single Markdown/text file with filters and token estimates.