yaohui-wyh/ctoc

Count Tokens of Code (forked from gocloc)

31 / 100 (Emerging)

When working with Large Language Models (LLMs), knowing the token count of your code is crucial for managing API costs and fitting within context windows. This tool takes a codebase as input and quickly reports how many tokens each file, and the project as a whole, contains. It's designed for developers, prompt engineers, and anyone building LLM-powered applications who needs to analyze a codebase's token footprint.

No commits in the last 6 months.

Use this if you need to quickly estimate the token cost of your code when interacting with LLMs, or if you're optimizing code to fit within LLM context windows.
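For intuition, the quick-estimate idea can be sketched with a back-of-the-envelope heuristic: many BPE tokenizers average roughly 4 characters per token for English text and code. This is only an illustration under that assumed ratio, not ctoc's actual tokenizer, and the sample file path is hypothetical:

```shell
# Illustration only: crude token estimate assuming ~4 characters per token.
# ctoc uses a real tokenizer, so its counts will differ from this heuristic.
printf 'package main\n\nfunc main() {}\n' > /tmp/sample.go
chars=$(wc -c < /tmp/sample.go)
echo "approx tokens: $((chars / 4))"
```

For accurate counts, run ctoc itself against the codebase; the heuristic is only useful for rough budgeting.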

Not ideal if you are looking for traditional code metrics like cyclomatic complexity or code quality scores, as it focuses specifically on LLM token counts.

LLM application development · prompt engineering · AI cost optimization · code tokenization · API cost management
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 7 / 25

Stars

44

Forks

3

Language

Go

License

MIT

Last pushed

Aug 19, 2024

Commits (30d)

0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/yaohui-wyh/ctoc"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.