joshualamerton/prompt-trace
Prompt and response tracing for LLM workflows
Score: 35 / 100 (Emerging)
No Package | No Dependents
Maintenance: 10 / 25
Adoption: 2 / 25
Maturity: 11 / 25
Community: 12 / 25
Stars: 2
Forks: 1
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 12, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/joshualamerton/prompt-trace"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
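The curl command above can also be issued from Python. A minimal sketch, assuming the endpoint returns a JSON body (the response schema is not documented here, so no specific fields are assumed); the `quality_url` helper name is hypothetical:

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for one repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the quality record; assumes the endpoint returns JSON."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same request as the curl example, within the 100-requests/day keyless tier.
    print(quality_url("prompt-engineering", "joshualamerton", "prompt-trace"))
```

How a free API key is attached (header vs. query parameter) is not specified on this page, so the sketch omits it.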
Higher-rated alternatives
- genieincodebottle/schemalock (score 43): LLM output contract testing CLI, define what your pipeline must return, test it against any...
- antsanchez/prompto (score 39): Interact with various LLMs in your browser (LangChain.js, Angular)
- Coolhand-Labs/coolhand-ruby (score 38): Zero-config LLM cost & quality monitoring for Ruby apps - automatically log AI API calls and...
- suhjohn/llm-workbench (score 35): UI for testing prompts across various datasets locally
- atjsh/llmlingua-2-js (score 34): JavaScript/TypeScript implementation of LLMLingua-2 (Experimental)