deadbits/prompt-serve

Store and serve language model prompts

19 / 100
Experimental

prompt-serve helps practitioners who design and use large language models (LLMs) organize and reuse their prompts. You store each prompt, along with its model settings and metadata, in a structured YAML format; the result is a categorized, version-controlled collection of prompts that can be retrieved or converted for use in LLM applications. It is aimed at AI practitioners, data scientists, and ML engineers working with LLMs.
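As a rough illustration of the idea, a stored prompt might look like the YAML below. The field names are assumptions for the sketch, not the project's documented schema:

```yaml
# Hypothetical prompt-serve style entry; all field names here are
# illustrative assumptions, not the repo's documented schema.
title: summarize-article
description: Condense an article into three bullet points
category: summarization
model: gpt-4          # target model (assumed field)
temperature: 0.2      # sampling setting stored alongside the prompt (assumed field)
prompt: |
  Summarize the following article in three concise bullet points:
  {article_text}
```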

No commits in the last 6 months.

Use this if you need a systematic way to manage, version, and reuse your large language model prompts and their associated configurations across various projects or team members.

Not ideal if you only use a few simple prompts occasionally or prefer to manage them directly within your code without external tooling.

LLM-prompt-management AI-workflow natural-language-processing prompt-engineering MLOps
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 8 / 25
Community 4 / 25

How are scores calculated?

Stars: 29
Forks: 1
Language: Python
License: None
Last pushed: Jul 26, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/deadbits/prompt-serve"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
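The same request can be made from Python with the standard library. Only the URL layout is taken from the curl example above; the shape of the JSON response is not documented here, so the fetch is left as a commented-out sketch:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    # Build the quality endpoint URL for a repo
    # (path layout taken from the curl example above).
    return f"{BASE}/{category}/{repo}"

url = quality_url("prompt-engineering", "deadbits/prompt-serve")
print(url)

# Uncomment to fetch (100 requests/day without a key; the response
# field names are not documented here, so inspect the raw JSON first):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```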