jpmanson/llm_templates
A library for building instruction/chat prompts for text-generation LLMs. It supports local and Hugging Face models.
This tool helps developers correctly format chat conversations for large language models (LLMs) like Llama, Mistral, and Gemma. You provide a list of messages, each with a role (user, assistant) and content, and it outputs a single string formatted precisely as the LLM expects. This ensures your LLM applications perform as intended, preventing common errors due to incorrect input formatting.
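To illustrate the problem this library solves, the sketch below hand-rolls a Llama-2-style chat template in plain Python. This is not llm_templates' actual API; the function and variable names are hypothetical, and it covers only one model family, which is exactly the per-model bookkeeping such a library abstracts away.

```python
# Illustrative sketch of chat-prompt templating (Llama-2-style format).
# NOT llm_templates' API -- names here are hypothetical examples.

def format_llama2_chat(messages):
    """Render [{'role': ..., 'content': ...}] dicts as one prompt string."""
    parts = []
    system = ""
    for msg in messages:
        if msg["role"] == "system":
            # Llama 2 folds the system prompt into the first user turn.
            system = f"<<SYS>>\n{msg['content']}\n<</SYS>>\n\n"
        elif msg["role"] == "user":
            parts.append(f"<s>[INST] {system}{msg['content']} [/INST]")
            system = ""
        elif msg["role"] == "assistant":
            parts.append(f" {msg['content']} </s>")
    return "".join(parts)

prompt = format_llama2_chat([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi!"},
])
print(prompt)
```

Other model families (Mistral, Gemma, ChatML-based models) each expect different delimiters and system-prompt handling, so getting this wrong for even one model silently degrades output quality.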
No commits in the last 6 months. Available on PyPI.
Use this if you are a developer building applications with various LLMs and need a reliable way to format chat prompts to match each model's specific training format.
Not ideal if you are an end-user looking for a no-code solution to interact with LLMs, or if you only work with a single LLM and already handle its specific prompt formatting manually.
Stars
33
Forks
1
Language
Python
License
MIT
Category
Last pushed
May 21, 2025
Commits (30d)
0
Dependencies
2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jpmanson/llm_templates"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
posit-dev/chatlas
Your friendly guide to building LLM chat apps in Python with less effort and more clarity.
xming521/WeClone
🚀 One-stop solution for creating your AI twin from chat history 💡 Fine-tune LLMs with your chat...
ooyinet/WeClone
🚀 A one-stop solution for creating a digital twin from chat history 💡 Fine-tune an LLM on your chat logs so the model picks up your personal style, then bind it to a chatbot to realize your own digital twin. Digital clone / digital twin / digital immortality / LLM / chatbot / LoRA
vemonet/libre-chat
🦙 Free and Open Source Large Language Model (LLM) chatbot web UI and API. Self-hosted, offline...
qqqqqf-q/MirrorFlow
From Dialogue Data to a Training Closed Loop: Digital Twin + Model Distillation