brendancopley/mcp-chain-of-draft-prompt-tool
MCP prompt tool applying Chain-of-Draft (CoD) reasoning - BYOLLM
This tool helps anyone working with Large Language Models (LLMs) get better, more efficient results from their prompts. It takes a regular prompt and automatically transforms it into a structured Chain-of-Draft (CoD) or Chain-of-Thought (CoT) format; the LLM processes the enhanced prompt, and the tool distills the verbose response back into a clear, concise answer. This improves the reasoning quality of LLM outputs while reducing operational cost and response time.
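The transform-then-distill flow described above can be sketched in a few lines of Python. Note the wrapper text, the `####` answer separator, and both function names are illustrative assumptions for this sketch, not the tool's actual prompt templates:

```python
# Hypothetical sketch of a Chain-of-Draft (CoD) prompt transformation.
# The instruction wording and '####' separator are assumptions, not the
# tool's real templates; you bring your own LLM (BYOLLM) to run the prompt.

def to_chain_of_draft(prompt: str, max_words_per_step: int = 5) -> str:
    """Wrap a plain prompt in CoD instructions: the model reasons in terse
    draft steps instead of verbose chain-of-thought prose."""
    return (
        "Think step by step, but keep each reasoning step to a minimal "
        f"draft of at most {max_words_per_step} words.\n"
        "Return the final answer after the separator '####'.\n\n"
        f"Question: {prompt}"
    )

def extract_answer(llm_response: str) -> str:
    """Strip the draft steps and keep only the final answer."""
    return llm_response.split("####")[-1].strip()

enhanced = to_chain_of_draft("What is 17 * 24?")
# 'enhanced' is sent to your chosen LLM; a raw CoD-style response might
# look like: "17*20 + 17*4 -> 340 + 68 #### 408"
print(extract_answer("17*20 + 17*4 -> 340 + 68 #### 408"))  # → 408
```

Shorter draft steps are what make CoD cheaper and faster than full Chain-of-Thought: the model emits far fewer reasoning tokens before the final answer.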
No commits in the last 6 months.
Use this if you need to consistently get more accurate, faster, and cost-effective answers from your chosen LLM, especially for complex reasoning tasks.
Not ideal if you only use LLMs for very simple, single-turn prompts where basic responses are sufficient and efficiency isn't a primary concern.
Stars: 18
Forks: 3
Language: Python
License: —
Category:
Last pushed: Sep 08, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/brendancopley/mcp-chain-of-draft-prompt-tool"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
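The same endpoint shown in the curl command can be called from Python. The URL comes from the listing above; the shape of the returned JSON is not documented here, so the sketch below just decodes and returns whatever the service sends:

```python
# Query the listing's public quality API for a given MCP repository.
# The endpoint URL is taken from the page above; the response schema is
# an assumption, so we return the decoded JSON as-is.
import json
import urllib.request

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"https://pt-edge.onrender.com/api/v1/quality/mcp/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (live network call)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

url = quality_url("brendancopley", "mcp-chain-of-draft-prompt-tool")
# fetch_quality(...) performs a live HTTP request (counts against the
# 100 requests/day anonymous quota); uncomment to run:
# print(fetch_quality("brendancopley", "mcp-chain-of-draft-prompt-tool"))
```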
Higher-rated alternatives
jddunn/tenets
Local-first MCP server for intelligent context that feeds your prompts
Ejb503/systemprompt-mcp-core
The core MCP extension for Systemprompt MCP multimodal client
langfuse/mcp-server-langfuse
Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to...
n0zer0d4y/mercury-spec-ops
Modular MCP server for programmatic prompt engineering. Provides intelligent prompt assembly for...
systempromptio/systemprompt-mcp-server
A complete, production-ready implementation of a Model Context Protocol (MCP) server...