NVIDIA/sphinx-llm
LLM extensions for Sphinx Documentation
This project helps documentation teams make their Sphinx-based content easily digestible for Large Language Models (LLMs) and AI agents. It takes your existing Sphinx documentation as input and generates standardized Markdown files, so AI systems can consume your project's information and answer user queries about it. Technical writers, content strategists, and developer relations teams are the primary users.
16 stars and 16,307 monthly downloads. Available on PyPI.
Use this if you want to ensure your Sphinx documentation is well-indexed by LLMs and AI agents, or if you need to dynamically generate content like page summaries within your documentation using AI.
Not ideal if you're looking for an interactive chatbot directly embedded within your documentation, as its focus is on static content generation and AI consumption.
Stars: 16
Forks: 4
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 27, 2026
Monthly downloads: 16,307
Commits (30d): 0
Dependencies: 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/NVIDIA/sphinx-llm"
Open to everyone: 100 requests/day with no key needed. Get a free key to raise the limit to 1,000/day.
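The same endpoint shown in the curl command above can be called from Python. The sketch below is a minimal example using only the standard library; the endpoint URL comes from this page, but the shape of the JSON response is an assumption, so the fetch helper simply returns whatever object the API sends back.

```python
import json
import urllib.request

# Base path from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def build_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch repo metadata as parsed JSON.

    Assumes the API returns a JSON object; field names are not
    documented here, so no keys are accessed.
    """
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Reproduces the URL from the curl example without hitting the network.
    print(build_url("NVIDIA", "sphinx-llm"))
```

Without an API key this counts against the 100-requests/day anonymous quota, so cache responses rather than fetching on every build.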
Related models
tensorchord/modelz-llm
OpenAI compatible API for LLMs and embeddings (LLaMA, Vicuna, ChatGLM and many others)
synacktraa/tool-parse
Making LLM Tool-Calling Simpler.
gusye1234/llm-as-function
Embed your LLM into a python function
caua1503/llm-tool-fusion
llm-tool-fusion is a Python library that unifies and simplifies the use of tools with LLMs....
murphyhoucn/llm-dev
LLM Dev