zerob13/modelinfo-cli
A CLI to query AI model capabilities, context limits, and pricing from PublicProviderConf.
This tool helps AI developers quickly find detailed information about AI models. It lets you query model capabilities, pricing, and context limits directly from the command line, so you can keep up-to-date model specs at hand for your projects.
Use this if you are an AI developer who needs fast, offline-capable access to AI model metadata, including pricing, capabilities, and token limits.
Not ideal if you prefer a graphical user interface for browsing AI model information or are not comfortable using a command-line tool.
Stars
8
Forks
—
Language
TypeScript
License
MIT
Category
—
Last pushed
Mar 08, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/zerob13/modelinfo-cli"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
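The same endpoint can be called programmatically. A minimal TypeScript sketch, assuming only what the curl example above shows (the host, path pattern, and rate limits); the JSON response schema is not documented here, so the result is left untyped:

```typescript
// Base URL taken from the curl example above; the owner/repo path
// pattern is an assumption generalized from that single example.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers";

function buildQualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${owner}/${repo}`;
}

async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(buildQualityUrl(owner, repo));
  if (!res.ok) {
    // Unauthenticated access is limited to 100 requests/day,
    // so a 429 here likely means the daily quota was exhausted.
    throw new Error(`Request failed with status ${res.status}`);
  }
  return res.json();
}

// Example (uncomment to run; requires network access):
// fetchQuality("zerob13", "modelinfo-cli").then((data) => console.log(data));
```

This keeps URL construction separate from the network call, so the path logic can be reused or tested without issuing a request.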
Higher-rated alternatives
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.