quantalogic/qllm
QLLM: A powerful CLI for seamless interaction with multiple Large Language Models. It simplifies AI workflows, streamlines development, and puts cutting-edge language models at your fingertips from the terminal.
QLLM is a command-line tool for developers and data analysts that streamlines interaction with various Large Language Models. It allows users to input text prompts or data, such as CSV files, directly into their terminal and receive AI-generated summaries, reports, creative content, or analyses. This tool helps integrate AI capabilities seamlessly into existing command-line workflows.
No commits in the last 6 months. Available on npm.
Use this if you are a developer or data analyst who frequently uses the terminal and wants to quickly access and leverage multiple Large Language Models for tasks like content generation, data analysis, or summarization without switching tools.
Not ideal if you prefer graphical user interfaces for AI interactions or if you are not comfortable working within a command-line environment.
Stars: 35
Forks: 9
Language: TypeScript
License: —
Category:
Last pushed: Apr 11, 2025
Monthly downloads: 119
Commits (30d): 0
Dependencies: 13
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/quantalogic/qllm"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
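The endpoint above appears to follow the pattern /api/v1/quality/{category}/{owner}/{repo}. That pattern is inferred from the single example shown here, so treat it as an assumption rather than documented API behavior. A minimal Python sketch for constructing and fetching such a URL:

```python
import urllib.request

# Base path of the API; the "llm-tools" category segment is inferred
# from the one example above and may differ for other tool categories.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def api_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data URL for a given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = api_url("llm-tools", "quantalogic", "qllm")
print(url)
# → https://pt-edge.onrender.com/api/v1/quality/llm-tools/quantalogic/qllm

# Uncomment to fetch the data (no key needed up to 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     print(resp.read().decode())
```

The fetch itself is left commented out so the snippet runs offline; swap in your preferred HTTP client if you have one.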
Related tools
containers/ramalama
RamaLama is an open-source developer tool that simplifies the local serving of AI models from...
av/harbor
One command brings a complete pre-wired LLM stack with hundreds of services to explore.
RunanywhereAI/runanywhere-sdks
Production-ready toolkit to run AI locally.
runpod-workers/worker-vllm
The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
foldl/chatllm.cpp
Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU)