lucianoayres/nino-cli
Nino is a CLI tool that interacts with local language models via Ollama's serve mode, enabling real-time responses and easy output saving directly from the terminal.
This tool helps you quickly get answers, summaries, or insights from large language models running on your own computer, without sending your data to the cloud. You provide a question, some text, an image, or even live data piped from other command-line tools, and it streams a real-time response that you can read in your terminal or save to a file. Anyone who wants to use AI for data analysis, content generation, or quick information retrieval while keeping their data private would find this useful.
No commits in the last 6 months.
Use this if you want to interact with AI models for tasks like text analysis, content generation, or image description right from your terminal, without relying on cloud services, while keeping all your data local.
Not ideal if you prefer graphical interfaces for AI interactions or if you don't want to manage local AI model installations.
Stars
7
Forks
1
Language
Go
License
MIT
Category
Last pushed
Nov 09, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/lucianoayres/nino-cli"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
cel-ai/celai
Open source framework designed to accelerate the development of omnichannel AI virtual assistants.
sauravpanda/BrowserAI
Run local LLMs like llama, deepseek-distill, kokoro and more inside your browser
lone-cloud/gerbil
A desktop app for running Large Language Models locally.
vinjn/llm-metahuman
An open solution for AI-powered photorealistic digital humans.
cztomsik/ava
All-in-one desktop app for running LLMs locally.