lucianoayres/nino-cli

Nino is a CLI tool that interacts with local language models through Ollama's serve mode, streaming real-time responses and saving output directly from the terminal.

Score: 29/100 (Experimental)

This tool helps you quickly get answers, summaries, or insights from large language models running directly on your computer, without sending your data to the cloud. You provide a question, some text, an image, or even live data from other command-line tools, and it gives you a real-time response that you can read in your terminal or save to a file. Anyone who wants to leverage AI for data analysis, content generation, or quick information retrieval, while keeping their data private, would find this useful.

No commits in the last 6 months.

Use this if you need to interact with AI models for tasks like text analysis, content generation, or image description directly from your terminal, keeping all your data local instead of relying on cloud services.

Not ideal if you prefer graphical interfaces for AI interactions or if you don't want to manage local AI model installations.

Tags: data-analysis, local-AI, text-generation, image-analysis, command-line-automation
Status: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 9 / 25


Stars: 7
Forks: 1
Language: Go
License: MIT
Last pushed: Nov 09, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/lucianoayres/nino-cli"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.