yaph/charla
A terminal-based chat application that works with AI language models.
Charla is a tool for developers who want to interact with AI language models directly from their terminal. You type text prompts and receive AI-generated responses, with support for both local models served via Ollama and cloud-hosted models such as GitHub Models. It suits developers, data scientists, and AI researchers who prefer a command-line interface for their AI interactions.
Available on PyPI.
Use this if you are a developer who wants a fast, text-based way to experiment with or integrate AI language models without leaving your terminal.
Not ideal if you prefer a graphical user interface for interacting with AI, or if you need advanced features like image generation or complex data analysis.
Stars
12
Forks
—
Language
Python
License
MIT
Category
Last pushed
Dec 03, 2025
Commits (30d)
0
Dependencies
7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yaph/charla"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000/day.
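The same endpoint can be called from code. A minimal sketch in Python using only the standard library; the `quality_url` and `fetch_quality` helper names are ours, and the assumption that the endpoint returns JSON is not confirmed by the listing:

```python
import json
import urllib.request
from urllib.parse import quote

# Base path of the quality API shown in the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository endpoint URL (e.g. .../yaph/charla).
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str, timeout: float = 10.0) -> dict:
    # Fetch and decode the payload; the response schema is not documented here.
    with urllib.request.urlopen(quality_url(owner, repo), timeout=timeout) as resp:
        return json.load(resp)
```

With the keyless tier limited to 100 requests/day, callers should cache responses rather than re-fetching per page load.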
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
A data framework for your LLM applications, focused on server-side solutions
JHubi1/ollama-app
A modern and easy-to-use client for Ollama
serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.