STAR-173/LLMSession-Docker
A unified REST API that wraps web-based LLM sessions (ChatGPT, Claude, Google AI Studio) into a standard interface using headless automation. Access your web-tier subscriptions programmatically without per-token API costs.
Behind the scenes, headless browser automation drives each provider's chat interface: you submit a text prompt, the system relays it to your chosen AI, and the response comes back as if you had typed it yourself.
Use this if you want to integrate web-based AI chatbots such as ChatGPT, Claude, or Google AI Studio into automated workflows, leveraging your existing free or paid subscriptions instead of metered API access.
Not ideal if you need enterprise-grade security for public-facing applications or require high-volume, highly concurrent AI interactions without any rate-limiting considerations.
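A client for such a unified interface might look like the following minimal Python sketch. The base URL, route, and payload shape are assumptions for illustration, not the project's documented API; the request is constructed but not sent, so the snippet runs without a live container.

```python
import json
import urllib.request

# Assumed local Docker deployment; the port and route are hypothetical.
BASE_URL = "http://localhost:8000"

def build_prompt_request(provider: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a POST request carrying a prompt
    for one of the wrapped web sessions, e.g. "chatgpt" or "claude"."""
    payload = json.dumps({"provider": provider, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/v1/chat",  # assumed route, not from the project docs
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_prompt_request("claude", "Summarize this changelog in one line.")
# Sending would be urllib.request.urlopen(req), omitted here because it
# requires a running instance of the service.
```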
Stars: 16
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Dec 01, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/STAR-173/LLMSession-Docker"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
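The same lookup can be scripted. A minimal Python sketch, using only the URL shown in the curl command above; the live call is left commented out so the snippet runs offline, and the shape of the JSON response is not assumed here.

```python
import urllib.request

# Endpoint from the curl example above: /{owner}/{repo} under llm-tools.
API = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data URL for a given repository."""
    return f"{API}/{owner}/{repo}"

url = quality_url("STAR-173", "LLMSession-Docker")
# Uncomment to fetch live (counts against the 100 requests/day limit):
# with urllib.request.urlopen(url) as resp:
#     data = resp.read()
```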
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.