amin-tehrani/ollama-colab

Serve Ollama LLMs on Google Colab (free plan) using Ngrok

Score: 38 / 100 (Emerging)

If you're building applications with large language models but have run out of free API credits on platforms like OpenAI or Anthropic, and limited hardware keeps you from running models locally, this project lets you host Ollama language models on free Google Colab resources. It exposes your Colab environment as an API endpoint (via Ngrok), serving model outputs to your application.
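Once the notebook is running and Ngrok has exposed the Ollama server, a client can talk to it like any Ollama HTTP endpoint. A minimal sketch, assuming a hypothetical tunnel URL (yours is printed by the notebook) and the standard Ollama `/api/generate` endpoint:

```python
import json
import urllib.request

# Hypothetical public URL printed by the Colab notebook; yours will differ.
OLLAMA_URL = "https://example-tunnel.ngrok-free.app"

def build_generate_request(prompt, model="llama3"):
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one JSON response instead of a stream
    }
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires the tunnel to be up):
# with urllib.request.urlopen(build_generate_request("Why is the sky blue?")) as resp:
#     print(json.load(resp)["response"])
```

The model name and URL above are placeholders; swap in whatever model you pulled inside the notebook.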

No commits in the last 6 months.

Use this if you are a developer building an LLM-powered application and need a free, accessible way to host open-source language models without exhausting API credits or local machine resources.

Not ideal if you already have sufficient local computing power to host large language models or have a budget for commercial LLM API services.

Tags: LLM-development, application-prototyping, API-hosting, cloud-resource-leveraging
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 26
Forks: 5
Language: Jupyter Notebook
License: MIT
Last pushed: May 18, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/amin-tehrani/ollama-colab"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
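The same data can be fetched from Python instead of curl. A minimal sketch using the endpoint from the curl command above; the `X-API-Key` header name for the keyed tier is an assumption:

```python
import urllib.request

API_URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/amin-tehrani/ollama-colab"

def build_quality_request(api_key=None):
    """Build a GET request for the quality endpoint; a key raises the daily limit."""
    headers = {}
    if api_key:
        headers["X-API-Key"] = api_key  # header name is an assumption, not documented above
    return urllib.request.Request(API_URL, headers=headers)

# Fetching it (network call):
# import json
# with urllib.request.urlopen(build_quality_request()) as resp:
#     data = json.load(resp)
```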