amin-tehrani/ollama-colab
Serve Ollama LLMs on Google Colab (free plan) using Ngrok
If you're building applications with large language models but have run out of free API credits on platforms like OpenAI or Anthropic, and you can't run models locally because of limited hardware, this project lets you use free Google Colab resources to host Ollama language models. It exposes your Colab environment as an API endpoint, so your application can consume the model's outputs remotely.
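Once the Colab notebook has started the Ollama server and opened an ngrok tunnel, your application talks to Ollama's standard REST API through the tunnel's public URL. A minimal client-side sketch, assuming a placeholder ngrok URL and the model name `llama3` (both are illustrative, not values from this repo):

```python
# Sketch: building a request for Ollama's /api/generate endpoint,
# reached through an ngrok tunnel. The URL below is a placeholder;
# the Colab notebook prints the real one when the tunnel opens.
import json

def build_generate_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Return the endpoint URL and JSON body for a non-streaming generate call."""
    url = f"{base_url.rstrip('/')}/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return url, body

url, body = build_generate_request(
    "https://example-tunnel.ngrok-free.app",  # placeholder public URL
    "llama3",                                 # any model pulled in the Colab session
    "Say hello in one word.",
)
# Send with urllib.request, requests, or any HTTP client:
#   urllib.request.urlopen(urllib.request.Request(url, data=body,
#       headers={"Content-Type": "application/json"}))
```

Because the tunnel simply forwards to Ollama's normal HTTP interface, any existing Ollama client library should also work if pointed at the public URL.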
No commits in the last 6 months.
Use this if you are a developer building an LLM-powered application and need a free way to host open-source language models without exhausting API credits or local machine resources.
Not ideal if you already have sufficient local computing power to host large language models or have a budget for commercial LLM API services.
Stars
26
Forks
5
Language
Jupyter Notebook
License
MIT
Category
Last pushed
May 18, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/amin-tehrani/ollama-colab"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.