Pyenb/Ollama-models
A collection of zipped Ollama models for offline use. Simply download, extract, and set up your desired model anywhere.
This project provides pre-packaged Ollama language models for computers without an internet connection: you download the compressed model files, extract them, and place them in the correct directory. It is intended for system administrators and users who need to run large language models in offline or air-gapped environments.
Use this if you need to run Ollama-compatible AI models on systems that do not have continuous internet access for downloading.
Not ideal if your computer has a stable internet connection and you prefer to download models directly through the Ollama client.
Stars: 88
Forks: 7
Language: Shell
License: GPL-3.0
Category:
Last pushed: Nov 02, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Pyenb/Ollama-models"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related models
jaypatel15406/Ollama-Adaptive-Image-Code-Gen
Ollama Adaptive Image Code Gen is an asynchronous Python application that uses LLMs to...
Marvin-VW/python-ollama-local
This Python script enables hands-free interaction with a local Llama2 language model. It...
cognisoc/mullama
Drop-in Ollama replacement. All-in-one local LLM toolkit.
chaoluond/safetyllama
Finetune LLaMA-2-7b-chat to perform safety evaluation of user-bot conversation
rajkundalia/error-analyzer-with-baml
Analyze Java compilation and runtime errors using BAML with a local Ollama model. This project...