ParthaPRay/Ollama_GoogleColab_colabxterm_langchain

This repo contains code that uses the colabxterm and LangChain community packages to install Ollama on the Google Colab free-tier T4 runtime, pull a model from the Ollama library, and chat with it.

Score: 27 / 100 — Experimental

This project helps AI/ML practitioners and researchers quickly set up and interact with open-source large language models (LLMs) like Llama 3 directly within a Google Colab environment. You input commands in a virtual terminal within Colab to install Ollama and pull an LLM, then use standard Python to chat with it. This is designed for those experimenting with or prototyping LLM applications.
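The workflow above can be sketched as a short Colab notebook. This is a minimal sketch, not the repo's exact notebook: it assumes a Colab runtime, uses the official Ollama install script, and picks `llama3` as an example model name.

```python
# Cell 1: install the helper packages (Colab notebook syntax, not plain Python)
!pip install -q colabxterm langchain-community

# Cell 2: open a virtual terminal inside the notebook
%load_ext colabxterm
%xterm
# In the terminal that appears, install and start Ollama, then pull a model:
#   curl -fsSL https://ollama.com/install.sh | sh
#   ollama serve &
#   ollama pull llama3

# Cell 3: chat with the pulled model via the LangChain community wrapper
from langchain_community.llms import Ollama

llm = Ollama(model="llama3")  # connects to the local Ollama server started above
print(llm.invoke("Explain what a T4 GPU is in one sentence."))
```

Because the terminal steps run interactively and the last cell needs a live Ollama server, this only works end-to-end inside the Colab session, not as a standalone script.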

No commits in the last 6 months.

Use this if you want to test or develop with Ollama and its compatible models (like Llama 3) without needing local powerful hardware, leveraging Google Colab's free T4 GPU.

Not ideal if you need a persistent, production-ready LLM deployment or if you prefer a fully local setup outside of a cloud notebook environment.

Tags: AI/ML Experimentation · LLM Prototyping · Cloud Computing · Machine Learning Research · Model Evaluation
Badges: Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 6 / 25

How are scores calculated?

Stars: 13
Forks: 1
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: May 09, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ParthaPRay/Ollama_GoogleColab_colabxterm_langchain"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.