modzy/hugging-face-raspberry-pi
Deploy, serve, and run a Hugging Face model on a Raspberry Pi with just a few lines of code
This project helps data scientists, machine learning engineers, and IoT developers deploy Hugging Face models onto low-power edge devices like the Raspberry Pi. It takes a pre-trained Hugging Face model and packages it into a Docker container that runs as a microservice on a Raspberry Pi. The result is an AI model running locally on a small device, ready for integration into edge computing applications.
No commits in the last 6 months.
Use this if you need to run AI models directly on embedded systems, IoT devices, or other resource-constrained hardware to perform inference locally.
Not ideal if you plan to run your models exclusively on powerful cloud servers or if you are not comfortable with Docker and basic Linux command-line operations.
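Once the containerized model is running on the Pi, other applications talk to it over HTTP. The sketch below shows one way to query such a microservice from Python; the hostname, port, endpoint path, and JSON request shape are all assumptions for illustration, so check the project's notebooks for the container's actual API.

```python
import json
from urllib import request

# Hypothetical endpoint: host, port, and path depend on how the
# container is configured on your network.
PI_ENDPOINT = "http://raspberrypi.local:8080/predict"


def build_payload(text: str) -> bytes:
    """Encode an inference request body (assumed JSON shape)."""
    return json.dumps({"text": text}).encode("utf-8")


def run_inference(text: str) -> dict:
    """POST the payload to the model microservice running on the Pi."""
    req = request.Request(
        PI_ENDPOINT,
        data=build_payload(text),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


# Usage (on a machine that can reach the Pi):
#   result = run_inference("Hello from the edge")
```

Keeping inference behind a plain HTTP interface like this is what makes the Docker-on-Pi approach convenient: edge applications need no knowledge of the underlying model framework.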
Stars: 60
Forks: 7
Language: Jupyter Notebook
License: Apache-2.0
Category: ml-frameworks
Last pushed: Dec 05, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/modzy/hugging-face-raspberry-pi"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
Higher-rated alternatives
huggingface/huggingface_hub
The official Python client for the Hugging Face Hub.
huggingface/hub-docs
Documentation for the Hugging Face Hub.
huggingface/huggingface.js
Use Hugging Face with JavaScript
dice-group/Ontolearn
OWL Class Expressions Learning in Python
KRR-Oxford/DeepOnto
A package for ontology engineering with deep learning and language models.