emapco/rk-transformers
Export and Run Hugging Face Transformers Models on Rockchip NPUs
This tool helps developers deploy Hugging Face transformer models on Rockchip-powered edge devices. It converts a pre-trained language model into an optimized format that can run on the NPU of devices such as the RK3588 or RK3576, delivering significantly faster inference than CPU execution. It is aimed at software and machine learning engineers who need to embed AI capabilities in low-power embedded systems.
Available on PyPI.
Use this if you are developing AI applications for embedded systems and need to accelerate the performance of transformer models on Rockchip NPUs.
Not ideal if you are working with large-scale cloud-based AI deployments or do not use Rockchip hardware for your edge devices.
Stars
24
Forks
2
Language
Python
License
Apache-2.0
Category
Last pushed
Dec 02, 2025
Commits (30d)
0
Dependencies
7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/emapco/rk-transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
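The same lookup can be done programmatically. Below is a minimal Python sketch based on the curl example above; the endpoint URL is taken from that example, but the structure of the JSON response is an assumption and may differ from what the live API returns.

```python
# Sketch of querying the quality API shown in the curl example above.
# The endpoint path comes from that example; the JSON response fields
# are NOT documented here and are an assumption.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality data."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (no API key: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


print(quality_url("transformers", "emapco", "rk-transformers"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/emapco/rk-transformers
```

With a free API key the limit rises to 1,000 requests per day; how the key is passed (header or query parameter) is not specified on this page.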
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks