lvyufeng/cybertron-ai
mindspore implementation of transformers
Cybertron is a MindSpore framework that lets researchers and developers implement and experiment with advanced transformer models. Given a pre-trained model name and text input, it produces model outputs for a range of natural language processing tasks. It is aimed at AI researchers and developers building or studying large language models on the MindSpore framework.
No commits in the last 6 months.
Use this if you are a researcher or developer using MindSpore and need a flexible framework to implement, load, and experiment with transformer models, including those compatible with Hugging Face.
Not ideal if you are not working with MindSpore or if you need a solution for deploying existing transformer models without deep customization or research.
Stars
68
Forks
11
Language
Python
License
Apache-2.0
Category
Last pushed
Jan 30, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/lvyufeng/cybertron-ai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks