dbiir/UER-py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
This tool helps machine learning engineers and researchers accelerate natural language processing (NLP) projects. It lets you feed raw text into existing pre-trained language models such as BERT, pre-train new ones, and then fine-tune these models for downstream tasks such as text classification. The output is a specialized model ready for deployment in your NLP applications.
3,106 stars. No commits in the last 6 months.
Use this if you need to quickly build or customize state-of-the-art natural language processing models for tasks like sentiment analysis, question answering, or text generation, leveraging existing powerful pre-trained models.
Not ideal if you are working with multi-modal data (for example, text and images) or need to train extremely large language models with billions of parameters; in those cases its successor, TencentPretrain, is a better fit.
Stars: 3,106
Forks: 524
Language: Python
License: Apache-2.0
Category:
Last pushed: May 09, 2024
Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/dbiir/UER-py"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
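The same endpoint can be called from Python. A minimal sketch, assuming only the URL shown in the curl example above; the structure of the JSON response is not documented here, so the fetch helper simply returns the parsed payload as-is.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch quality data for a repository.

    Works without an API key (100 requests/day); the response
    schema is an assumption and should be inspected on first use.
    """
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("dbiir", "UER-py"))
```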
Related models
Tencent/TencentPretrain
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
cahya-wirawan/indonesian-language-models
Indonesian Language Models and their Usage
XCollab/HuggingFace
This repository provides an overview of Hugging Face's Transformers library, a powerful tool for...
MTxSouza/MediumArticleGenerator
A Language Model (LLM) trained to generate text similar to Medium articles.
shivendrra/enigma
A DNA sequence generation/classification model using transformers