UER-py and TencentPretrain

These two projects offer similar pre-training frameworks for training and fine-tuning transformer-based NLP models in PyTorch. UER-py is the older and more widely adopted of the two; TencentPretrain builds on the same codebase and extends it beyond text to multimodal (image and audio) pre-training.

UER-py — Score: 49 (Emerging)
Maintenance 0/25 · Adoption 10/25 · Maturity 16/25 · Community 23/25
Stars: 3,106 · Forks: 524 · Downloads: · Commits (30d): 0 · Language: Python · License: Apache-2.0
Flags: Stale 6m · No Package · No Dependents

TencentPretrain — Score: 48 (Emerging)
Maintenance 0/25 · Adoption 10/25 · Maturity 16/25 · Community 22/25
Stars: 1,090 · Forks: 148 · Downloads: · Commits (30d): 0 · Language: Python · License:
Flags: Stale 6m · No Package · No Dependents

About UER-py

dbiir/UER-py

Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo

This tool helps machine learning engineers and researchers accelerate natural language processing (NLP) projects. You can feed raw text into existing pre-trained language models such as BERT, pre-train new models from scratch, and then fine-tune the result for downstream tasks such as text classification. The output is a specialized model ready for deployment in your NLP applications.

natural-language-processing machine-learning-engineering text-classification language-model-training sentiment-analysis
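The pre-train-then-fine-tune workflow described above follows a three-stage command-line pipeline in UER-py. The sketch below is based on the patterns shown in the project's README; the corpus, vocabulary, and config paths are the README's Chinese book-review example, and exact flag names may differ between UER-py versions.

```shell
# Illustrative sketch of UER-py's three-stage pipeline (flags may vary by version).
git clone https://github.com/dbiir/UER-py.git
cd UER-py

# 1. Preprocess: convert raw text into a binary dataset for pre-training.
python preprocess.py --corpus_path corpora/book_review_bert.txt \
                     --vocab_path models/google_zh_vocab.txt \
                     --dataset_path dataset.pt \
                     --processes_num 8 --data_processor bert

# 2. Pre-train (or continue pre-training) a BERT-style model on the dataset.
python pretrain.py --dataset_path dataset.pt \
                   --vocab_path models/google_zh_vocab.txt \
                   --config_path models/bert/base_config.json \
                   --output_model_path models/book_review_model.bin \
                   --total_steps 5000 --save_checkpoint_steps 1000

# 3. Fine-tune the pre-trained weights on a downstream classification task.
python finetune/run_classifier.py \
                   --pretrained_model_path models/book_review_model.bin \
                   --vocab_path models/google_zh_vocab.txt \
                   --config_path models/bert/base_config.json \
                   --train_path datasets/book_review/train.tsv \
                   --dev_path datasets/book_review/dev.tsv \
                   --test_path datasets/book_review/test.tsv \
                   --epochs_num 3 --batch_size 32
```

Each stage is a separate script, so you can skip pre-training entirely and run stage 3 directly against a downloaded model from the pre-trained model zoo.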

About TencentPretrain

Tencent/TencentPretrain

Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo

This project helps AI engineers and researchers build custom models from raw text, image, or audio data combined with modular model configurations. It outputs pre-trained and fine-tuned models ready for specific tasks such as sentiment analysis or machine reading comprehension.

natural-language-processing computer-vision speech-recognition deep-learning ai-model-development

Scores are updated daily from GitHub, PyPI, and npm data.