ymcui/PERT
PERT: Pre-training BERT with Permuted Language Model
PERT improves natural language understanding by providing pre-trained language models for both Chinese and English text. It takes raw text as input and outputs semantically rich representations, which can then be used to enhance text-based applications such as sentiment analysis, question answering, and named entity recognition. It is useful for data scientists, NLP engineers, and researchers building intelligent text-processing systems.
367 stars. No commits in the last 6 months.
Use this if you need to build or improve AI models that understand Chinese or English text, especially for tasks like question answering, text classification, or identifying entities in text.
Not ideal if your primary goal is correcting word order in text (its performance gains vary across tasks), or if you need to integrate it with non-BERT architectures.
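To make the pretraining objective concrete: instead of masking tokens like BERT, PERT shuffles a portion of the input and trains the model to predict what originally stood at each shuffled position. The snippet below is a minimal conceptual sketch of building such a training example on whole words; the function name and parameters are illustrative, and the real PERT implementation operates on WordPiece subwords inside the BERT architecture.

```python
import random

def make_pert_example(tokens, permute_ratio=0.15, seed=0):
    """Toy permuted-LM example: shuffle some token positions in place.

    Returns the permuted token list and a dict mapping each selected
    position to the token the model should predict there (the original
    token at that position). Conceptual sketch only, not PERT's code.
    """
    rng = random.Random(seed)
    n = len(tokens)
    k = max(2, int(n * permute_ratio))           # how many positions to permute
    positions = sorted(rng.sample(range(n), k))  # which slots get shuffled
    selected = [tokens[p] for p in positions]
    shuffled = selected[:]
    rng.shuffle(shuffled)                        # may occasionally be a no-op
    permuted = list(tokens)
    for p, t in zip(positions, shuffled):
        permuted[p] = t
    # prediction targets: position -> original token at that position
    targets = {p: tokens[p] for p in positions}
    return permuted, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
permuted, targets = make_pert_example(tokens)
```

Because the shuffle only moves tokens around, the permuted sequence contains exactly the same words as the input; the model's job is to recover their original order, which is why PERT's representations capture word-order information.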
Stars
367
Forks
25
Language
—
License
Apache-2.0
Category
Last pushed
Jul 15, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ymcui/PERT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Tongjilibo/bert4torch
An elegant PyTorch implementation of transformers
nyu-mll/jiant
jiant is an NLP toolkit
lonePatient/TorchBlocks
A PyTorch-based toolkit for natural language processing
monologg/JointBERT
Pytorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
grammarly/gector
Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite"...