richarddwang/electra_pytorch
Pretrain and finetune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
This project provides a verified recipe for pretraining and fine-tuning text encoders, specifically the ELECTRA model. It takes raw text as input and produces a trained language model that can then be applied to downstream text classification and understanding tasks. It is aimed at language model developers, machine learning researchers, and data scientists working on advanced natural language processing problems.
331 stars. No commits in the last 6 months.
Use this if you need to pretrain a high-performance text encoder from scratch or fine-tune an existing one for specific natural language understanding benchmarks.
Not ideal if you want a simple, out-of-the-box solution for common NLP tasks and do not want to delve into model pretraining or advanced configuration.
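For context, fine-tuning a pretrained ELECTRA encoder for a classification task typically looks like the following minimal sketch using the Hugging Face transformers API; the checkpoint name, labels, and hyperparameters here are illustrative and are not taken from this repository.

import torch
from transformers import ElectraTokenizerFast, ElectraForSequenceClassification

# Illustrative public checkpoint; not a checkpoint produced by this repository.
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
model = ElectraForSequenceClassification.from_pretrained(
    "google/electra-small-discriminator", num_labels=2
)

# Toy batch: two sentences with binary sentiment labels.
batch = tokenizer(
    ["a great movie", "a terrible movie"],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([1, 0])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)  # passing labels makes the forward pass return a loss
outputs.loss.backward()
optimizer.step()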
Stars: 331
Forks: 43
Language: Python
License: —
Category: NLP
Last pushed: Jan 10, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/richarddwang/electra_pytorch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
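If you prefer to call the endpoint from Python rather than curl, a minimal sketch with the requests library follows; the response is assumed to be JSON, and the exact fields it contains are not documented here, so inspect them before relying on any of them.

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/nlp/richarddwang/electra_pytorch"
resp = requests.get(url, timeout=10)
resp.raise_for_status()

data = resp.json()  # assumes the endpoint returns a JSON body
print(data)         # print the returned fields to see what is available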
Higher-rated alternatives
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
JayYip/m3tl
BERT for Multitask Learning
920232796/bert_seq2seq
PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging; supports the T5 model and GPT2 for text continuation.
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
graykode/toeicbert
TOEIC(Test of English for International Communication) solving using pytorch-pretrained-BERT model.