alisafaya/Arabic-BERT
Arabic edition of BERT pretrained language models
This project provides BERT language models pretrained on a large Arabic corpus. Given raw Arabic text, the models produce contextual representations that capture meaning, including nuances from various dialects. Data scientists and NLP engineers working on Arabic-language applications would fine-tune them for tasks like sentiment analysis, text classification, or named entity recognition.
133 stars. No commits in the last 6 months.
Use this if you are building an application that needs to process and understand Arabic text with high accuracy, especially if your data includes different dialects.
Not ideal if your primary focus is on languages other than Arabic, as these models are specifically trained for Arabic text.
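The tasks above (sentiment analysis, classification, NER) all start from loading the pretrained checkpoint. A minimal sketch of querying it for masked-token predictions, assuming the Hugging Face `transformers` library and the checkpoint id `asafaya/bert-base-arabic` (inferred from the author's account name, not stated on this page):

```python
# Hedged sketch: masked-token prediction with the Arabic BERT checkpoint
# via the Hugging Face `transformers` fill-mask pipeline.
# The model id "asafaya/bert-base-arabic" is an assumption; substitute
# whichever checkpoint you actually use.

def top_predictions(text: str, model: str = "asafaya/bert-base-arabic", k: int = 3):
    """Return the k most likely tokens for the [MASK] slot in `text`."""
    from transformers import pipeline  # imported lazily: heavy dependency
    fill = pipeline("fill-mask", model=model)
    return [result["token_str"] for result in fill(text)[:k]]

if __name__ == "__main__":
    # "The capital of Egypt is [MASK]." in Arabic.
    print(top_predictions("عاصمة مصر هي [MASK]."))
```

The first call downloads the model weights, so expect a delay; afterwards the pipeline runs locally.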
Stars
133
Forks
20
Language
—
License
MIT
Category
Last pushed
Dec 05, 2020
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/alisafaya/Arabic-BERT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
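The same endpoint can be called from code. A sketch using only the standard library, with the URL pattern (category `nlp` plus the repo's `owner/name` path) inferred from the curl example above:

```python
# Hedged sketch of calling the quality API shown above from Python.
# The endpoint pattern is inferred from the example curl command; the
# response is assumed to be JSON, which the API's usage implies.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the API URL for a repo, e.g. ('nlp', 'alisafaya/Arabic-BERT')."""
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    """Fetch and decode the quality record for one repository."""
    with urllib.request.urlopen(quality_url(category, repo), timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(fetch_quality("nlp", "alisafaya/Arabic-BERT"))
```

Keyless access is limited to 100 requests/day, so cache responses rather than polling.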
Higher-rated alternatives
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
JayYip/m3tl
BERT for Multitask Learning
920232796/bert_seq2seq
PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging, plus the T5 model and GPT-2 for text continuation.
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
graykode/toeicbert
TOEIC(Test of English for International Communication) solving using pytorch-pretrained-BERT model.