nlpaueb/greek-bert
A Greek edition of the BERT pre-trained language model
This project provides a version of the BERT language model pre-trained on Greek text. It takes Greek text as input and can predict masked words or produce contextual representations of sentences, yielding more accurate language processing for Greek. It is useful for researchers and developers building applications that need to understand or generate Greek text.
148 stars. No commits in the last 6 months.
Use this if you are developing software or research applications that require advanced natural language understanding or generation for the Greek language.
Not ideal if you are working with languages other than Greek, as this model is specifically trained for Greek text.
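The model is distributed through the Hugging Face hub under the ID `nlpaueb/bert-base-greek-uncased-v1` (the checkpoint name published by the authors). A minimal sketch of masked-word prediction with the `transformers` library, assuming it is installed and the checkpoint can be downloaded:

```python
from transformers import pipeline

# Load the Greek BERT checkpoint into a fill-mask pipeline.
# Downloads the model weights on first use.
fill = pipeline("fill-mask", model="nlpaueb/bert-base-greek-uncased-v1")

# Ask the model to fill in the masked word of a Greek sentence
# ("Today is a [MASK] day."). Returns candidates ranked by score.
for pred in fill("Σήμερα είναι μια [MASK] μέρα."):
    print(pred["token_str"], round(pred["score"], 3))
```

The same checkpoint can also be loaded with `AutoModel`/`AutoTokenizer` to extract sentence embeddings for downstream Greek NLP tasks.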
Stars
148
Forks
12
Language
Python
License
MIT
Category
Last pushed
Jul 25, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/nlpaueb/greek-bert"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
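The same data can be fetched programmatically. A small sketch using only the Python standard library; the base URL and path layout are taken from the curl example above, and the `quality_url`/`fetch_quality` helper names are hypothetical:

```python
import json
import urllib.request

# Base URL as shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality record (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("nlpaueb", "greek-bert"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/nlpaueb/greek-bert
```

`fetch_quality("nlpaueb", "greek-bert")` would then return the parsed JSON record for this repository.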
Higher-rated alternatives
Tongjilibo/bert4torch
An elegant PyTorch implementation of transformers
nyu-mll/jiant
jiant is an NLP toolkit
lonePatient/TorchBlocks
A PyTorch-based toolkit for natural language processing
monologg/JointBERT
PyTorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
grammarly/gector
Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite"...