retarfi/language-pretraining
Pre-training Language Models for Japanese
This project provides pre-trained Japanese language models that can understand and process text. It takes raw Japanese text as input, processes it, and outputs models like BERT or ELECTRA that can be used for various text analysis tasks. It's designed for data scientists and NLP engineers working with Japanese language processing.
No commits in the last 6 months.
Use this if you need to perform advanced natural language processing tasks with Japanese text and want to leverage powerful, pre-trained transformer models.
Not ideal if you are looking for an off-the-shelf application to solve a specific business problem, as this requires technical expertise in NLP and model training.
Stars: 50
Forks: 6
Language: Python
License: MIT
Last pushed: Jul 02, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/retarfi/language-pretraining"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
Tongjilibo/bert4torch
An elegant PyTorch implementation of transformers
nyu-mll/jiant
jiant is an NLP toolkit
lonePatient/TorchBlocks
A PyTorch-based toolkit for natural language processing
monologg/JointBERT
PyTorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
grammarly/gector
Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite"...