retarfi/language-pretraining

Pre-training Language Models for Japanese

Score: 36 / 100 (Emerging)

This project provides pre-trained Japanese language models that can understand and process text. It takes raw Japanese text as input, processes it, and outputs models like BERT or ELECTRA that can be used for various text analysis tasks. It's designed for data scientists and NLP engineers working with Japanese language processing.

No commits in the last 6 months.

Use this if you need to perform advanced natural language processing tasks with Japanese text and want to leverage powerful, pre-trained transformer models.

Not ideal if you are looking for an off-the-shelf application to solve a specific business problem, as this requires technical expertise in NLP and model training.

Tags: Japanese NLP, text analysis, language models, data science, machine learning, engineering
Badges: Stale (6m), No Package, No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 12 / 25


Stars: 50
Forks: 6
Language: Python
License: MIT
Last pushed: Jul 02, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/retarfi/language-pretraining"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
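The same endpoint can be called from Python with the standard library. This is a minimal sketch: the URL pattern is taken from the curl command above, and since the response schema is not documented here, no JSON field names are assumed.

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(registry: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repository."""
    return f"{API_BASE}/{registry}/{owner}/{repo}"


def fetch_quality(registry: str, owner: str, repo: str) -> dict:
    """Fetch and parse the quality report (response fields not assumed)."""
    with urllib.request.urlopen(quality_url(registry, owner, repo)) as resp:
        return json.load(resp)


# Reproduces the URL from the curl example:
url = quality_url("transformers", "retarfi", "language-pretraining")
```

Without an API key this stays within the 100 requests/day anonymous limit; a free key raises it to 1,000/day.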