megagonlabs/t5-japanese
Codes to pre-train Japanese T5 models
This project provides code to pre-train T5 models on large corpora of Japanese web text, producing language models that can understand and generate Japanese. It is aimed at data scientists and machine learning engineers building natural language processing applications for Japanese content.
No commits in the last 6 months.
Use this if you are a machine learning engineer or data scientist developing applications that require understanding or generating Japanese text.
Not ideal if you are looking for an out-of-the-box application rather than a foundational model for further development.
Stars
40
Forks
3
Language
Python
License
Apache-2.0
Category
Last pushed
Sep 07, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/megagonlabs/t5-japanese"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
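The curl call above can also be made from Python with the standard library. A minimal sketch, assuming only the endpoint shown above; the helper names are illustrative and the JSON response schema is not documented here:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a given repository."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (schema not documented here)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Prints the same URL the curl example uses.
    print(quality_url("megagonlabs", "t5-japanese"))
```

Without a key this stays within the 100 requests/day anonymous limit; with a key, authentication details would follow whatever scheme the service documents.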
Higher-rated alternatives
xv44586/toolkit4nlp
transformers implementation (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...