octanove/shiba
PyTorch implementation and pre-trained Japanese model for CANINE, the efficient character-level transformer.
This project helps natural language processing practitioners analyze and classify Japanese text without vocabulary limitations and without special handling for very long documents. It takes raw Japanese characters as input and produces text embeddings or classifications, enabling tasks such as news article categorization and word segmentation. It is aimed at data scientists and NLP engineers working with diverse or large Japanese text datasets.
No commits in the last 6 months.
Use this if you need an efficient character-level Japanese language model that handles unseen words and long text sequences effectively.
Not ideal if your primary task is highly accurate, dictionary-based Japanese word segmentation where tools like MeCab typically excel.
Stars: 89
Forks: 14
Language: Python
License: —
Category:
Last pushed: Nov 03, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/octanove/shiba"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
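The curl command above can also be wrapped in a small Python client. This is a minimal sketch using only the standard library; the `Authorization: Bearer` header for keyed access and the JSON response shape are assumptions, not documented behavior of this API.

```python
import json
import urllib.request
from typing import Optional

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str, api_key: Optional[str] = None) -> dict:
    """Fetch quality data for a repository and parse it as JSON.

    Passing api_key is assumed to raise the rate limit from 100 to
    1,000 requests/day; the header name here is a guess.
    """
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        # Hypothetical auth scheme -- confirm against the API's docs.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


print(quality_url("octanove", "shiba"))
# https://pt-edge.onrender.com/api/v1/quality/transformers/octanove/shiba
```

Keyless calls work for the 100/day tier; `fetch_quality("octanove", "shiba")` mirrors the curl example above.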
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...