ayaka14732/bart-base-jax
JAX implementation of the bart-base model
This project offers a foundational JAX implementation of the BART-base language model. It takes raw text data, such as Wikipedia articles, processes it for training, and produces a trained language model. It is aimed at machine learning researchers and practitioners developing and experimenting with Transformer-based language models.
No commits in the last 6 months.
Use this if you are a machine learning researcher who needs a flexible JAX codebase to study and build new Transformer-based language models, especially on Google Cloud TPUs.
Not ideal if you are looking for an out-of-the-box language model to use for tasks like summarization or translation without deep architectural modifications.
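For contrast, the reference bart-base checkpoint can be used out of the box via Hugging Face's Flax classes. A minimal sketch using the transformers library (not this repo's code) to load the model this project reimplements:

# Sketch: load the reference facebook/bart-base checkpoint with
# Hugging Face's Flax/JAX classes (the model this project reimplements).
from transformers import BartTokenizer, FlaxBartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = FlaxBartForConditionalGeneration.from_pretrained("facebook/bart-base")

inputs = tokenizer("JAX implementation of BART", return_tensors="np")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)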
Stars: 34
Forks: 4
Language: Python
License: —
Category: nlp
Last pushed: Apr 11, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/ayaka14732/bart-base-jax"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
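The same endpoint can be called from any HTTP client. A minimal Python sketch, assuming only the URL shown above (the response schema is not documented here):

# Sketch: fetch the repo-quality data for this project in Python.
import requests

resp = requests.get(
    "https://pt-edge.onrender.com/api/v1/quality/nlp/ayaka14732/bart-base-jax",
    timeout=10,
)
resp.raise_for_status()  # raise on HTTP errors (e.g. after the daily quota)
print(resp.json())       # prints whatever JSON the endpoint returns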
Higher-rated alternatives
farach/huggingfaceR
Hugging Face state-of-the-art models in R
DengBoCong/nlp-paper
Papers in natural language processing (with reading notes), model reproductions, and data processing (code in both TensorFlow and PyTorch versions)
xiangking/ark-nlp
A private NLP coding package that quickly implements SOTA solutions.
IDEA-CCNL/GTS-Engine
GTS Engine: A powerful NLU Training...
adapter-hub/Hub
ARCHIVED. Please use https://docs.adapterhub.ml/huggingface_hub.html || 🔌 A central repository...