ayaka14732/bart-base-jax

JAX implementation of the BART-base model

Score: 26 / 100 (Experimental)

This project offers a foundational JAX implementation of the BART-base language model. It takes raw text data, such as Wikipedia articles, processes it for training, and produces a trained language model. It is aimed at machine learning researchers and practitioners who develop and experiment with Transformer-based language models.

No commits in the last 6 months.

Use this if you are a machine learning researcher who needs a flexible JAX codebase to study and build new Transformer-based language models, especially on Google Cloud TPUs.

Not ideal if you are looking for an out-of-the-box language model to use for tasks like summarization or translation without deep architectural modifications.

Tags: language-model-research, natural-language-processing, deep-learning-research, model-architecture-experimentation, large-language-models
Badges: No License, Stale (6m), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 11 / 25


Stars: 34
Forks: 4
Language: Python
License: None
Last pushed: Apr 11, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/ayaka14732/bart-base-jax"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
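
For scripted access, here is a minimal Python sketch of the same request. It assumes the endpoint returns a JSON body; the response fields are not documented here, so the snippet simply prints whatever comes back.

import requests

# Fetch the quality data for ayaka14732/bart-base-jax.
# Assumption: the endpoint responds with JSON, as the API above suggests.
url = "https://pt-edge.onrender.com/api/v1/quality/nlp/ayaka14732/bart-base-jax"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # fail loudly on rate limiting or server errors
print(resp.json())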