ztjhz/t5-jax
JAX implementation of the T5 model: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
This is an enhanced JAX implementation of T5, a Transformer model that casts text tasks such as summarization, translation, and question answering as text-to-text generation: it reads input text and produces relevant output text. It targets AI researchers and engineers building and experimenting with large language models, and aims to provide a clearer and more performant T5 codebase, especially when working with Google Cloud TPUs.
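To illustrate the text-to-text framing (independent of this repository's API): T5 prepends a natural-language task prefix to every input, so summarization, translation, and question answering all share one sequence-to-sequence interface. A minimal sketch, where the helper function is hypothetical but the prefixes are those used in the T5 paper:

```python
# T5 casts every task as text-to-text by prepending a task prefix.
# `to_t5_input` is an illustrative helper, not part of this repo's API.
def to_t5_input(task_prefix: str, text: str) -> str:
    """Build a T5-style input string for a given task."""
    return f"{task_prefix}: {text}"

# Summarization and translation use the same interface, differing
# only in the prefix:
print(to_t5_input("summarize", "JAX implementation of the T5 model ..."))
print(to_t5_input("translate English to German", "Hello, world."))
```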
No commits in the last 6 months.
Use this if you are an AI researcher or engineer working with Transformer-based language models and need a high-performance, clear, and educational T5 implementation, particularly for Google Cloud TPUs.
Not ideal if you are a practitioner looking for a ready-to-use application of T5 without diving into its underlying architecture and technical optimizations.
Stars: 24
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Jun 10, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/ztjhz/t5-jax"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
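The same endpoint can be queried from Python with only the standard library; a minimal sketch using the URL from the curl example above (the response schema is an assumption, so the result is returned as a raw dict):

```python
import json
import urllib.request

# Endpoint taken from the curl example above.
URL = "https://pt-edge.onrender.com/api/v1/quality/nlp/ztjhz/t5-jax"

def fetch_repo_quality(url: str = URL) -> dict:
    """Fetch the quality record for this repo as a dict.

    No API key is needed for up to 100 requests/day, per the note above.
    """
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```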
Higher-rated alternatives
farach/huggingfaceR
Hugging Face state-of-the-art models in R
DengBoCong/nlp-paper
Papers in the natural language processing field (with reading notes), model reproductions, data processing, etc. (code in both TensorFlow and PyTorch versions)
xiangking/ark-nlp
A private NLP coding package that quickly implements SOTA solutions.
IDEA-CCNL/GTS-Engine
GTS Engine: A powerful NLU Training...
adapter-hub/Hub
ARCHIVED. Please use https://docs.adapterhub.ml/huggingface_hub.html || 🔌 A central repository...