tensorops/TransformerX
Flexible Python library providing building blocks (layers) for reproducible Transformers research (TensorFlow ✅, PyTorch 🔜, and JAX 🔜)
A Python library that helps machine learning researchers and engineers build transformer-based models for tasks such as language translation and text summarization. It provides ready-to-use components (layers) that can be combined into complex neural network architectures: the input is typically raw text or other sequential data, and the output is a trained transformer model ready for deployment or further research. It is aimed at practitioners working on natural language processing and sequence-to-sequence problems.
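The layers such a library provides are built around scaled dot-product attention. As a minimal, library-agnostic sketch (plain NumPy, not the TransformerX API, whose names are not shown here), the core computation looks like this:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)   # (..., seq_q, seq_k)
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy example: a 4-token sequence with 8-dimensional embeddings
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((4, 8))
v = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8)
```

A transformer layer wraps this attention step with learned projections, residual connections, and a feed-forward block; libraries like this one package those pieces as composable layers.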
No commits in the last 6 months. Available on PyPI.
Use this if you are a machine learning researcher or engineer building custom transformer models in TensorFlow and want to accelerate your development with pre-built, flexible components.
Not ideal if you need a complete, end-to-end solution for training and data loading, as those modules are still under active development.
Stars: 53
Forks: 8
Language: Python
License: MIT
Last pushed: Jan 27, 2024
Commits (30d): 0
Dependencies: 5
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tensorops/TransformerX"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action