kyegomez/MLXTransformer
A simple implementation of a Transformer in MLX, Apple's new machine-learning framework.
This is a foundational building block for machine-learning developers working on Apple hardware. It lets you quickly build and experiment with Transformer models, the neural-network architecture behind most modern AI applications. You supply input representations (such as word embeddings or image patches), and the model transforms them into enriched representations, enabling tasks from language translation to image recognition. It is aimed at AI/ML engineers and researchers developing new models.
No commits in the last 6 months.
Use this if you are an AI/ML developer looking to implement high-performance Transformer models on Apple's MLX framework for tasks like natural language processing or computer vision.
Not ideal if you are an end-user without programming experience, or if you need a pre-built, ready-to-use application rather than a low-level building block.
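To make the description above concrete, here is a minimal NumPy sketch of what a single Transformer block does to the input representations: scaled dot-product self-attention followed by a feed-forward network, each with a residual connection. This is an illustrative sketch, not code from the repository; all weight names and shapes are assumptions, and layer normalization is omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def transformer_block(x, wq, wk, wv, wo, w1, w2):
    # One simplified Transformer block (layer norm omitted).
    # x has shape (seq_len, d_model); all w* are hypothetical weights.
    d_k = wq.shape[1]
    q, k, v = x @ wq, x @ wk, x @ wv
    attn = softmax(q @ k.T / np.sqrt(d_k)) @ v   # scaled dot-product attention
    x = x + attn @ wo                            # residual around attention
    x = x + np.maximum(0.0, x @ w1) @ w2         # ReLU feed-forward + residual
    return x

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(4, d))                      # 4 tokens, d_model = 8
params = [rng.normal(size=s) * 0.1 for s in
          [(d, d)] * 4 + [(d, 4 * d), (4 * d, d)]]
y = transformer_block(x, *params)
print(y.shape)  # output keeps the input shape: (4, 8)
```

The output has the same shape as the input, which is what lets blocks like this be stacked into a deep model.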
Stars: 19
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Nov 18, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MLXTransformer"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
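The same endpoint can be called from Python with the standard library. This is a sketch assuming the endpoint returns a JSON payload; the response's field names are not documented here, so `fetch_quality` decodes it generically.

```python
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the endpoint URL from the curl example above.
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch and decode the JSON payload (field names depend on the API).
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Network call kept out of import-time; prints the URL being queried.
    print(quality_url("kyegomez", "MLXTransformer"))
```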
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...