kyegomez/MLXTransformer

A simple implementation of a Transformer in MLX, Apple's new machine-learning framework

Score: 27 / 100 (Experimental)

This is a foundational component for machine learning developers working on Apple hardware. It allows you to quickly build and experiment with Transformer models, which are powerful neural network architectures used in various AI applications. You provide data representations (like word embeddings or image patches), and it processes them to produce enhanced representations, enabling tasks from language translation to image recognition. This is for AI/ML engineers and researchers developing new models.
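The "enhanced representations" described above come from self-attention, the core operation of a Transformer block: each position in the sequence is rewritten as a weighted mix of every position. The repository targets MLX, but the idea can be sketched framework-agnostically in NumPy (the function and variable names here are illustrative, not this repo's API; a real Transformer also learns query/key/value projection matrices):

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over a sequence of embeddings.

    x: (seq_len, d_model) array of input representations.
    Returns an array of the same shape in which each position is a
    softmax-weighted combination of all positions.
    """
    d = x.shape[-1]
    # For brevity the inputs serve directly as queries, keys, and values;
    # a trained model would project x through learned weight matrices.
    scores = x @ x.T / np.sqrt(d)                    # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ x                               # (seq_len, d_model)

tokens = np.random.default_rng(0).normal(size=(4, 8))  # 4 tokens, 8 dims
out = self_attention(tokens)
print(out.shape)  # (4, 8)
```

A single-token sequence attends only to itself, so the output equals the input; with more tokens, each row blends information from the whole sequence.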

No commits in the last 6 months.

Use this if you are an AI/ML developer looking to implement high-performance Transformer models on Apple's MLX framework for tasks like natural language processing or computer vision.

Not ideal if you are an end-user without programming experience, or if you need a pre-built, ready-to-use application rather than a low-level building block.

AI-development machine-learning-engineering neural-network-architecture model-building Apple-silicon-ML
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 5 / 25


Stars: 19
Forks: 1
Language: Python
License: MIT
Last pushed: Nov 18, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MLXTransformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
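The curl command above can also be issued from Python with only the standard library. This is a minimal sketch: the endpoint URL is taken verbatim from the listing, but the shape of the JSON response is an assumption, so the example only builds the URL and shows where a fetch would go:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    # Build the per-repo quality endpoint shown in the curl example.
    return f"{BASE}/{ecosystem}/{repo}"

url = quality_url("transformers", "kyegomez/MLXTransformer")
print(url)

# Uncomment to fetch (100 requests/day without a key; the response
# schema is not documented here, so we just print the raw JSON):
# with urllib.request.urlopen(url) as resp:
#     print(json.load(resp))
```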