GusLovesMath/Llama3_MacSilicon

Repository for running LLMs efficiently on Apple silicon (M1, M2, M3). Features a Jupyter notebook for setting up Meta-Llama-3 with the MLX framework, along with an installation guide and performance tips. Aims to help developers and researchers get optimal LLM performance on Mac hardware.

Quality score: 20 / 100 (Experimental)

This project helps you run the Meta-Llama-3 language model directly on an Apple Mac with an M-series chip. You input text prompts and it generates human-like responses, from simple answers to solutions for math problems. It is designed for researchers and developers who want to experiment with, or build applications on, large language models efficiently on their own Mac hardware.
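The notebook's prompt-in, text-out loop can be approximated with the mlx-lm package's load/generate API. The sketch below is a minimal, hedged version: the model identifier and generation parameters are illustrative, not taken from the repo, and since MLX only runs on Apple-silicon Macs the code checks the platform first.

```python
import platform


def is_apple_silicon() -> bool:
    """MLX requires macOS on an Apple M-series (arm64) machine."""
    return platform.system() == "Darwin" and platform.machine() == "arm64"


def generate_reply(prompt: str) -> str:
    """Sketch of the mlx-lm workflow; raises on unsupported hardware."""
    if not is_apple_silicon():
        raise RuntimeError("MLX models only run on Apple-silicon Macs (M1/M2/M3)")
    # Hypothetical model path -- the quantized community build named here
    # is an assumption, not something the repository specifies.
    from mlx_lm import load, generate  # pip install mlx-lm

    model, tokenizer = load("mlx-community/Meta-Llama-3-8B-Instruct-4bit")
    return generate(model, tokenizer, prompt=prompt, max_tokens=256)
```

On a supported Mac, `generate_reply("What is 17 * 23?")` would download the model on first use and return the generated completion; elsewhere it fails fast with a clear error instead of a cryptic import failure.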

No commits in the last 6 months.

Use this if you are a developer or researcher with a Mac M-series computer and want to run the Llama 3 large language model locally for experimentation or application development without needing cloud services.

Not ideal if you don't have a Mac with an M1, M2, or M3 chip, or if you need to deploy large-scale, high-throughput AI applications in a production environment.

Topics: large-language-models, on-device-AI, AI-research, ML-development, Mac-optimization
Flags: No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 11
Forks: 1
Language: Jupyter Notebook
License: none
Last pushed: May 04, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/GusLovesMath/Llama3_MacSilicon"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
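If you consume the endpoint programmatically, the response presumably carries the four category sub-scores shown above, each out of 25. The sketch below recomputes the overall score from such a payload; note that the JSON field names are assumptions, since the API's response schema is not documented here.

```python
import json

# Hypothetical response shape -- the real API's field names may differ.
sample = '{"maintenance": 0, "adoption": 5, "maturity": 8, "community": 7}'


def overall_score(payload: str) -> int:
    """Sum the four category sub-scores (each 0-25) into a 0-100 total."""
    scores = json.loads(payload)
    return sum(scores[k] for k in ("maintenance", "adoption", "maturity", "community"))


print(overall_score(sample))  # 20, matching the 20 / 100 shown above
```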