ArtificialZeng/transformers-Explained

An annotated walkthrough of the official Transformers source code. In the era of large AI models, PyTorch and Transformers are the new operating system; everything else is software running on top of them.

Score: 27 / 100 (Experimental)

This project offers detailed explanations of the Hugging Face Transformers library's source code, focusing on models such as LLaMA and Baichuan2. It helps AI/ML engineers and researchers understand the inner workings of large language models, with commentary on model architecture, training, and deployment. It takes the existing Transformers code as its input; the output is a deeper conceptual understanding for the practitioner.

No commits in the last 6 months.

Use this if you are an AI/ML engineer or researcher who wants to understand the underlying code of large language models powered by the Hugging Face Transformers library.

Not ideal if you are looking for a user-friendly tool to directly apply large language models without delving into their technical implementation.

Tags: Large Language Models, AI/ML Engineering, Deep Learning Research, Model Architecture, Natural Language Processing
Status: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 5 / 25


Stars: 16
Forks: 1
Language: Python
License: Apache-2.0
Last pushed: Sep 25, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ArtificialZeng/transformers-Explained"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
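The curl call above can also be reproduced programmatically. The sketch below is a minimal Python client under stated assumptions: the function names (`quality_url`, `fetch_quality`) are my own, and the JSON response schema is not documented here, so the code simply returns the parsed response for the caller to inspect rather than assuming any field names.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, repo: str) -> str:
    """Build the quality-report URL for a repository, e.g.
    ecosystem='transformers', repo='ArtificialZeng/transformers-Explained'."""
    return f"{BASE}/{ecosystem}/{repo}"


def fetch_quality(ecosystem: str, repo: str) -> dict:
    """Fetch and parse the quality report. The response schema is
    undocumented here, so we return the raw parsed JSON."""
    with urllib.request.urlopen(quality_url(ecosystem, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    report = fetch_quality("transformers", "ArtificialZeng/transformers-Explained")
    print(json.dumps(report, indent=2))
```

Note the free tier allows 100 requests/day without a key, so a client like this should cache results rather than re-fetching on every call.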