chengchingwen/Transformers.jl

Julia Implementation of Transformer models

Score: 48 / 100 (Emerging)

This is a Julia implementation of Transformer-based models for processing sequential data, especially text. It converts raw text into numerical representations that downstream natural language processing tasks can consume. It is aimed at machine learning engineers and researchers building AI models for language understanding.

568 stars. No commits in the last 6 months.

Use this if you are developing or experimenting with large language models in Julia and need to implement or fine-tune transformer architectures.
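As a rough sketch of that workflow, the snippet below loads a pretrained HuggingFace model and encodes text with Transformers.jl. This is not taken from this page; it assumes the package's `hgf` string macro and `encode` function, and the `bert-base-uncased` checkpoint name is illustrative.

```julia
# Sketch only: assumes Transformers.jl's HuggingFace integration
# (`hgf"..."` macro) and TextEncoders API.
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace

# Load the tokenizer and pretrained weights for an example checkpoint.
textenc = hgf"bert-base-uncased:tokenizer"
model = hgf"bert-base-uncased:model"

# Turn raw text into the numerical input the model expects,
# then run it to get token-level hidden states.
sample = encode(textenc, "Transformers.jl brings transformers to Julia.")
features = model(sample).hidden_state
```

From here, `features` can feed a task-specific head for fine-tuning, which is the kind of customization this package targets.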

Not ideal if you are looking for an out-of-the-box solution for text analysis without needing to build or customize machine learning models.

natural-language-processing machine-learning-engineering deep-learning-research computational-linguistics ai-model-development
Stale (6 months) · No Package · No Dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 20 / 25


Stars: 568
Forks: 81
Language: Julia
License: MIT
Last pushed: Aug 02, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/chengchingwen/Transformers.jl"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.