alexiglad/EBT

PyTorch Code for Energy-Based Transformers paper -- generalizable reasoning and scalable learning

Score: 55 / 100 (Established)

Energy-Based Transformers (EBTs) offer a new way to train AI models that can perform complex reasoning and scale efficiently across data types such as text and images. Instead of producing an answer in a single forward pass, an EBT learns an energy function that scores how well a candidate prediction fits the input, then refines the prediction iteratively to minimize that energy, which tends to produce more accurate and generalizable outputs than standard feed-forward transformers. This repository is aimed at AI researchers and machine learning engineers who need to develop highly scalable and robust models.
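
To make the energy-minimization idea concrete, below is a minimal PyTorch sketch of the general pattern. EnergyModel, its tiny MLP, and the fixed-step refinement loop are illustrative placeholders, not the classes or training code in this repository.

import torch
import torch.nn as nn

class EnergyModel(nn.Module):
    """Toy stand-in for an EBT: maps (input, candidate prediction) to a scalar energy."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x, y):
        # Lower energy = better fit between input x and candidate prediction y.
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def predict(model, x, steps=10, lr=0.1):
    # Start from a random candidate and refine it by gradient descent on the energy.
    y = torch.randn_like(x, requires_grad=True)
    for _ in range(steps):
        energy = model(x, y).sum()
        (grad,) = torch.autograd.grad(energy, y)
        y = (y - lr * grad).detach().requires_grad_(True)
    return y.detach()

model = EnergyModel(dim=8)
x = torch.randn(4, 8)      # batch of 4 inputs
y_hat = predict(model, x)  # refined predictions, shape (4, 8)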

Use this if you are developing large-scale AI models and need a robust, scalable architecture that handles diverse data types and generalizes better than standard transformers.

Not ideal if you want a simple, out-of-the-box solution for a specific problem and do not want to engage deeply with model architecture and training scripts.

AI-model-development scalable-machine-learning multimodal-AI deep-learning-research generalizable-AI
No Package · No Dependents
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 15 / 25
Community: 20 / 25

Stars: 613
Forks: 85
Language: Python
License: Apache-2.0
Last pushed: Mar 01, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/alexiglad/EBT"

Open to everyone: 100 requests/day with no key required. A free API key raises the limit to 1,000 requests/day.
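
If you want the same data programmatically, a minimal Python sketch follows. It assumes the endpoint returns JSON and that field names such as "score" and "stars" mirror the stats above; neither is documented here, so inspect the response to confirm.

import json
import urllib.request

# Same endpoint as the curl example above.
URL = "https://pt-edge.onrender.com/api/v1/quality/generative-ai/alexiglad/EBT"

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)  # assumes a JSON response body

# Field names are assumptions based on this page's stats, not a documented schema.
print(data.get("score"), data.get("stars"))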