alexiglad/EBT
PyTorch Code for Energy-Based Transformers paper -- generalizable reasoning and scalable learning
Energy-Based Transformers (EBTs) offer a new way to train AI models that can perform complex reasoning and scale efficiently across data types such as text and images. Rather than producing an answer in a single forward pass, an EBT assigns an energy score to each candidate prediction and iteratively refines candidates toward lower energy, yielding predictions that are more accurate and generalizable than those of standard feed-forward transformers. This is for AI researchers and machine learning engineers who need to develop highly scalable and robust AI models.
Use this if you are developing large-scale AI models and need a robust, scalable architecture that handles diverse data types and generalizes better than standard transformers.
Not ideal if you want a simple, out-of-the-box solution for a specific problem and don't want to engage deeply with the model architecture and training scripts.
Stars: 613
Forks: 85
Language: Python
License: Apache-2.0
Category: Generative AI
Last pushed: Mar 01, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/alexiglad/EBT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
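For scripted access, here is a minimal Python sketch using the requests library. It assumes the endpoint returns JSON; the response schema and the mechanism for passing an API key are not documented on this page.

import requests

# Fetch the quality data for alexiglad/EBT from the public endpoint
# (same URL as the curl example above).
url = "https://pt-edge.onrender.com/api/v1/quality/generative-ai/alexiglad/EBT"

response = requests.get(url, timeout=10)
response.raise_for_status()  # fail loudly on rate limiting or other HTTP errors

data = response.json()  # assumption: the endpoint returns a JSON body
print(data)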
Related tools
NVIDIA-NeMo/NeMo
A scalable generative AI framework built for researchers and developers working on Large...
vlm-run/vlmrun-hub
A hub for various industry-specific schemas to be used with VLMs.
HyperGAI/HPT
HPT - Open Multimodal LLMs from HyperGAI
yash9439/Falcon-Local-AI-Model
Explore this GitHub repository housing 3 versions of Falcon code for text generation. Each...
bastien-muraccioli/svlr
SVLR: Scalable, Training-Free Visual Language Robotics: a modular multi-model framework for...