x-transformers and simple-hierarchical-transformer

x-transformers is a general-purpose transformer library; simple-hierarchical-transformer builds on it as an experimental architecture variant, making the two complements rather than competitors.

                    x-transformers    simple-hierarchical-transformer
Overall score       79 (Verified)     —
Maintenance         20/25             13/25
Adoption            15/25             10/25
Maturity            25/25             25/25
Community           19/25             11/25
Stars               5,808             225
Forks               507               13
Downloads           —                 —
Commits (30d)       8                 0
Language            Python            Python
License             MIT               MIT
Risk flags          None              None

About x-transformers

lucidrains/x-transformers

A concise but complete full-attention transformer with a set of promising experimental features from various papers

This project provides pre-built, flexible transformer models for various AI tasks. You can input text, images, or a combination to generate new text, classify images, or create image captions. It's designed for AI researchers and practitioners who want to experiment with advanced transformer architectures without building them from scratch.

natural-language-processing computer-vision multimodal-ai generative-ai machine-learning-research

About simple-hierarchical-transformer

lucidrains/simple-hierarchical-transformer

Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT

This project offers an experimental approach to training large language models (LLMs) more efficiently by introducing multiple levels of data compression. It takes text or token sequences as input and produces logits for next-token prediction, aiming to maintain predictive quality while reducing computational cost. This is for machine learning researchers or practitioners who build and experiment with new LLM architectures.

large-language-models transformer-architecture neural-network-research predictive-modeling computational-efficiency
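The core idea of hierarchical compression can be illustrated with a toy model. This is not the repository's API, just a minimal conceptual sketch in plain PyTorch: a coarse level pools every few tokens into one summary token, processes them, and feeds the upsampled summaries back into the fine-grained next-token head (causal masking is omitted for brevity; `HierarchicalLM` and all hyperparameters here are hypothetical).

```python
import torch
import torch.nn as nn

class HierarchicalLM(nn.Module):
    """Toy two-level sketch: a coarse level summarizes every `stride`
    tokens and its upsampled output is merged into the fine level."""
    def __init__(self, num_tokens=256, dim=64, stride=4):
        super().__init__()
        self.stride = stride
        self.embed = nn.Embedding(num_tokens, dim)
        make_layer = lambda: nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.fine = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.coarse = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.to_logits = nn.Linear(dim, num_tokens)

    def forward(self, tokens):
        x = self.embed(tokens)            # (batch, seq, dim)
        fine = self.fine(x)
        # compress: average-pool every `stride` tokens into one coarse token
        b, n, d = fine.shape
        coarse_in = fine.view(b, n // self.stride, self.stride, d).mean(dim=2)
        coarse = self.coarse(coarse_in)
        # upsample coarse context back to full resolution and merge
        up = coarse.repeat_interleave(self.stride, dim=1)
        return self.to_logits(fine + up)  # next-token logits

model = HierarchicalLM()
tokens = torch.randint(0, 256, (2, 16))
logits = model(tokens)                    # (2, 16, 256)
```

The efficiency argument is that the coarse level attends over a sequence shortened by the pooling stride, so its attention cost shrinks quadratically relative to the fine level.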

Scores updated daily from GitHub, PyPI, and npm data.