lovelovetrb/xlstm-lm
This repository implements code for training a language model with xLSTM, based on the paper "xLSTM: Extended Long Short-Term Memory".
No commits in the last 6 months.
Stars: 2
Forks: —
Language: Python
License: —
Category:
Last pushed: Feb 04, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/lovelovetrb/xlstm-lm"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
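The curl command above suggests the endpoint follows a `/quality/<category>/<owner>/<repo>` path pattern. As a minimal sketch, the same request can be made from Python; note that the helper names below are hypothetical, the path pattern is inferred from the single example shown, and the response schema is not documented here, so the code fetches raw JSON without assuming any particular fields.

```python
import json
import urllib.request

# Base URL taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL (inferred path pattern)."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (requires network access)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


print(quality_url("transformers", "lovelovetrb", "xlstm-lm"))
```

Within the free tier, a script like this could poll a handful of repos once a day without needing an API key.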
Higher-rated alternatives
huggingface/transformers-bloom-inference
Fast Inference Solutions for BLOOM
Tencent/TurboTransformers
a fast and user-friendly runtime for transformer inference (Bert, Albert, GPT2, Decoders, etc)...
mit-han-lab/lite-transformer
[ICLR 2020] Lite Transformer with Long-Short Range Attention
mit-han-lab/hardware-aware-transformers
[ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
LibreTranslate/Locomotive
Toolkit for training/converting LibreTranslate compatible language models 🚂