DAMO-NLP-SG/CLEX
[ICLR 2024] CLEX: Continuous Length Extrapolation for Large Language Models
This project offers enhanced large language models that can handle much longer texts without losing accuracy. It takes existing models like LLaMA-2 or Mixtral and modifies them to process inputs up to 8 times longer than their original training context length. This is useful for anyone working with very long documents, conversations, or codebases who needs a language model to maintain context across extensive content.
No commits in the last 6 months.
Use this if you need a language model to analyze, summarize, or generate text from extremely long documents, like entire books, lengthy research papers, or extensive legal contracts, without encountering context window limitations.
Not ideal if your primary use case involves short, conversational queries, or if you need an off-the-shelf model that has not been re-trained or fine-tuned for extended context.
Stars
78
Forks
11
Language
Python
License
MIT
Category
Last pushed
Mar 12, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/DAMO-NLP-SG/CLEX"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
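The curl call above can also be scripted. Here is a minimal Python sketch using only the standard library; the endpoint URL is the one shown above, but the response schema is not documented on this page, so the sketch simply pretty-prints whatever JSON comes back:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a given GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality record (requires network access)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Response fields are not specified here, so just print the raw JSON.
    print(json.dumps(fetch_quality("DAMO-NLP-SG", "CLEX"), indent=2))
```

Within the free tier this works without a key; how an API key is attached for the higher limit is not shown on this page, so check the service's docs for that.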
Higher-rated alternatives
ZHZisZZ/dllm
dLLM: Simple Diffusion Language Modeling
pengzhangzhi/Open-dLLM
Open diffusion language model for code generation — releasing pretraining, evaluation,...
EnnengYang/Awesome-Model-Merging-Methods-Theories-Applications
Model Merging in LLMs, MLLMs, and Beyond: Methods, Theories, Applications and Opportunities. ACM...
THUDM/LongWriter
[ICLR 2025] LongWriter: Unleashing 10,000+ Word Generation from Long Context LLMs
AIoT-MLSys-Lab/SVD-LLM
[ICLR 2025🔥] SVD-LLM & [NAACL 2025🔥] SVD-LLM V2