OpenNLPLab/TransnormerLLM
Official implementation of TransNormerLLM: A Faster and Better LLM
This project offers a new type of Large Language Model (LLM) that generates text more quickly and efficiently while maintaining high accuracy. It accepts prompts and queries and returns generated text, summaries, or answers. This is for anyone building applications that rely on text generation, summarization, or advanced natural language understanding.
252 stars. No commits in the last 6 months.
Use this if you need a powerful language model for text generation or understanding that prioritizes both speed and performance, especially across multiple languages.
Not ideal if you prefer using established Transformer-based models exclusively or if your application requires extremely small model sizes.
Stars
252
Forks
11
Language
Python
License
Apache-2.0
Category
Last pushed
Jan 23, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/OpenNLPLab/TransnormerLLM"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rickiepark/llm-from-scratch
Code repository for *Build a Large Language Model from Scratch* (Gilbut, 2025)