tiingweii-shii/Awesome-Resource-Efficient-LLM-Papers

A curated list of high-quality papers on resource-efficient LLMs 🌱

Quality score: 36 / 100 (Emerging)

This is a curated list of research papers on making Large Language Models (LLMs) more efficient, particularly with respect to their computational resource demands. It helps AI researchers and machine learning engineers stay current on techniques that reduce the memory, processing, and energy costs of training and deploying LLMs. The resource provides direct links to papers covering optimization strategies ranging from architectural design to fine-tuning and inference.

158 stars. No commits in the last 6 months.

Use this if you are a researcher or engineer looking for academic papers on optimizing LLM performance and resource consumption.

Not ideal if you are looking for ready-to-use software libraries, code examples, or practical tutorials for LLM deployment.

Topics: AI research, machine learning engineering, LLM optimization, resource efficiency, deep learning
Badges: Stale (6 months), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 158
Forks: 10
Language: (none listed)
License: CC0-1.0
Last pushed: Mar 15, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/tiingweii-shii/Awesome-Resource-Efficient-LLM-Papers"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
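For scripted access, the curl command above can be wrapped in a small Python helper. This is a minimal sketch using only the standard library; it assumes the endpoint returns a JSON payload (the response format is not documented here), and the `fetch_quality` function name is illustrative, not part of the API.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def build_url(owner: str, repo: str) -> str:
    # Build the quality-endpoint URL for a given GitHub owner/repo pair.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch the endpoint and decode the body; assumes a JSON response.
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)

# Usage (performs a network request):
# data = fetch_quality("tiingweii-shii", "Awesome-Resource-Efficient-LLM-Papers")
# print(json.dumps(data, indent=2))
```

Keeping URL construction separate from the network call makes the helper easy to test offline and to reuse for other repositories.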