Chongjie-Si/Subspace-Tuning

A generalized framework for subspace tuning methods in parameter-efficient fine-tuning.

Quality score: 46 / 100 (Emerging)

This framework helps machine learning researchers and practitioners efficiently adapt large, pre-trained AI models for specific tasks without needing to retrain the entire model. It takes an existing large language or image generation model and, with minimal modifications, produces a specialized model capable of tasks like natural language understanding, question answering, or subject-driven image generation. It's for those working with large models who need to fine-tune them for diverse applications while saving computational resources.


Use this if you are a machine learning researcher or engineer looking to fine-tune large pre-trained models for specific tasks like NLU, NLG, or image generation, and want to do so efficiently without modifying all the model's parameters.

Not ideal if you are looking for a complete, out-of-the-box solution for end-users, or if you need to train models from scratch rather than adapt existing large ones.

large-language-models model-adaptation deep-learning-research natural-language-processing generative-ai
No package · No dependents
Maintenance 10 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 10 / 25


Stars: 177
Forks: 10
Language: Python
License: Apache-2.0
Last pushed: Jan 29, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Chongjie-Si/Subspace-Tuning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
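For scripted access, the same endpoint can be called from Python using only the standard library. This is a minimal sketch: the URL pattern is taken from the curl example above, but the shape of the JSON response (and the `quality_url` helper name) are assumptions for illustration, not documented API behavior.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(registry: str, owner: str, repo: str) -> str:
    # Build the quality-report URL for a repo, mirroring the curl example.
    return f"{BASE}/{registry}/{owner}/{repo}"

def fetch_quality(registry: str, owner: str, repo: str) -> dict:
    # Fetch and decode the JSON report (free tier: 100 requests/day, no key).
    with urllib.request.urlopen(quality_url(registry, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("transformers", "Chongjie-Si", "Subspace-Tuning")
```

Calling `fetch_quality("transformers", "Chongjie-Si", "Subspace-Tuning")` would then return the parsed report as a dictionary; add error handling around `urlopen` if you exceed the daily request limit.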