Adamdad/KnowledgeFactor
[ECCV2022] Factorizing Knowledge in Neural Networks
KnowledgeFactor helps machine learning engineers and researchers break a large, pre-trained neural network down into smaller, specialized 'factor networks'. You provide an existing model, and it outputs these smaller networks, each retaining specific knowledge from the original while focusing on a dedicated sub-task. It is useful for model optimization, model specialization, or studying how knowledge is stored inside complex AI systems.
No commits in the last 6 months.
Use this if you need to decompose a complex, pre-trained neural network into modular components, each handling a specific sub-task.
Not ideal if you are looking for a tool to train neural networks from scratch or for general model development unrelated to knowledge factorization.
Stars
91
Forks
5
Language
Python
License
Apache-2.0
Category
ml-frameworks
Last pushed
Sep 12, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Adamdad/KnowledgeFactor"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SforAiDl/KD_Lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)