Adamdad/KnowledgeFactor

[ECCV2022] Factorizing Knowledge in Neural Networks

Score: 33 / 100 (Emerging)

KnowledgeFactor helps machine learning engineers and researchers break down a large, pre-trained neural network into smaller, specialized 'factor networks'. You provide an existing model, and it outputs these smaller networks, each retaining specific knowledge from the original and focused on a dedicated sub-task. This is useful for model optimization, model specialization, or understanding how knowledge is stored within complex AI systems.
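As a rough intuition for what "factor networks" means, here is a toy sketch. This is an illustrative assumption, not the repository's actual algorithm (the paper's method involves training, not simple slicing): a "teacher" linear model maps 4 inputs to 6 outputs, outputs 0-2 belong to sub-task A and outputs 3-5 to sub-task B, and each factor network keeps only the teacher parameters relevant to its sub-task.

```python
# Toy teacher: a 6x4 weight matrix (6 outputs, 4 inputs), plain Python lists.
teacher_weights = [[(r * 4 + c) * 0.1 for c in range(4)] for r in range(6)]

def factorize(weights, task_rows):
    """Extract a smaller 'factor network' that handles only `task_rows` outputs."""
    return [weights[r] for r in task_rows]

# Hypothetical sub-task split: outputs 0-2 -> task A, outputs 3-5 -> task B.
factor_a = factorize(teacher_weights, range(0, 3))
factor_b = factorize(teacher_weights, range(3, 6))

# Each factor network is smaller than the teacher but retains the teacher's
# knowledge for its dedicated sub-task.
print(len(factor_a), len(factor_b))  # → 3 3
```

The real method factorizes learned representations rather than slicing weight rows, but the input/output contract is the same: one large model in, several smaller task-focused networks out.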

No commits in the last 6 months.

Use this if you need to decompose a complex, pre-trained neural network into modular components, each handling a specific sub-task.

Not ideal if you are looking for a tool to train neural networks from scratch or for general model development unrelated to knowledge factorization.

deep-learning model-optimization neural-network-analysis knowledge-distillation AI-research
Flags: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 91
Forks: 5
Language: Python
License: Apache-2.0
Last pushed: Sep 12, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Adamdad/KnowledgeFactor"

Open to everyone: 100 requests/day with no API key. Get a free key for 1,000 requests/day.
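The same endpoint can be queried from Python using only the standard library. The URL below is built to match the curl example above; the response's JSON schema is not documented here, so the actual fetch is left as a comment rather than parsed into assumed fields.

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repo, escaping each path segment."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "Adamdad", "KnowledgeFactor")
print(url)

# To actually fetch the data (requires network access):
# import json, urllib.request
# data = json.loads(urllib.request.urlopen(url).read())
```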