google-deepmind/dks

Multi-framework implementation of Deep Kernel Shaping and Tailored Activation Transformations, which are methods that modify neural network models (and their initializations) to make them easier to train.

Quality score: 35 / 100 (Emerging)

This package helps machine learning engineers and researchers make their deep neural networks easier and faster to train, especially networks without skip connections or normalization layers. It provides specialized activation function transformations, weight initializations, and optional data preprocessing. The result is a modified network model that can match the training performance of more complex architectures such as ResNets.

No commits in the last 6 months.

Use this if you are a machine learning practitioner struggling to train deep, 'vanilla' convolutional networks efficiently and want to achieve faster convergence without adding architectural complexities like skip connections.

Not ideal if you are primarily using ReLU activation functions, as they are only partially supported, or if you are looking for a fully automated, black-box solution for network training optimization.
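The core idea behind these methods is to replace each activation function with a shifted and rescaled version whose output statistics are controlled under a Gaussian input. The dks library computes such transformations through its own solvers (with per-framework sub-packages), so the snippet below is only a minimal stdlib-only sketch of the underlying normalization idea, not the library's actual API: the helper names `gauss_expect` and `normalize_activation` are illustrative inventions.

```python
import math

def gauss_expect(f, lo=-8.0, hi=8.0, n=4001):
    # E[f(Z)] for Z ~ N(0, 1), via trapezoidal integration on [lo, hi].
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * f(x) * math.exp(-0.5 * x * x)
    return total * h / math.sqrt(2.0 * math.pi)

def normalize_activation(phi):
    # Return a shifted/scaled version of `phi` with zero mean and unit
    # second moment under a standard Gaussian input -- the kind of
    # activation normalization that DKS-style methods build on.
    mu = gauss_expect(phi)
    var = gauss_expect(lambda x: (phi(x) - mu) ** 2)
    scale = 1.0 / math.sqrt(var)
    return lambda x: scale * (phi(x) - mu)

# Example: a normalized softplus (one of the activations DKS supports well).
softplus_hat = normalize_activation(lambda x: math.log1p(math.exp(x)))
```

The real library additionally tunes input-scale parameters and accounts for the network's structure; see the repository README for the supported methods and per-framework entry points.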

deep-learning neural-network-training model-optimization computer-vision ml-research
Stale (6m) · No package · No dependents
Maintenance 2 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 76
Forks: 5
Language: Python
License: Apache-2.0
Last pushed: Jul 01, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/google-deepmind/dks"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.