hazdzz/converter

The official PyTorch implementation of Converter.

Quality score: 22 / 100 (Experimental)

This tool is aimed at machine learning researchers and practitioners working with neural networks, particularly those exploring alternative architectural forms. It takes a pre-trained Transformer model and converts it into a Dynamic Graph Neural Network (DGNN) form: a restructured model that preserves the original functionality while running on a different underlying architecture.
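The conversion rests on a well-known correspondence: a Transformer's self-attention layer can be read as message passing on a fully connected graph whose edge weights (the attention matrix) are recomputed dynamically per input. The sketch below illustrates that view with NumPy; it is a conceptual toy, not the repository's actual API, and the function name `attention_as_graph` and all weight matrices are invented for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_as_graph(X, Wq, Wk, Wv):
    """Compute single-head self-attention and expose its weights as a
    dynamic, row-stochastic adjacency matrix over the tokens (nodes)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))  # dynamic "edge weights", rows sum to 1
    out = A @ V                        # message passing: aggregate neighbor values
    return A, out

rng = np.random.default_rng(0)
n_tokens, d_model = 4, 8
X = rng.normal(size=(n_tokens, d_model))          # toy token embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
A, out = attention_as_graph(X, Wq, Wk, Wv)
```

Here `A` plays the role of a per-input graph adjacency matrix, which is the structural hook a Transformer-to-DGNN conversion can exploit.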

No commits in the last 6 months.

Use this if you want to convert existing Transformer models into a DGNN structure, whether for experimentation or to leverage the benefits of graph-based architectures.

Not ideal if you are looking for a general-purpose model conversion tool or if you are not working with Transformer or DGNN architectures.

Tags: Machine-Learning-Research, Neural-Network-Architecture, Transformer-Models, Graph-Neural-Networks, Deep-Learning-Experimentation
Flags: Stale (6 months), No Package, No Dependents
Maintenance: 2 / 25
Adoption: 4 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 7
Forks:
Language: Python
License: MIT
Last pushed: Jun 04, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/hazdzz/converter"

Open to everyone at 100 requests/day with no key needed; a free key raises the limit to 1,000/day.