jha-lab/txf_design-space

[JAIR'23] FlexiBERT, a tool for Transformer design-space exploration.

Quality score: 29 / 100 (Experimental)

FlexiBERT helps machine learning engineers and researchers explore and evaluate a wider range of Transformer neural network designs for natural language processing tasks. You provide a description of the architectural search space, and the tool helps generate, train, and assess custom Transformer models, giving you optimized models that perform well on your specific NLP datasets. This is ideal for those who need to move beyond standard, pre-defined Transformer models and tailor architectures to their unique project requirements.

No commits in the last 6 months.

Use this if you need to design and optimize custom Transformer architectures for specific natural language processing tasks, rather than relying on existing, homogeneous models.

Not ideal if you simply need to use a pre-trained Transformer model for a common NLP task without custom architectural exploration.

Topics: Natural Language Processing, Machine Learning Engineering, Neural Architecture Search, Model Optimization, Deep Learning Research
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 9
Forks: 1
Language: Python
License: BSD-3-Clause
Last pushed: Apr 03, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/jha-lab/txf_design-space"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
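For scripted access, the same endpoint can be called from Python using only the standard library. This is a minimal sketch: the URL construction mirrors the curl command above, but the JSON response schema is not documented here, so the decoded payload is returned as-is.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    """Build the quality-endpoint URL for an owner/name repo slug."""
    return f"{BASE}/{ecosystem}/{repo}"

def fetch_quality(ecosystem: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (field names are not documented here)."""
    with urllib.request.urlopen(quality_url(ecosystem, repo)) as resp:
        return json.load(resp)

print(quality_url("transformers", "jha-lab/txf_design-space"))
```

Calling `fetch_quality("transformers", "jha-lab/txf_design-space")` performs the same request as the curl example, subject to the same 100 requests/day unauthenticated limit.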