jha-lab/txf_design-space
[JAIR'23] FlexiBERT tool for Transformer design space exploration.
FlexiBERT helps machine learning engineers and researchers explore and evaluate a wide range of Transformer architectures for natural language processing tasks. You describe the architectural search space, and the tool generates, trains, and evaluates candidate Transformer models, returning optimized architectures for your specific NLP datasets. It is aimed at those who need to move beyond standard, pre-defined Transformer models and tailor architectures to their project requirements.
No commits in the last 6 months.
Use this if you need to design and optimize custom Transformer architectures for specific natural language processing tasks, rather than relying on existing, homogeneous models.
Not ideal if you simply need to use a pre-trained Transformer model for a common NLP task without custom architectural exploration.
Stars
9
Forks
1
Language
Python
License
BSD-3-Clause
Category
Last pushed
Apr 03, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/jha-lab/txf_design-space"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
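The same endpoint can be called from code. Below is a minimal Python sketch, assuming only the URL pattern shown in the curl command above; the response schema is not documented here, so the raw JSON is printed as-is. The function names are illustrative, not part of any official client.

```python
# Hypothetical client sketch for the quality API shown above.
# Only the URL pattern is taken from this page; the JSON shape is
# not documented here, so we just print the raw response.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality record (counts against the anonymous 100 requests/day quota)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("transformers", "jha-lab", "txf_design-space")
    print(json.dumps(data, indent=2))
```

Running the script makes a live request, so it requires network access and is subject to the daily quota.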
Higher-rated alternatives
transformerlab/transformerlab-app
The open source research environment for AI researchers to seamlessly train, evaluate, and scale...
naru-project/naru
Neural Relation Understanding: neural cardinality estimators for tabular data
neurocard/neurocard
State-of-the-art neural cardinality estimators for join queries
danielzuegner/code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from...
salesforce/CodeTF
CodeTF: One-stop Transformer Library for State-of-the-art Code LLM