DarshanDeshpande/jax-models
Unofficial JAX implementations of deep learning research papers
This collection provides ready-to-use deep learning models, layers, and utilities implemented in JAX/Flax, based on recent research papers. Given a model name (such as 'swin-tiny-224'), it returns a configured Flax model ready for training or inference. Researchers and machine learning engineers working with JAX/Flax can use it to quickly experiment with cutting-edge architectures.
161 stars. No commits in the last 6 months. Available on PyPI.
Use this if you are a machine learning researcher or engineer building deep learning models with JAX/Flax and want access to pre-implemented architectures from recent academic papers.
Not ideal if you are not working with the JAX/Flax deep learning framework or are looking for models outside of the computer vision domain.
Stars: 161
Forks: 10
Language: Python
License: Apache-2.0
Category:
Last pushed: Jun 25, 2022
Commits (30d): 0
Dependencies: 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/DarshanDeshpande/jax-models"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks