TayeeChang/keras_transformers

An implementation of the Transformer family: BERT, ALBERT, RoBERTa, NEZHA, etc.

Score: 35 / 100 (Emerging)

This project offers Keras implementations of various Transformer models like BERT, RoBERTa, and T5, enabling machine learning engineers to apply advanced natural language processing. It takes official pre-trained model weights as input and provides ready-to-use models for downstream NLP tasks such as text classification or generation. This tool is for machine learning engineers and researchers working on building and deploying NLP applications.

No commits in the last 6 months.

Use this if you are a machine learning engineer who needs to quickly prototype or deploy state-of-the-art Transformer models within a Keras and TensorFlow 1.x or 2.x environment.

Not a good fit if you are uncomfortable working directly with deep learning frameworks, or if you need a higher-level solution that does not involve handling model code yourself.

natural-language-processing deep-learning text-analysis language-modeling machine-learning-engineering
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 15 / 25

How are scores calculated?
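The four category scores shown above sum to the overall score (Maintenance 0 + Adoption 4 + Maturity 16 + Community 15 = 35), which suggests simple additive weighting; a minimal sketch of that check (the additive model is an assumption inferred from the displayed numbers, not documented behavior):

```python
# Category scores as displayed on this page; the assumption being tested
# is that the overall score is just their sum.
categories = {
    "Maintenance": 0,
    "Adoption": 4,
    "Maturity": 16,
    "Community": 15,
}

overall = sum(categories.values())
print(overall)  # 35, matching the displayed 35 / 100
```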

Stars: 7
Forks: 4
Language: Python
License: Apache-2.0
Last pushed: Jan 18, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/TayeeChang/keras_transformers"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
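A short Python sketch for consuming the endpoint above from a script instead of curl; the URL is taken from this page, but the response field names in the commented example are assumptions, not the documented schema:

```python
import json
import urllib.request

# Endpoint copied from the curl command above.
API_URL = ("https://pt-edge.onrender.com/api/v1/quality/"
           "transformers/TayeeChang/keras_transformers")

def fetch_quality(url: str = API_URL) -> dict:
    """Fetch the quality report and decode it as JSON."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Example usage (field names like "score" are hypothetical):
# report = fetch_quality()
# print(report.get("score"))
```

No API key is needed at the free tier, so the request above is a plain unauthenticated GET.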