TayeeChang/keras_transformers
An implementation of the Transformer family, including BERT, ALBERT, RoBERTa, NeZha, etc.
This project offers Keras implementations of Transformer models such as BERT, ALBERT, and RoBERTa, enabling machine learning engineers to apply advanced natural language processing. It loads official pre-trained weights and provides ready-to-use models for downstream NLP tasks such as text classification and generation. It is aimed at machine learning engineers and researchers building and deploying NLP applications.
No commits in the last 6 months.
Use this if you are a machine learning engineer who needs to quickly prototype or deploy state-of-the-art Transformer models in a Keras environment on TensorFlow 1.x or 2.x.
Not ideal if you are unfamiliar with deep learning frameworks or need a higher-level solution that avoids working with model code directly.
Stars
7
Forks
4
Language
Python
License
Apache-2.0
Category
Last pushed
Jan 18, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/TayeeChang/keras_transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
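The same endpoint can also be queried from Python using only the standard library. A minimal sketch, assuming the endpoint returns JSON; the `X-API-Key` header name is an assumption, as this page does not document how a key is passed:

```python
import json
from urllib.request import Request, urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def fetch_quality(owner, repo, api_key=None):
    """GET the quality record for owner/repo.

    An optional key raises the daily limit from 100 to 1,000 requests;
    the X-API-Key header name is an assumption, not documented here.
    """
    req = Request(
        f"{BASE}/{owner}/{repo}",
        headers={"X-API-Key": api_key} if api_key else {},
    )
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)

# URL for this repository (no request is made at import time):
url = f"{BASE}/TayeeChang/keras_transformers"
```

Calling `fetch_quality("TayeeChang", "keras_transformers")` performs the same request as the curl command above.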
Higher-rated alternatives
Tongjilibo/bert4torch
An elegant PyTorch implementation of transformers
nyu-mll/jiant
jiant is an NLP toolkit
lonePatient/TorchBlocks
A PyTorch-based toolkit for natural language processing
monologg/JointBERT
PyTorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
grammarly/gector
Official implementation of the paper "GECToR – Grammatical Error Correction: Tag, Not Rewrite"...