datawhalechina/learn-nlp-with-transformers
We want to create a repo to illustrate the usage of transformers in Chinese.
This project helps developers and researchers understand and apply advanced Natural Language Processing (NLP) models, specifically transformers, in Chinese contexts. It provides clear explanations of transformer principles and practical projects for tasks like text classification, question answering, and machine translation. This resource is for NLP beginners or those new to transformers who have a basic understanding of Python and PyTorch.
3,143 stars. No commits in the last 6 months.
Use this if you are a developer or researcher looking to learn the theory and practical application of transformer models for NLP, especially with Chinese text.
Not ideal if you are looking for a plug-and-play tool for immediate NLP solutions without needing to understand the underlying models or code.
Stars: 3,143
Forks: 499
Language: Shell
License: —
Category: —
Last pushed: Aug 18, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/datawhalechina/learn-nlp-with-transformers"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
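The same endpoint can be called from code. A minimal Python sketch is below; only the URL itself comes from this page, while the helper names (`quality_url`, `fetch_quality`) are hypothetical and the response schema is undocumented here, so the sketch just decodes whatever JSON comes back.

```python
import json
import urllib.request

# Base path taken from the documented endpoint on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL (hypothetical helper)."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body.

    The response fields are not documented on this page, so callers
    should inspect the returned dict rather than assume a schema.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)
```

Usage would look like `fetch_quality("nlp", "datawhalechina", "learn-nlp-with-transformers")`, which hits the same URL as the curl command above.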
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementation (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...