sudharsan13296/Getting-Started-with-Google-BERT
Build and train state-of-the-art natural language processing models using BERT
This project offers a comprehensive guide for developers looking to build and fine-tune advanced natural language processing models. It explains how to use BERT and its variants for tasks like sentiment analysis, text summarization, and question answering. The typical user is a machine learning engineer or data scientist working with text data.
226 stars. No commits in the last 6 months.
Use this if you need to implement state-of-the-art NLP models using BERT and its various architectures.
Not ideal if you are a business user looking for a ready-to-use application rather than a development guide.
Stars
226
Forks
84
Language
Jupyter Notebook
License
—
Category
NLP
Last pushed
May 20, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/sudharsan13296/Getting-Started-with-Google-BERT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
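The same data can also be fetched programmatically. Below is a minimal Python sketch using only the standard library; it hits the open (no-key) tier of the endpoint shown above. The response schema is not documented here, so the code simply parses and prints whatever JSON comes back rather than assuming any field names:

```python
import json
from urllib.request import urlopen

# Endpoint from the curl example above (open tier: 100 requests/day).
API_URL = (
    "https://pt-edge.onrender.com/api/v1/quality/nlp/"
    "sudharsan13296/Getting-Started-with-Google-BERT"
)


def fetch_repo_quality(url: str = API_URL) -> dict:
    """Fetch the quality report for this repository and parse it as JSON."""
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Pretty-print the full response, since the exact fields are not
    # documented on this page.
    print(json.dumps(fetch_repo_quality(), indent=2))
```

With a free API key (1,000 requests/day), you would authenticate however the service specifies; the mechanism isn't shown here, so it is omitted rather than guessed.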
Higher-rated alternatives
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
JayYip/m3tl
BERT for Multitask Learning
920232796/bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; also supports automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging, with support for the T5 model and GPT-2 text continuation.
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
graykode/toeicbert
TOEIC(Test of English for International Communication) solving using pytorch-pretrained-BERT model.