soskek/bert-chainer
Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
This project lets developers working with Chainer use Google's pre-trained BERT models for natural language tasks. It converts Google's BERT checkpoints (originally in TensorFlow format) into a Chainer-compatible format, after which the models can be used for tasks such as sentence classification, question answering, or extracting semantic features from text.
224 stars. No commits in the last 6 months.
Use this if you are a machine learning developer familiar with Chainer and want to integrate powerful, pre-trained BERT models into your natural language processing applications without rebuilding them from scratch.
Not ideal if you are looking to pre-train new BERT models on custom datasets or require multilingual BERT support, as these features are not implemented.
Stars: 224
Forks: 40
Language: Python
License: —
Category:
Last pushed: Nov 09, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/soskek/bert-chainer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
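The endpoint follows a simple `category/owner/repo` path pattern. A minimal Python sketch of building that request URL is below; the `quality_url` helper is hypothetical, and the pattern is inferred from the single example URL shown above:

```python
from urllib.parse import quote

# Base of the pt-edge quality API, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repo (hypothetical helper;
    the path layout is inferred from the one documented example)."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

print(quality_url("nlp", "soskek", "bert-chainer"))
# → https://pt-edge.onrender.com/api/v1/quality/nlp/soskek/bert-chainer
```

Pass the resulting URL to `curl` or any HTTP client; no API key is required for up to 100 requests per day.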
Higher-rated alternatives
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
JayYip/m3tl
BERT for Multitask Learning
920232796/bert_seq2seq
PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also handles automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging; supports the T5 model and article continuation with GPT-2.
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
graykode/toeicbert
TOEIC(Test of English for International Communication) solving using pytorch-pretrained-BERT model.