soskek/bert-chainer

Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"

Quality score: 38 / 100 (Emerging)

This project helps developers working with Chainer use Google's pre-trained BERT models for natural language tasks. It converts the original TensorFlow-format BERT checkpoints into a Chainer-compatible format, after which the models can be used for sentence classification, question answering, or extracting semantic features from text.
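A minimal sketch of the intended workflow after conversion, assuming the TensorFlow checkpoint has already been turned into an .npz archive by the repository's conversion script; the module and class names (modeling, BertConfig, BertModel) mirror Google's reference BERT code and are assumptions here, not a verified API of this repository:

# Hypothetical sketch: load a converted BERT checkpoint in Chainer.
# Assumes the TensorFlow checkpoint was already converted to an .npz
# archive; module/class names below are assumed, not verified.
import chainer
from modeling import BertConfig, BertModel

config = BertConfig.from_json_file("uncased_L-12_H-768_A-12/bert_config.json")
bert = BertModel(config)
# chainer.serializers.load_npz restores parameters by name from the archive
chainer.serializers.load_npz(
    "uncased_L-12_H-768_A-12/arrays_bert_model.ckpt.npz", bert)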

224 stars. No commits in the last 6 months.

Use this if you are a machine learning developer familiar with Chainer and want to integrate powerful, pre-trained BERT models into your natural language processing applications without rebuilding them from scratch.

Not ideal if you want to pre-train new BERT models on custom datasets or need multilingual BERT support; these features are not implemented.

Tags: natural-language-processing, deep-learning-development, text-classification, question-answering, language-understanding
Flags: no license, stale (6 months), no package published, no known dependents
Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 20 / 25


Stars: 224
Forks: 40
Language: Python
License: none
Last pushed: Nov 09, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/soskek/bert-chainer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
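The same record can be fetched programmatically; here is a minimal Python sketch using requests, assuming the endpoint returns JSON (the response schema is not documented here, so the raw payload is printed rather than guessing at field names):

# Fetch the quality record for soskek/bert-chainer from the same
# endpoint as the curl command above.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/nlp/soskek/bert-chainer"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors, e.g. hitting the 100 requests/day limit
print(resp.json())       # JSON response is an assumption based on the API framing above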