charles9n/bert-sklearn
A scikit-learn wrapper for Google's BERT model
This tool gives data scientists and machine learning engineers a scikit-learn-style interface for fine-tuning BERT on text tasks. It takes raw text or text pairs and their labels as input, and trains models for classification, regression, or sequence labeling. The output is a fitted model that predicts on new text and can be saved and reloaded for reuse.
301 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to quickly adapt advanced language models like BERT, SciBERT, or BioBERT for specific text understanding tasks without deep expertise in model architecture.
Not ideal if you are looking for a no-code solution or if your primary focus is on traditional machine learning models rather than deep learning for natural language.
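The package follows the standard scikit-learn estimator contract (fit, predict, score on raw text). As a minimal sketch, the workflow looks like the mock below; `MajorityTextClassifier` is a dependency-free stand-in written for illustration, not the package's actual code (per its README, the real entry points are estimators such as `BertClassifier`, but verify names against your installed version):

```python
# Illustrative mock of the sklearn estimator contract that bert-sklearn
# follows: fit/predict/score on raw text. MajorityTextClassifier is a
# stand-in so the workflow runs without downloading BERT weights.
from collections import Counter

class MajorityTextClassifier:
    """Tiny sklearn-style estimator: always predicts the most frequent label."""
    def fit(self, X, y):
        # Learn the majority class from the training labels.
        self.majority_ = Counter(y).most_common(1)[0][0]
        return self  # sklearn convention: fit returns self

    def predict(self, X):
        # Predict the learned majority label for every input text.
        return [self.majority_ for _ in X]

    def score(self, X, y):
        # Mean accuracy, matching sklearn's classifier score() semantics.
        preds = self.predict(X)
        return sum(p == t for p, t in zip(preds, y)) / len(y)

X_train = ["great movie", "terrible plot", "loved it", "awful"]
y_train = ["pos", "neg", "pos", "pos"]

model = MajorityTextClassifier().fit(X_train, y_train)
print(model.predict(["fine film"]))   # ['pos']
print(model.score(X_train, y_train))  # 0.75
```

With bert-sklearn installed, the same fit/predict/score calls apply to its BERT-backed estimators, which is what makes the wrapper drop into existing scikit-learn pipelines.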
Stars
301
Forks
70
Language
Jupyter Notebook
License
Apache-2.0
Category
Last pushed
Oct 26, 2022
Commits (30d)
0
Dependencies
7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/charles9n/bert-sklearn"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
jidasheng/bi-lstm-crf
A PyTorch implementation of the BI-LSTM-CRF model.
howl-anderson/seq2annotation
A general-purpose sequence labeling library based on TensorFlow & PaddlePaddle (currently includes BiLSTM+CRF, Stacked-BiLSTM+CRF, and ...
kamalkraj/BERT-NER
Pytorch-Named-Entity-Recognition-with-BERT
kamalkraj/Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs
Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs
guillaumegenthial/tf_ner
Simple and Efficient Tensorflow implementations of NER models with tf.estimator and tf.data