yahshibu/nested-ner-tacl2020-transformers
Implementation of Nested Named Entity Recognition using BERT
This is a developer tool for training nested named entity recognition models with BERT. It takes text data, typically from specialized corpora such as ACE-2004 or GENIA, and produces a trained model that identifies nested named entities, i.e., entity mentions contained within other entity mentions. It is aimed at machine learning engineers and NLP researchers who need to build or experiment with sophisticated information extraction systems.
137 stars. No commits in the last 6 months.
Use this if you are an NLP developer or researcher looking to implement or reproduce experiments in nested named entity recognition using BERT-based models.
Not ideal if you are an end-user seeking a pre-built solution for extracting information from text without needing to train custom models or work with code.
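To make "nested" concrete: a mention can sit entirely inside another mention, which is why flat BIO tagging is insufficient. A minimal sketch of the idea, using an assumed (start, end, label) span representation; this is illustrative only, not this repository's actual output schema:

```python
# Illustrative only: nested entity mentions as (start, end, label) token spans.
# The span format and labels here are assumptions, not the repo's data format.
sentence = "The Bank of England raised rates ."
tokens = sentence.split()

# "Bank of England" is an ORG; "England" inside it is a nested GPE.
entities = [
    (1, 4, "ORG"),   # tokens[1:4] -> "Bank of England"
    (3, 4, "GPE"),   # tokens[3:4] -> "England"
]

def mention_text(span):
    """Resolve a span back to its surface text and label."""
    start, end, label = span
    return " ".join(tokens[start:end]), label

mentions = [mention_text(e) for e in entities]
print(mentions)
# -> [('Bank of England', 'ORG'), ('England', 'GPE')]
```

A flat tagger could emit only one of these two labels per token; nested NER models recover both overlapping spans.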
Stars: 137
Forks: 24
Language: Python
License: GPL-3.0
Category: NLP
Last pushed: Oct 29, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/yahshibu/nested-ner-tacl2020-transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
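The same endpoint can be called programmatically. A minimal Python sketch, assuming the category/owner/repo path layout shown in the curl example above (the response schema is not documented here, so the fetch is left as a commented hint):

```python
from urllib.parse import quote

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality data."""
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("nlp", "yahshibu", "nested-ner-tacl2020-transformers")
print(url)

# To actually fetch (response is assumed to be JSON; adapt to the real schema):
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```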
Higher-rated alternatives
charles9n/bert-sklearn
A scikit-learn wrapper for Google's BERT model
jidasheng/bi-lstm-crf
A PyTorch implementation of the BI-LSTM-CRF model.
howl-anderson/seq2annotation
A general-purpose sequence labeling library based on TensorFlow & PaddlePaddle (currently includes BiLSTM+CRF, Stacked-BiLSTM+CRF, and ...
kamalkraj/BERT-NER
Pytorch-Named-Entity-Recognition-with-BERT
kamalkraj/Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs
Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs