Nealcly/BiLSTM-LAN
Hierarchically-Refined Label Attention Network for Sequence Labeling
This project implements a hierarchically-refined label attention network (BiLSTM-LAN) for sequence labeling: accurately assigning a label to each word in a sentence, which is crucial for understanding text. It takes raw text data and pre-trained word embeddings, then outputs a trained model capable of high-accuracy sequence labeling. It is aimed at NLP practitioners and researchers working on tasks such as Part-of-Speech tagging and Named Entity Recognition.
293 stars. No commits in the last 6 months.
Use this if you need to train a robust model for sequence labeling tasks like Part-of-Speech tagging or Named Entity Recognition with state-of-the-art performance.
Not ideal if you are looking for a pre-trained, ready-to-use model for general text classification or sentiment analysis.
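The core idea behind the repository, label attention, can be sketched in a few lines: each word representation from a BiLSTM layer attends over a set of label embeddings, and the resulting label summary is concatenated back onto the word representation. The sketch below is illustrative only, assuming hypothetical names (`label_attention`, `H`, `L`) and plain NumPy; the actual model stacks several BiLSTM-LAN layers and learns the embeddings end to end.

```python
import numpy as np

def label_attention(H, L):
    """Toy label-attention layer (illustrative, not the repo's implementation).

    H: (seq_len, d) word representations from a BiLSTM layer
    L: (num_labels, d) label embeddings
    Returns (seq_len, 2*d): words concatenated with their label summaries.
    """
    scores = H @ L.T                               # (seq_len, num_labels) word-label scores
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability for softmax
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)              # softmax over labels per word
    label_summary = A @ L                          # (seq_len, d) attention-weighted labels
    return np.concatenate([H, label_summary], axis=1)

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))    # 5 words, hidden size 8
L = rng.normal(size=(10, 8))   # 10 labels
out = label_attention(H, L)
print(out.shape)               # (5, 16)
```

In the full model this label-aware output feeds the next BiLSTM layer, so label information is refined hierarchically rather than only used at the final prediction step.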
Stars
293
Forks
49
Language
Python
License
Apache-2.0
Category
nlp
Last pushed
Apr 09, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/Nealcly/BiLSTM-LAN"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
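For use from a script rather than the shell, the same endpoint can be called with the standard library. The sketch below only builds and prints the URL; the actual fetch is left commented out since the response schema is not documented here and anonymous access is limited to 100 requests/day. The helper name `quality_url` is a hypothetical convenience, not part of the API.

```python
import json
import urllib.request

# Endpoint base taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL (hypothetical helper)."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("nlp", "Nealcly", "BiLSTM-LAN")
print(url)

# To actually fetch the JSON payload (consumes one anonymous request):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```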
Higher-rated alternatives
charles9n/bert-sklearn
a sklearn wrapper for Google's BERT model
jidasheng/bi-lstm-crf
A PyTorch implementation of the BI-LSTM-CRF model.
howl-anderson/seq2annotation
A general sequence labeling library based on TensorFlow & PaddlePaddle (currently includes BiLSTM+CRF, Stacked-BiLSTM+CRF, and ...
kamalkraj/BERT-NER
Pytorch-Named-Entity-Recognition-with-BERT
kamalkraj/Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs
Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs