hellohaptik/multi-task-NLP
multi_task_NLP is a utility toolkit that lets NLP developers train a single model for multiple tasks and run inference with it.
This toolkit helps NLP developers efficiently build conversational AI systems. It takes raw text data for various natural language understanding (NLU) tasks and outputs a single, consolidated model capable of performing all those tasks. The end-users are NLP developers who want to create more streamlined and less resource-intensive AI components.
373 stars. No commits in the last 6 months.
Use this if you are an NLP developer building a conversational AI system and need to manage multiple NLU tasks with a single, optimized model to reduce resource consumption and latency.
Not ideal if you are a non-developer or only need to train a single-task NLP model without concerns for multi-task efficiency.
Stars: 373
Forks: 51
Language: Python
License: Apache-2.0
Category:
Last pushed: Nov 21, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/hellohaptik/multi-task-NLP"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
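The same endpoint can be called from a script. A minimal Python sketch, assuming only the URL pattern shown in the curl command above (the shape of the JSON response is not documented here, so no fields are assumed):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality record (anonymous tier: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Example: the URL for this repository's record
url = quality_url("nlp", "hellohaptik", "multi-task-NLP")
```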
Higher-rated alternatives
KoichiYasuoka/esupar
Tokenizer POS-Tagger and Dependency-parser with BERT/RoBERTa/DeBERTa/GPT models for Japanese and...
old-wang-95/easy-bert
easy-bert is a Chinese NLP tool offering many BERT variants with ready-to-use invocation and tuning methods for a fast start; its clear design and code comments also make it well suited for learning.
taishi-i/nagisa_bert
A BERT model for nagisa
ant-louis/netbert
📶 NetBERT: a domain-specific BERT model for computer networking.
AndyTheFactory/RO-Diacritics
Python package for Romanian diacritics restoration