cyd622/nlp-jieba
Jieba Chinese word segmentation (PHP version): aiming to be the best PHP component for Chinese word segmentation and tokenization
This tool breaks Chinese text into individual words, a necessary first step for text analysis, search indexing, and sentiment analysis, since written Chinese does not separate words with spaces. It takes raw Chinese sentences and outputs a list of segmented words. Anyone working with Chinese language data, such as content analysts, market researchers, or data scientists, would find it useful for processing text.
No commits in the last 6 months.
Use this if you need to precisely segment Chinese text from documents, articles, or user-generated content for analysis or search functionalities.
Not ideal if your primary need is for English text analysis or if you are not working within a PHP environment.
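Jieba-style tokenizers are dictionary-driven (the original jieba combines dictionary matching with an HMM for words not in the dictionary). As a rough illustration of the core idea only, not this library's actual algorithm or API, here is a minimal forward-maximum-matching segmenter in Python; the vocabulary and sample sentence are invented for the example:

```python
def fmm_segment(text, dictionary, max_len=4):
    """Forward maximum matching: at each position, greedily take the
    longest dictionary word; fall back to a single character."""
    words = []
    i = 0
    while i < len(text):
        for size in range(min(max_len, len(text) - i), 0, -1):
            piece = text[i:i + size]
            if size == 1 or piece in dictionary:
                words.append(piece)
                i += size
                break
    return words

# Hypothetical toy vocabulary for demonstration.
vocab = {"我们", "需要", "中文", "分词", "中文分词"}
print(fmm_segment("我们需要中文分词", vocab))
# → ['我们', '需要', '中文分词']
```

Note how the greedy longest match prefers 中文分词 ("Chinese word segmentation") over splitting it into 中文 + 分词; production segmenters like jieba resolve such ambiguities with word frequencies rather than pure greed.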
Stars: 17
Forks: 6
Language: PHP
License: MIT
Category:
Last pushed: Feb 28, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/cyd622/nlp-jieba"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
Higher-rated alternatives
PyThaiNLP/pythainlp
Thai natural language processing in Python
hankcs/HanLP
Natural Language Processing for the next decade. Tokenization, Part-of-Speech Tagging, Named...
jacksonllee/pycantonese
Cantonese Linguistics and NLP
dongrixinyu/JioNLP
A Chinese NLP preprocessing and parsing package: accurate, efficient, easy to use (www.jionlp.com)
hankcs/pyhanlp
Chinese word segmentation