linonetwo/segmentit
A Chinese word segmentation package that works in any JavaScript environment; forked from leizongmin/node-segment
This tool breaks Chinese text into individual words or phrases, a process called word segmentation. It takes raw Chinese text as input and outputs a list of segmented words, optionally tagged with grammatical categories such as noun, verb, or adjective. It's aimed at anyone who needs to analyze or process Chinese text in web browsers or desktop applications.
311 stars. No commits in the last 6 months. Available on npm.
Use this if you need to accurately segment Chinese text and assign part-of-speech tags within a web browser (like for an interactive web tool) or a desktop application built with Electron.
Not ideal if your primary development environment is Node.js for backend processing, as other established libraries might be more suitable and performant.
Stars
311
Forks
17
Language
JavaScript
License
MIT
Category
Last pushed
Apr 13, 2023
Commits (30d)
0
Dependencies
1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/linonetwo/segmentit"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
PyThaiNLP/pythainlp
Thai natural language processing in Python
hankcs/HanLP
Natural Language Processing for the next decade. Tokenization, Part-of-Speech Tagging, Named...
jacksonllee/pycantonese
Cantonese Linguistics and NLP
dongrixinyu/JioNLP
Chinese NLP preprocessing and parsing toolkit: accurate, efficient, and easy to use. A Chinese NLP Preprocessing & Parsing Package. www.jionlp.com
hankcs/pyhanlp
Chinese word segmentation