agamm/semantic-split
A Python library to chunk/group your texts based on semantic similarity.
When preparing documents for AI applications, content must be organized so a model can work with it efficiently. This tool takes long articles or documents and breaks them into smaller chunks of semantically similar sentences. This pre-processing helps AI systems retrieve the right context, leading to more accurate and cost-effective answers for users asking questions about the content.
103 stars. No commits in the last 6 months.
Use this if you are building an AI application where you need to provide focused, relevant sections of long texts to a language model or a vector database to improve accuracy and reduce processing costs.
Not ideal if you need a simple, fixed-size text chunking solution that doesn't consider the meaning of sentences, or if your application doesn't involve semantic search or language models.
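The grouping idea behind semantic chunking can be sketched with a toy bag-of-words embedding. This is a minimal, self-contained illustration of the general technique, not semantic-split's actual API (the library uses neural sentence embeddings; all function names below are hypothetical):

```python
import math
import re


def embed(sentence, vocab):
    # Toy bag-of-words vector; real chunkers use neural sentence embeddings.
    words = re.findall(r"[a-z']+", sentence.lower())
    return [words.count(w) for w in vocab]


def cosine(a, b):
    # Cosine similarity between two vectors (0.0 if either is all-zero).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def semantic_chunks(sentences, threshold=0.15):
    # Greedy grouping: keep appending sentences to the current chunk
    # while each one is similar enough to the previous sentence;
    # start a new chunk when similarity drops below the threshold.
    vocab = sorted({w for s in sentences for w in re.findall(r"[a-z']+", s.lower())})
    chunks = [[sentences[0]]]
    for prev, cur in zip(sentences, sentences[1:]):
        if cosine(embed(prev, vocab), embed(cur, vocab)) >= threshold:
            chunks[-1].append(cur)
        else:
            chunks.append([cur])
    return [" ".join(c) for c in chunks]


sentences = [
    "Cats are small domestic animals.",
    "Many cats enjoy sleeping in warm places.",
    "The stock market fell sharply today.",
]
print(semantic_chunks(sentences))
```

The two cat sentences share vocabulary and end up in one chunk, while the unrelated stock-market sentence starts a new one. A production chunker would swap the bag-of-words vectors for sentence embeddings and tune the threshold on real data.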
Stars
103
Forks
9
Language
Python
License
—
Category
—
Last pushed
Jul 11, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/agamm/semantic-split"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jparkerweb/semantic-chunking
🍱 semantic-chunking ⇢ semantically create chunks from large documents for passing to LLM workflows
drittich/SemanticSlicer
🧠✂️ SemanticSlicer — A smart text chunker for LLM-ready documents.
smart-models/Normalized-Semantic-Chunker
Cutting-edge tool that unlocks the full potential of semantic chunking
ndgigliotti/afterthoughts
Sentence-aware embeddings using late chunking with transformers.
ReemHal/Semantic-Text-Segmentation-with-Embeddings
Uses GloVe embeddings and greedy sequence segmentation to semantically segment a text document...