ART-Group-it/KERMIT
🐸 KERMIT - A lightweight library to encode and interpret Universal Syntactic Embeddings
This tool helps natural language processing researchers and practitioners improve the performance of their Transformer models on linguistic tasks. It takes parsed sentence structures (parse trees) as input and produces an encoded representation that adds explicit syntactic information to models like BERT. The output also supports interpreting and visualizing how much syntax influences the model's decisions, which is useful for analyzing model behavior.
No commits in the last 6 months.
Use this if you are working with natural language understanding and want to enhance your Transformer models by integrating explicit syntactic information from parse trees to achieve better performance and interpretability.
Not ideal if you are looking for a general-purpose natural language processing library, since this tool focuses on detailed syntactic encoding and requires explicit parse trees as input.
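To illustrate the general pattern the description refers to, here is a minimal conceptual sketch of augmenting a Transformer sentence embedding with a separate syntactic encoding before a downstream classifier. This is not KERMIT's actual API (which is not shown on this page); all names and vectors below are illustrative placeholders.

```python
# Conceptual sketch only: function names and vectors are hypothetical,
# not taken from the KERMIT library itself.

def combine_embeddings(transformer_vec, syntax_vec):
    """Concatenate a Transformer sentence embedding with an encoded
    parse-tree representation; the combined vector would then feed a
    downstream classification head."""
    return list(transformer_vec) + list(syntax_vec)

# Toy stand-ins for BERT's [CLS] embedding and a tree encoding.
bert_cls = [0.12, -0.53, 0.81]   # in practice e.g. 768-dimensional
tree_enc = [0.07, 0.44]          # a syntactic encoding of the parse tree

combined = combine_embeddings(bert_cls, tree_enc)
assert len(combined) == len(bert_cls) + len(tree_enc)
```

The design point is simply that the syntactic signal is kept explicit and separate from the contextual embedding, which is what makes its contribution to the model's decision inspectable.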
Stars: 58
Forks: 9
Language: JavaScript
License: MIT
Category:
Last pushed: Jan 18, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/ART-Group-it/KERMIT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
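For scripted access, the curl call above can be reproduced in Python. This is a hypothetical client sketch: only the endpoint URL is taken from this page, and the shape of the JSON response is assumed, not documented here.

```python
import json
import urllib.request

# Endpoint base taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"

def build_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_repo_stats(owner: str, repo: str) -> dict:
    """Fetch the JSON payload for a repository (requires network access;
    the response structure is an assumption, not documented on this page)."""
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(build_url("ART-Group-it", "KERMIT"))
```

Note the 100 requests/day unauthenticated limit mentioned above; a free key raises it to 1,000/day.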
Higher-rated alternatives
FlagOpen/FlagEmbedding
Retrieval and Retrieval-augmented LLMs
qdrant/fastembed
Fast, Accurate, Lightweight Python library to make State of the Art Embedding
Blaizzy/mlx-embeddings
MLX-Embeddings is the best package for running Vision and Language Embedding models locally on...
Merck/Sapiens
Sapiens is a human antibody language model based on BERT.
amansrivastava17/embedding-as-service
One-Stop Solution to encode sentence to fixed length vectors from various embedding techniques