SAP-samples/acl2022-self-contrastive-decorrelation

Source code for ACL 2022 paper "Self-contrastive Decorrelation for Sentence Embeddings".

Quality score: 39 / 100 (Emerging)

This project provides a new method for generating high-quality sentence embeddings, which are numerical representations of sentences that capture their meaning. It takes raw text sentences as input and produces more accurate embeddings without needing specialized paired data. This is useful for anyone working with natural language processing tasks who needs to compare or understand the semantic similarity between pieces of text.
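To illustrate what "comparing semantic similarity" means in practice: sentence embeddings are usually compared with cosine similarity, regardless of which model produced them. The sketch below uses only plain Python, and the embedding vectors are made-up toy values for illustration, not output from this repository's model:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings for three sentences.
# A real model such as the one in this repo would produce
# much higher-dimensional vectors from raw text.
emb_cat    = [0.9, 0.1, 0.3, 0.0]   # "A cat sat on the mat."
emb_kitten = [0.8, 0.2, 0.4, 0.1]   # "A kitten rested on the rug."
emb_stock  = [0.0, 0.9, 0.1, 0.8]   # "The stock market fell today."

print(cosine_similarity(emb_cat, emb_kitten))  # high: similar meaning
print(cosine_similarity(emb_cat, emb_stock))   # low: unrelated topics
```

Sentences with similar meanings score close to 1.0, unrelated ones closer to 0; semantic search and classification tasks build directly on this comparison.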

No commits in the last 6 months.

Use this if you need to create better sentence embeddings for tasks like semantic search, text classification, or question-answering, especially when you have limited paired data for training.

Not ideal if you are a business user looking for a ready-to-use application; this is a codebase requiring development skills to implement.

Topics: natural-language-processing, text-analytics, information-retrieval, semantic-search, machine-learning-engineering

Status: stale (6 months), no package published, no dependents

Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 26
Forks: 6
Language: Python
License: Apache-2.0
Last pushed: Mar 10, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/SAP-samples/acl2022-self-contrastive-decorrelation"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.