SAP-samples/acl2022-self-contrastive-decorrelation
Source code for ACL 2022 paper "Self-contrastive Decorrelation for Sentence Embeddings".
This project provides a new method for generating high-quality sentence embeddings, i.e. numerical vector representations of sentences that capture their meaning. It takes raw text sentences as input and produces more accurate embeddings without requiring specialized paired training data. This is useful for anyone working on natural language processing tasks who needs to compare or measure the semantic similarity of pieces of text.
No commits in the last 6 months.
Use this if you need to create better sentence embeddings for tasks like semantic search, text classification, or question-answering, especially when you have limited paired data for training.
Not ideal if you are a business user looking for a ready-to-use application; this is a codebase requiring development skills to implement.
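To make the "semantic similarity" use case concrete, here is a minimal sketch of how sentence embeddings are typically compared once a model (such as the one trained by this repo) has produced them. The vectors and sentence labels below are toy assumptions, not output of this project; real embeddings have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings for three sentences.
emb_cat = [0.9, 0.1, 0.3, 0.0]       # "A cat sits on the mat."
emb_kitten = [0.85, 0.15, 0.25, 0.05]  # "A kitten rests on a rug."
emb_invoice = [0.0, 0.8, 0.1, 0.6]   # "Please pay the attached invoice."

print(cosine_similarity(emb_cat, emb_kitten))   # high: similar meaning
print(cosine_similarity(emb_cat, emb_invoice))  # low: unrelated meaning
```

Better embedding methods place semantically similar sentences closer together under exactly this kind of comparison, which is what drives semantic search and clustering.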
Stars: 26
Forks: 6
Language: Python
License: Apache-2.0
Category: NLP
Last pushed: Mar 10, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/SAP-samples/acl2022-self-contrastive-decorrelation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
n-waves/multifit
The code to reproduce results from paper "MultiFiT: Efficient Multi-lingual Language Model...
princeton-nlp/SimCSE
[EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821
yxuansu/SimCTG
[NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation
alibaba-edu/simple-effective-text-matching
Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features".
Shark-NLP/OpenICL
OpenICL is an open-source framework to facilitate research, development, and prototyping of...