L-Zhe/BTmPG
Code for the paper "Pushing Paraphrase Away from Original Sentence: A Multi-Round Paraphrase Generation Approach" by Zhe Lin and Xiaojun Wan, accepted to Findings of ACL 2021.
This project helps NLP researchers and practitioners generate multiple distinct paraphrases for a given sentence: you input an original sentence, and it outputs several rephrased versions that preserve the meaning while varying the wording. This is useful for text data augmentation and for evaluating paraphrase generation models.
No commits in the last 6 months.
Use this if you need to create a diverse set of paraphrases for an original sentence, pushing beyond simple rephrasing to produce more varied outputs.
Not ideal if you need a simple, single-round paraphrase generator or a tool for highly nuanced, domain-specific text rewrites without extensive training.
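The core idea in the paper's title can be illustrated generically: each round feeds the previous paraphrase back in as input, pushing later outputs further from the original sentence. The sketch below is illustrative only and is not the repo's actual code; `rewrite_fn` is a toy stand-in for the real BTmPG model.

```python
def multi_round_paraphrase(sentence, rewrite_fn, rounds=3):
    """Apply rewrite_fn repeatedly, collecting one paraphrase per round.

    rewrite_fn is a placeholder for a trained paraphrase model; here any
    string-to-string function will do for demonstration purposes.
    """
    outputs = []
    current = sentence
    for _ in range(rounds):
        current = rewrite_fn(current)
        outputs.append(current)
    return outputs

# Toy stand-in: reverses word order each round (a real model would rephrase).
toy_rewrite = lambda s: " ".join(reversed(s.split()))
print(multi_round_paraphrase("the cat sat", toy_rewrite, rounds=2))
```

With a real model in place of `toy_rewrite`, each round's output diverges further from the source sentence, which is how the approach produces a diverse set rather than near-duplicates.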
Stars: 14
Forks: 5
Language: Python
License: MIT
Category:
Last pushed: Aug 10, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/L-Zhe/BTmPG"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
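The same endpoint can be queried from Python. This is a minimal sketch using only the standard library; the JSON response schema is an assumption, so inspect the actual payload before relying on specific keys.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the API URL for a repo, e.g. category='nlp', repo='L-Zhe/BTmPG'."""
    return f"{API_BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    """GET the endpoint and parse the JSON body (keyless tier: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

# Example (requires network access):
# data = fetch_quality("nlp", "L-Zhe/BTmPG")
# print(data)
```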
Higher-rated alternatives
princeton-nlp/SimCSE
[EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings (https://arxiv.org/abs/2104.08821)
n-waves/multifit
The code to reproduce results from paper "MultiFiT: Efficient Multi-lingual Language Model...
yxuansu/SimCTG
[NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation
alibaba-edu/simple-effective-text-matching
Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features".
Shark-NLP/OpenICL
OpenICL is an open-source framework to facilitate research, development, and prototyping of...