L-Zhe/BTmPG

Code for the paper "Pushing Paraphrase Away from Original Sentence: A Multi-Round Paraphrase Generation Approach" by Zhe Lin and Xiaojun Wan, accepted to Findings of ACL 2021.

Quality score: 36 / 100 (Emerging)

This project helps natural language processing researchers and practitioners generate multiple distinct paraphrases for a given sentence. You input an original sentence, and it outputs several rephrased versions that convey the same meaning but use different wording. This is useful for anyone working on text data augmentation or evaluating paraphrase generation models.
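The core idea named in the paper's title can be sketched as a loop: each round paraphrases the previous round's output, so later rounds drift further from the original sentence. A minimal sketch follows, where `paraphrase` is a hypothetical stand-in for the trained BTmPG model (the real repository's interface may differ):

```python
def paraphrase(sentence: str) -> str:
    # Placeholder rewriter: a real model would rephrase the sentence here.
    # We just tag it so each round produces a visibly different string.
    return sentence + " (rephrased)"

def multi_round_paraphrase(sentence: str, rounds: int = 3) -> list:
    """Return one paraphrase per round, each generated from the previous one."""
    outputs = []
    current = sentence
    for _ in range(rounds):
        current = paraphrase(current)
        outputs.append(current)
    return outputs

print(multi_round_paraphrase("the cat sat on the mat", rounds=2))
```

Because each round consumes the previous round's output rather than the original sentence, the set of paraphrases is more diverse than repeating a single-round generator.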

No commits in the last 6 months.

Use this if you need to create a diverse set of paraphrases for an original sentence, pushing beyond simple rephrasing to produce more varied outputs.

Not ideal if you need a simple, single-round paraphrase generator or a tool for highly nuanced, domain-specific text rewrites without extensive training.

natural-language-generation text-data-augmentation computational-linguistics semantic-similarity content-variation
Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 14
Forks: 5
Language: Python
License: MIT
Last pushed: Aug 10, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/L-Zhe/BTmPG"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
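For programmatic access, the curl command above can be reproduced in Python. This is a minimal sketch: the URL pattern comes from the example above, but the response schema and the authorization header name are assumptions, so adjust them to whatever the API actually returns:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the quality endpoint URL, e.g. quality_url('nlp', 'L-Zhe/BTmPG')."""
    return f"{API_BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str, api_key=None) -> dict:
    """Fetch the quality data for a repo and parse the JSON body."""
    req = urllib.request.Request(quality_url(category, repo))
    if api_key:
        # Hypothetical auth header for keyed access; check the API docs.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Without a key this is limited to 100 requests/day, so cache responses rather than polling.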