ngoquanghuy99/transformer-summarization

An abstractive text summarization model based on the Transformer decoder architecture (GPT-2 style), implemented with Google's Trax library.

Score: 19 / 100 (Experimental)

This tool helps content creators, journalists, and researchers quickly distill long articles or documents into concise summaries. It takes full-length text as input and generates shorter, human-readable summaries, saving time in content review and information digestion. It is ideal for anyone needing to grasp the essence of large volumes of text efficiently.

No commits in the last 6 months.

Use this if you need to automatically create brief, standalone summaries from longer pieces of text.

Not ideal if you need to extract specific sentences from the original text (extractive summarization) rather than generate new summary text.

content-creation journalism research-analysis information-digestion text-analysis
No license · Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 5 / 25


Stars: 17
Forks: 1
Language: Jupyter Notebook
License: none
Last pushed: Dec 25, 2020
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/ngoquanghuy99/transformer-summarization"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.