ngoquanghuy99/transformer-summarization
An abstractive text summarization model based on the Transformer decoder (GPT-2 architecture), built with Google's Trax library.
This tool helps content creators, journalists, and researchers quickly distill long articles or documents into concise summaries. It takes full-length text as input and generates shorter, human-readable summaries, saving time in content review and information digestion. It is ideal for anyone needing to grasp the essence of large volumes of text efficiently.
No commits in the last 6 months.
Use this if you need to automatically create brief, standalone summaries from longer pieces of text.
Not ideal if you need to extract specific sentences from the original text (extractive summarization) rather than generate new summary text.
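To make the extractive/abstractive distinction concrete, here is a toy extractive summarizer: it only selects sentences verbatim from the input, whereas an abstractive model like this repo's generates new summary text. This is an illustrative sketch, not code from the repository.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Toy extractive summarizer: scores each sentence by the document-wide
    frequency of its words and returns the top sentences verbatim, in their
    original order. An abstractive model would instead generate new wording."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"\w+", text.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"\w+", s.lower())),
        reverse=True,
    )
    top = set(scored[:n_sentences])
    return " ".join(s for s in sentences if s in top)
```

Because the output is assembled from original sentences, it can never paraphrase; that limitation is exactly what abstractive models address.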
Stars
17
Forks
1
Language
Jupyter Notebook
License
—
Category
Last pushed
Dec 25, 2020
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/ngoquanghuy99/transformer-summarization"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
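For scripted access, the same endpoint can be called from Python. The URL pattern is taken from the curl example above; the response field names in the commented example are illustrative assumptions, not a documented schema.

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Mirrors the curl example: /api/v1/quality/<category>/<owner>/<repo>
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo):
    # Network call; the free tier allows 100 requests/day without a key.
    with urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Example (field names here are hypothetical -- inspect the real payload):
# data = fetch_quality("nlp", "ngoquanghuy99", "transformer-summarization")
# print(data.get("stars"), data.get("last_pushed"))
```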
Higher-rated alternatives
kenlimmj/rouge
A JavaScript implementation of the Recall-Oriented Understudy for Gisting Evaluation (ROUGE)...
uoneway/KoBertSum
KoBertSum is a Korean summarization model that adapts the BertSum model to Korean-language data.
udibr/headlines
Automatically generate headlines to short articles
bheinzerling/pyrouge
A Python wrapper for the ROUGE summarization evaluation package
xiongma/transformer-pointer-generator
An abstractive summarization implementation with Transformer and pointer-generator