shubhamdey01/Minor-Project
This project evaluates six abstractive summarization models (Seq2Seq with Attention, BERTSUMABS, BART, T5, PEGASUS, XLNet) on benchmark datasets (CNN/DailyMail, XSum, Gigaword). Model outputs were scored with ROUGE and BLEU, n-gram-overlap metrics used as proxies for fluency, coherence, and content accuracy.
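For context on the scoring mentioned above, here is a minimal, self-contained sketch of ROUGE-1 F1 (unigram overlap between a candidate summary and a reference). It is a simplified illustration only: the official ROUGE implementation additionally applies stemming, stopword options, and bootstrap resampling, none of which are modeled here.

```python
from collections import Counter

def rouge1_f(candidate: str, reference: str) -> float:
    """Simplified ROUGE-1 F1: unigram-overlap F-score on whitespace tokens."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped unigram overlap between candidate and reference.
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

Usage: `rouge1_f("the cat sat on the mat", "the cat is on the mat")` gives 5/6 ≈ 0.83, since five of six unigrams overlap in both directions.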
No commits in the last 6 months.
Stars: —
Forks: —
Language: Jupyter Notebook
License: MIT
Category: —
Last pushed: Jan 02, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/shubhamdey01/Minor-Project"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
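The same endpoint can be called from Python. A minimal sketch, assuming only the URL pattern shown in the curl command above; the response schema and any `owner`/`repo` parameter names are assumptions, and no request is made here.

```python
# Hypothetical helper for the pt-edge quality endpoint shown above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL (no network call is made)."""
    return f"{BASE}/{owner}/{repo}"

# To actually fetch the data (subject to the 100 requests/day limit):
#   import urllib.request, json
#   data = json.load(urllib.request.urlopen(quality_url("shubhamdey01", "Minor-Project")))
```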
Higher-rated alternatives
abelriboulot/onnxt5
Summarization, translation, sentiment-analysis, text-generation and more at blazing speed using...
pszemraj/textsum
CLI & Python API to easily summarize text-based files with transformers
rojagtap/transformer-abstractive-summarization
Abstractive Text Summarization using Transformer
HHousen/DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm Machine...
abhilash1910/LongPegasus
LongPegasus package is used for inducing longformer self attention over base pegasus abstractive...