Arjun-08/Sequence-to-sequence-networks-for-multi-text-document-summarization
This repository explores advanced sequence-to-sequence networks and transformer models, such as BERT, BART, PEGASUS, and T5, for multi-document summarization in the medical domain. It fine-tunes these models on large datasets such as CORD-19 and a Biomedical Abstracts dataset from Hugging Face.
No commits in the last 6 months.
Stars: 2
Forks: —
Language: Jupyter Notebook
License: —
Category: —
Last pushed: May 17, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/Arjun-08/Sequence-to-sequence-networks-for-multi-text-document-summarization"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
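For programmatic access, the endpoint above can be wrapped in a small helper. This is a minimal sketch using only the Python standard library; the `quality_url` and `fetch_quality` names are illustrative, and the assumption that the endpoint returns JSON has not been verified against the API.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL (same shape as the curl example)."""
    return f"{API_BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    """Fetch quality data for a repo; assumes the endpoint returns a JSON object."""
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

# Reconstructs the exact URL shown in the curl example above.
url = quality_url(
    "nlp",
    "Arjun-08/Sequence-to-sequence-networks-for-multi-text-document-summarization",
)
```

With a key (once obtained), the same request would presumably carry it as a header or query parameter; the API's authentication scheme is not documented on this page, so none is shown here.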
Higher-rated alternatives
kenlimmj/rouge
A JavaScript implementation of the Recall-Oriented Understudy for Gisting Evaluation (ROUGE)...
uoneway/KoBertSum
KoBertSum is a Korean summarization model that adapts the BertSum model to Korean-language data.
udibr/headlines
Automatically generate headlines to short articles
bheinzerling/pyrouge
A Python wrapper for the ROUGE summarization evaluation package
xiongma/transformer-pointer-generator
An abstractive summarization implementation with Transformer and pointer-generator
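Several of the alternatives above (kenlimmj/rouge, bheinzerling/pyrouge) implement the ROUGE metric used to evaluate summarizers like those in this repository. As a rough illustration, ROUGE-1 recall, the simplest variant, counts how many reference unigrams (with clipped multiplicity) appear in the candidate summary. This sketch omits stemming, stopword handling, and the precision/F1 variants that full implementations provide:

```python
from collections import Counter

def rouge1_recall(candidate: str, reference: str) -> float:
    """ROUGE-1 recall: fraction of reference unigrams matched in the candidate.

    Counts are clipped, so a word repeated in the candidate cannot match
    more reference occurrences than actually exist.
    """
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    if not ref:
        return 0.0
    overlap = sum(min(cand[w], ref[w]) for w in ref)
    return overlap / sum(ref.values())

# 5 of the 6 reference unigrams appear in the candidate -> 5/6
score = rouge1_recall("the cat sat on the mat", "the cat is on the mat")
```

Full toolkits (like the pyrouge wrapper listed above) also report ROUGE-2 and ROUGE-L, which are the variants typically cited in summarization papers.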