ChaokunHong/MetaScreener

AI-powered tool for efficient abstract and PDF screening in systematic reviews.

Score: 70 / 100 (Verified)

This tool helps researchers, academics, and systematic review specialists screen large numbers of papers for systematic reviews. You upload search results from databases such as PubMed or Scopus along with your review criteria (PICO/PEO/SPIDER), and it returns an include/exclude decision for each paper's title and abstract, complete with a confidence score. High-confidence decisions are automated, while uncertain cases are flagged for human review, significantly speeding up screening.
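The confidence-gated triage described above can be sketched as follows. This is a hypothetical illustration of the general pattern, not MetaScreener's actual API: the `ScreeningDecision` shape, field names, and the 0.9 threshold are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class ScreeningDecision:
    # Hypothetical record shape; MetaScreener's real output may differ.
    paper_id: str
    decision: str      # "include" or "exclude"
    confidence: float  # 0.0 to 1.0

def triage(decisions, threshold=0.9):
    """Split decisions into automated calls and cases flagged for human review."""
    automated, needs_review = [], []
    for d in decisions:
        (automated if d.confidence >= threshold else needs_review).append(d)
    return automated, needs_review

decisions = [
    ScreeningDecision("pmid:100", "include", 0.97),
    ScreeningDecision("pmid:101", "exclude", 0.62),
]
auto, review = triage(decisions)
```

With the sample input above, the first paper clears the threshold and is decided automatically, while the second is routed to a human reviewer.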

1,304 stars. Actively maintained with 226 commits in the last 30 days. Available on PyPI.

Use this if you conduct systematic reviews and need to efficiently sift through hundreds or thousands of research paper abstracts to identify relevant studies.

Not ideal if your screening needs are very small-scale (e.g., fewer than 50 papers) or if you require full-text screening without title/abstract pre-screening.

systematic-review literature-review research-screening evidence-synthesis academic-research
Maintenance: 22 / 25
Adoption: 10 / 25
Maturity: 24 / 25
Community: 14 / 25


Stars: 1,304
Forks: 47
Language: Python
License: Apache-2.0
Last pushed: Feb 27, 2026
Commits (30d): 226
Dependencies: 22

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ChaokunHong/MetaScreener"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
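The same endpoint can be queried from Python with the standard library. The URL pattern comes from the curl command above; the payload field names used in `summarize` (`score`, `stars`) are assumptions, so inspect the actual JSON response before relying on them.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record for a repository from the public endpoint."""
    url = f"{API_BASE}/{owner}/{repo}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def summarize(record: dict) -> str:
    # "score" and "stars" are assumed field names, not documented ones.
    return f"{record.get('score', '?')}/100, {record.get('stars', '?')} stars"

# Example (requires network access):
# record = fetch_quality("ChaokunHong", "MetaScreener")
# print(summarize(record))
```

The fetch call is kept separate from the formatting helper so the response parsing can be tested offline against a canned payload.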