ChaokunHong/MetaScreener
AI-powered tool for efficient abstract and PDF screening in systematic reviews.
This tool helps researchers, academics, and systematic review specialists rapidly screen large volumes of research papers. You upload search results from databases such as PubMed or Scopus along with your review criteria (PICO/PEO/SPIDER), and it returns an include/exclude decision for each paper's title and abstract, with a confidence score. High-confidence decisions are automated, while uncertain cases are flagged for human review, which significantly speeds up the screening process.
1,304 stars. Actively maintained with 226 commits in the last 30 days. Available on PyPI.
Use this if you conduct systematic reviews and need to efficiently sift through hundreds or thousands of research paper abstracts to identify relevant studies.
Not ideal if your screening needs are very small-scale (e.g., fewer than 50 papers) or if you require full-text screening without title/abstract pre-screening.
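The confidence-threshold triage described above (automate high-confidence calls, flag the rest for a human) can be sketched as follows. This is an illustrative sketch only: the 0.9 threshold, the `Decision` record shape, and the `triage` function are assumptions for explanation, not MetaScreener's actual interface.

```python
# Illustrative sketch of confidence-threshold triage for title/abstract
# screening: high-confidence model decisions are accepted automatically,
# low-confidence ones are routed to a human reviewer. The 0.9 threshold
# and record shape are assumptions, not MetaScreener's actual API.
from dataclasses import dataclass

@dataclass
class Decision:
    paper_id: str
    include: bool      # model's include/exclude call
    confidence: float  # model's confidence in that call, 0.0-1.0

def triage(decisions, threshold=0.9):
    """Split decisions into an auto-resolved queue and a human-review queue."""
    auto, review = [], []
    for d in decisions:
        (auto if d.confidence >= threshold else review).append(d)
    return auto, review

decisions = [
    Decision("PMID:1", include=True, confidence=0.97),
    Decision("PMID:2", include=False, confidence=0.62),
]
auto, review = triage(decisions)
# PMID:1 is auto-resolved; PMID:2 goes to human review
```

Lowering the threshold automates more decisions at the cost of more screening errors; raising it sends more papers to the human queue.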
Stars: 1,304
Forks: 47
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 27, 2026
Commits (30d): 226
Dependencies: 22
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ChaokunHong/MetaScreener"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
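The same endpoint can be called from Python with only the standard library. The URL comes from the curl example above; note that the JSON response schema is not documented here, so the fetch helper simply decodes whatever the API returns.

```python
# Sketch of calling the quality API from Python. The endpoint URL is taken
# from the curl example above; the response is decoded as generic JSON
# because its schema is not documented in this listing.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality record (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("ChaokunHong", "MetaScreener"))
```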
Related tools
asreview/asreview
Active learning for systematic reviews
aritraroy24/ComProScanner
A Python package for extracting composition-property data from scientific articles for building databases
MikkelVembye/AIscreenR
AI screening tools in R for systematic reviewing
EvoTestOps/AISysRev
Web-application for LLM based title-abstract screening in systematic reviews
IAAR-Shanghai/SurveyX
Academic Survey Paper Generation.