asreview/asreview-makita

Workflow generator for simulation studies using the command line interface of ASReview LAB

Score: 54 / 100 (Established)

This tool helps researchers studying systematic reviews automate the setup of large-scale simulation studies. You provide your datasets, and it generates the necessary framework, code, and batch scripts to run numerous simulations. It's designed for researchers who want to test the performance of different active learning models in a systematic review context.
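A rough sketch of that workflow from the command line (the `basic` template and generated `jobs.sh` follow the Makita documentation; the dataset filename is a placeholder):

```shell
# Install ASReview LAB together with the Makita extension from PyPI.
pip install asreview asreview-makita

# Place your fully labeled dataset(s) in a data/ folder.
mkdir -p data
cp my_labels.csv data/   # my_labels.csv is a placeholder name

# Generate the simulation framework, code, and batch scripts.
asreview makita template basic

# Run the generated simulations (jobs.bat on Windows).
sh jobs.sh
```

Makita only generates the scripts; the simulations themselves are executed by the ASReview LAB command line interface when you run the job file.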

No commits in the last 6 months. Available on PyPI.

Use this if you need to systematically evaluate various active learning strategies for systematic reviews and ensure your research is fully reproducible.

Not ideal if you are looking for a tool that executes the simulations directly or writes your research paper for you.

Tags: systematic-reviews, research-automation, active-learning, simulation-studies, literature-screening
Maintenance: 2 / 25 (stale for 6 months)
Adoption: 8 / 25
Maturity: 25 / 25
Community: 19 / 25


Stars: 42
Forks: 19
Language: Python
License: MIT
Last pushed: Jun 23, 2025
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/asreview/asreview-makita"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
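For programmatic access, the endpoint path from the curl example above can be built in Python; `quality_url` is a hypothetical helper (only the URL layout is taken from the example, and the response schema is not documented here):

```python
import json
from urllib.request import urlopen  # only needed for the live fetch below

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL (path layout taken from the curl example)."""
    return f"{BASE}/{collection}/{owner}/{repo}"

url = quality_url("ml-frameworks", "asreview", "asreview-makita")

# Uncomment to fetch live data (counts against the daily request quota):
# with urlopen(url) as resp:
#     data = json.load(resp)
```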