asreview/asreview-makita
Workflow generator for simulation studies using the command line interface of ASReview LAB
This tool helps researchers studying systematic reviews automate the setup of large-scale simulation studies. You provide your datasets, and it generates the necessary framework, code, and batch scripts to run numerous simulations. It's designed for researchers who want to test the performance of different active learning models in a systematic review context.
No commits in the last 6 months. Available on PyPI.
Use this if you need to systematically evaluate various active learning strategies for systematic reviews and ensure your research is fully reproducible.
Not ideal if you are looking for a tool that executes the simulations directly or writes your research paper for you.
Stars: 42
Forks: 19
Language: Python
License: MIT
Category:
Last pushed: Jun 23, 2025
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/asreview/asreview-makita"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Related frameworks
neuml/paperai
📄 🤖 AI for medical and scientific papers
supriya46788/Research-Paper-Organizer
Open-source beginner-friendly project
allenai/papermage
library supporting NLP and CV research on scientific papers
alibaba/AliceMind
ALIbaba's Collection of Encoder-decoders from MinD (Machine IntelligeNce of Damo) Lab
Tavris1/AI-Toolkit-Easy-Install
One-click Portable Windows installation of 'AI-Toolkit by Ostris'