rPPG-Toolbox and MA-rPPG-Video-Toolbox
About rPPG-Toolbox
ubicomplab/rPPG-Toolbox
rPPG-Toolbox: Deep Remote PPG Toolbox (NeurIPS 2023)
This platform helps researchers and developers working with camera-based physiological sensing, known as remote photoplethysmography (rPPG). It takes standard video recordings of a person's face and outputs physiological signals like heart rate, without needing physical contact. It's designed for biomedical engineers, computer vision researchers, and data scientists developing or benchmarking non-contact vital sign monitoring systems.
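The core idea behind camera-based rPPG is that the skin's color varies subtly with each heartbeat, so the heart rate can be recovered from the dominant frequency of a facial color trace. A minimal classical sketch of that principle (not the toolbox's own neural methods; the function and variable names here are illustrative) is to detrend the mean green-channel value over time and pick the strongest peak of its power spectrum within a plausible heart-rate band:

```python
import numpy as np

def estimate_heart_rate(green_trace, fps):
    """Estimate heart rate (BPM) from a mean-green-channel trace via FFT.

    Minimal classical rPPG sketch: remove the DC component, compute the
    power spectrum, and pick the dominant frequency within a plausible
    heart-rate band (0.7-4.0 Hz, i.e. 42-240 BPM).
    """
    trace = np.asarray(green_trace, dtype=float)
    trace = trace - trace.mean()                      # remove DC offset
    spectrum = np.abs(np.fft.rfft(trace)) ** 2        # power spectrum
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)  # frequency axis in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)            # physiological band
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                           # Hz -> beats per minute

# A synthetic 1.2 Hz (72 BPM) pulse plus noise stands in for the mean
# green-channel value of a face region across 20 seconds of 30 fps video.
fps = 30
t = np.arange(0, 20, 1.0 / fps)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)
bpm = estimate_heart_rate(trace, fps)
```

The supervised models benchmarked in the toolbox learn far more robust versions of this mapping, but the band-limited spectral peak above is the same signal they are trained to recover.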
About MA-rPPG-Video-Toolbox
yahskapar/MA-rPPG-Video-Toolbox
The source code and pre-trained models for Motion Matters: Neural Motion Transfer for Better Camera Physiological Sensing (WACV 2024, Oral).
This tool helps researchers studying camera-based vital sign monitoring by generating synthetic video data. You provide existing videos of subjects and a separate set of "driving" videos that contain various body movements. The tool then creates new videos where the original subjects appear to perform the movements from the driving videos, while preserving their original physiological signals. This augmented data helps improve the accuracy of models that measure heart rate or other vital signs from video.
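The key bookkeeping point in this augmentation scheme is that each synthesized video keeps the *source* subject's physiological label, since motion transfer changes appearance and movement but is meant to preserve the original pulse signal. A hypothetical sketch of that pairing logic (illustrative names only; the actual toolbox renders new videos with a neural motion-transfer model rather than just indexing files):

```python
from itertools import product

def build_augmented_index(subject_clips, driving_clips):
    """Pair every source clip with every driving clip, carrying the
    source clip's PPG label forward unchanged. Hypothetical structure,
    not the toolbox's real API."""
    augmented = []
    for src, drv in product(subject_clips, driving_clips):
        augmented.append({
            "source_video": src["video"],    # subject whose pulse we keep
            "driving_video": drv["video"],   # motion to transfer onto them
            "ppg_label": src["ppg"],         # label stays with the source
        })
    return augmented

# Illustrative file names, not real dataset paths.
subjects = [{"video": "subj01.mp4", "ppg": "subj01_ppg.csv"},
            {"video": "subj02.mp4", "ppg": "subj02_ppg.csv"}]
driving = [{"video": "motion_a.mp4"}, {"video": "motion_b.mp4"}]
index = build_augmented_index(subjects, driving)
# 2 subjects x 2 driving motions -> 4 augmented training entries
```

Each entry in the resulting index would correspond to one synthesized training video, multiplying the effective size and motion diversity of the dataset without collecting new ground-truth physiological recordings.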