rPPG-Toolbox vs. MA-rPPG-Video-Toolbox

rPPG-Toolbox
                 rPPG-Toolbox                         MA-rPPG-Video-Toolbox
Overall score    53 (Established)                     47 (Emerging)
Maintenance      2/25                                 6/25
Adoption         10/25                                9/25
Maturity         16/25                                16/25
Community        25/25                                16/25
Stars            966                                  80
Forks            248                                  12
Downloads        (none listed)                        (none listed)
Commits (30d)    0                                    0
Language         Python                               Python
License          (none listed)                        (none listed)
Flags            Stale 6m, No Package, No Dependents  No Package, No Dependents

About rPPG-Toolbox

ubicomplab/rPPG-Toolbox

rPPG-Toolbox: Deep Remote PPG Toolbox (NeurIPS 2023)

This platform supports researchers and developers working on camera-based physiological sensing, known as remote photoplethysmography (rPPG). It takes standard video recordings of a person's face and outputs physiological signals such as heart rate, without requiring physical contact. It is aimed at biomedical engineers, computer vision researchers, and data scientists developing or benchmarking non-contact vital sign monitoring systems.
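To make the idea concrete, here is a minimal sketch of the classic "green channel" rPPG estimate that toolboxes in this space benchmark against. This is an illustration, not rPPG-Toolbox's actual API: a real pipeline would average the green channel over a detected face region in each video frame, whereas here that per-frame trace is simulated with a synthetic 72 bpm pulse plus noise.

```python
import numpy as np

# Simulated per-frame green-channel trace: a 1.2 Hz (72 bpm) pulse with noise.
# In a real pipeline this would come from averaging a face ROI in each frame.
fs = 30.0                         # camera frame rate (frames/s)
t = np.arange(0, 10, 1 / fs)      # 10 s of "video"
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.normal(size=t.size)

# Estimate heart rate: restrict the spectrum to a physiologically plausible
# band (0.7-4 Hz, i.e. 42-240 bpm) and take the dominant frequency.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(trace - trace.mean())) ** 2
band = (freqs >= 0.7) & (freqs <= 4.0)
hr_bpm = 60 * freqs[band][np.argmax(power[band])]
print(f"estimated heart rate: {hr_bpm:.1f} bpm")  # ~72 bpm for the 1.2 Hz pulse
```

Deep models in the toolbox replace the fixed green-channel heuristic with learned spatio-temporal features, but the output contract is the same: video in, pulse signal and heart rate out.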

remote vital signs monitoring, non-contact health sensing, biomedical signal processing, computer vision for physiology, physiological measurement

About MA-rPPG-Video-Toolbox

yahskapar/MA-rPPG-Video-Toolbox

The source code and pre-trained models for Motion Matters: Neural Motion Transfer for Better Camera Physiological Sensing (WACV 2024, Oral).

This tool helps researchers studying camera-based vital sign monitoring by generating synthetic video data. You provide existing videos of subjects and a separate set of "driving" videos that contain various body movements. The tool then creates new videos where the original subjects appear to perform the movements from the driving videos, while preserving their original physiological signals. This augmented data helps improve the accuracy of models that measure heart rate or other vital signs from video.

biometric-sensing, physiological-measurement, medical-imaging, computer-vision, data-augmentation

Scores updated daily from GitHub, PyPI, and npm data.