yahskapar/MA-rPPG-Video-Toolbox

The source code and pre-trained models for Motion Matters: Neural Motion Transfer for Better Camera Physiological Sensing (WACV 2024, Oral).

Quality score: 47 / 100 (Emerging)

This tool helps researchers studying camera-based vital sign monitoring by generating synthetic video data. You provide existing videos of subjects and a separate set of "driving" videos that contain various body movements. The tool then creates new videos where the original subjects appear to perform the movements from the driving videos, while preserving their original physiological signals. This augmented data helps improve the accuracy of models that measure heart rate or other vital signs from video.

Use this if you need to create more diverse training data with controlled motion for camera-based physiological sensing models, especially when real-world data with varied motion is limited.

Not ideal if you are looking for a tool to directly analyze physiological signals from videos or if you do not work with machine learning models for camera-based vital sign monitoring.

biometric-sensing physiological-measurement medical-imaging computer-vision data-augmentation
No package · No dependents

Maintenance: 6 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 80
Forks: 12
Language: Python
License: not listed
Last pushed: Dec 29, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/yahskapar/MA-rPPG-Video-Toolbox"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
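For scripted access, the endpoint above can be queried from Python instead of curl. A minimal sketch follows; note that the response schema is not documented here, so the field names (`score`, `categories`) and the sample payload are assumptions for illustration, chosen to mirror the scores shown on this page.

```python
import json
from urllib.request import urlopen

# Endpoint from the listing above; the open tier allows 100 requests/day.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "computer-vision/yahskapar/MA-rPPG-Video-Toolbox")

def parse_quality(payload: str) -> dict:
    """Extract the headline fields from a quality-API response.

    The keys 'score' and 'categories' are assumed field names,
    not a documented schema.
    """
    data = json.loads(payload)
    return {k: data.get(k) for k in ("score", "categories")}

# Hypothetical response shaped like the scores shown on this page
# (47 / 100 overall; four sub-scores out of 25 each).
sample = json.dumps({
    "score": 47,
    "categories": {"maintenance": 6, "adoption": 9,
                   "maturity": 16, "community": 16},
})
print(parse_quality(sample))

# For a live request, uncomment (requires network access):
# live = json.loads(urlopen(URL).read())
```

The sub-scores in the sample sum to the overall score (6 + 9 + 16 + 16 = 47), matching the breakdown listed above; whether the real API reports them that way is an assumption.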