AnweshCR7/RhythmNet
End-to-end Heart Rate Estimation from Face via Spatial-Temporal Representation. A replication of the RhythmNet model.
This project estimates a person's heart rate by analyzing a video of their face, aimed at medical professionals, fitness coaches, and researchers. It takes a video as input and outputs an estimated heart rate in beats per minute (bpm), offering a non-contact way to monitor heart rate in settings such as telehealth, remote patient monitoring, and exercise analysis.
No commits in the last 6 months.
Use this if you need to quickly and non-invasively estimate heart rate from a video feed of a person's face.
Not ideal if you require clinical-grade accuracy for medical diagnosis, as this is a research-grade estimation.
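The core idea behind RhythmNet-style remote photoplethysmography (rPPG) is that the face video is reduced to a periodic signal whose dominant frequency corresponds to the heart rate. A minimal sketch of that last step, using a synthetic pulse signal in place of a real model output (`estimate_bpm` is a hypothetical helper, not part of this repository):

```python
import numpy as np

def estimate_bpm(signal, fps):
    """Estimate heart rate (bpm) as the dominant frequency of a pulse signal.

    Toy illustration only: real rPPG models like RhythmNet learn this mapping
    from spatial-temporal maps of the face rather than a clean 1-D signal.
    """
    signal = signal - np.mean(signal)              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))         # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible heart-rate band (0.7-4.0 Hz, i.e. 42-240 bpm)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic pulse: a 1.2 Hz sine (72 bpm) sampled at 30 fps for 10 seconds
t = np.arange(0, 10, 1 / 30)
pulse = np.sin(2 * np.pi * 1.2 * t)
print(estimate_bpm(pulse, fps=30))  # approximately 72 bpm
```

Band-limiting the spectrum is a common rPPG trick: it discards low-frequency lighting drift and high-frequency noise before picking the peak.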
Stars: 65
Forks: 14
Language: Python
License: MIT
Category:
Last pushed: Jul 06, 2023
Commits (30d): 0
Get this data via API:
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AnweshCR7/RhythmNet"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
Higher-rated alternatives:
remotebiosensing/rppg: Benchmark framework for fair evaluation of rPPG.
Mobile-Sensing-and-UbiComp-Laboratory/NormWear: A foundation model for multivariate wearable sensing of physiological signals.
MahdiFarvardin/MEDVSE: Official repository of "Efficient Deep Learning-based Estimation of the Vital Signs on Smartphones".
PhysiologicAILab/FactorizePhys: "FactorizePhys: Matrix Factorization for Multidimensional Attention in Remote Physiological..."
fr-meyer/MD-ViSCo: A unified model for multi-directional vital sign waveform conversion (IEEE JBHI 2026).