aneesahmedpro/rock-paper-scissor-vision

"Recognise Hand-Posture using Computer Vision." Make the machine watch a person's hand through a live camera feed and recognise the postures the hand makes (rock, paper, or scissor) in real time using a convolutional neural network.

Score: 28 / 100 (Experimental)

This tool helps developers integrate real-time hand gesture recognition into applications. It takes a live camera feed as input, identifies rock, paper, or scissor hand postures, and returns the recognized gesture as output. It suits developers building interactive experiences or simple gesture-controlled interfaces.
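Once a gesture has been recognized, an application still has to act on it. The sketch below shows one way downstream code might consume the classifier's output; the class-index ordering, label names, and winner logic are illustrative assumptions, not taken from the repository:

```python
# Illustrative sketch: consuming the classifier's output in a game loop.
# The class-index ordering and label names are assumptions, not the repo's.

LABELS = ("rock", "paper", "scissor")
# Each gesture beats exactly one other gesture.
BEATS = {"rock": "scissor", "paper": "rock", "scissor": "paper"}

def label_from_index(class_index: int) -> str:
    """Map a CNN output index (e.g. argmax of a softmax) to a gesture name."""
    return LABELS[class_index]

def judge(player: str, computer: str) -> str:
    """Decide one round of rock-paper-scissor between two recognized gestures."""
    if player == computer:
        return "draw"
    return "player" if BEATS[player] == computer else "computer"
```

For example, `judge(label_from_index(0), "scissor")` reports that the player's rock beats the computer's scissor.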

No commits in the last 6 months.

Use this if you are a developer building a Python application that needs to recognize rock, paper, or scissor hand gestures from a live camera.

Not ideal if you are a non-developer looking for an out-of-the-box application, or if you need to recognize a wider range of hand gestures beyond rock, paper, or scissor.

Tags: real-time gesture recognition, computer vision development, interactive application development, Python development
Badges: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 8
Forks: 1
Language: Jupyter Notebook
License: MIT
Last pushed: Jan 01, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aneesahmedpro/rock-paper-scissor-vision"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.