RiaanSadiq/Sign-Language-Detection

Sign Language Detection using CNN and Flask. This project uses Convolutional Neural Networks (CNNs) to detect and classify sign language gestures. It includes a Flask web application for real-time detection: simply use your webcam to translate gestures into text.
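To illustrate the final step such a pipeline performs, here is a minimal sketch of mapping a CNN's output probabilities to a gesture label. The label set and the probability vector are hypothetical illustrations, not the project's actual classes.

```python
# Hypothetical gesture labels -- the project's real class list may differ.
LABELS = ["hello", "thanks", "yes", "no"]

def decode_prediction(probabilities):
    """Return the label with the highest predicted probability."""
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    return LABELS[best]

# A CNN classifier typically emits one probability per class;
# the detected gesture is simply the argmax.
print(decode_prediction([0.05, 0.80, 0.10, 0.05]))  # prints "thanks"
```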

Score: 20 / 100 (Experimental)

This project helps bridge communication gaps by instantly translating sign language gestures into text using a webcam. Simply point your camera at a sign language user, and the application will display the corresponding words. It's designed for anyone who needs real-time interpretation of American Sign Language (ASL) gestures, such as educators, customer service professionals, or family members.

No commits in the last 6 months.

Use this if you need to understand sign language gestures in real-time and want a visual, text-based translation.

Not ideal if you need to translate complex conversations, detect highly nuanced or non-ASL gestures, or require translation without a webcam.

sign-language-translation communication-accessibility real-time-interpretation inclusive-education
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 4 / 25
Maturity: 8 / 25
Community: 8 / 25


Stars: 8
Forks: 1
Language: Jupyter Notebook
License: None
Last pushed: Jun 26, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/RiaanSadiq/Sign-Language-Detection"
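The endpoint returns the quality scores shown above. Below is a minimal sketch of summing the per-category scores from a response; the JSON field names are assumptions (the actual schema may differ), while the sample values are the per-category scores listed on this page.

```python
import json

# Hypothetical response shape -- field names are assumptions; the
# values are the per-category scores shown on this page (0 + 4 + 8 + 8).
sample_response = '{"maintenance": 0, "adoption": 4, "maturity": 8, "community": 8}'

scores = json.loads(sample_response)
total = sum(scores.values())
print(total)  # prints 20, matching the 20 / 100 overall score
```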

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.