jimmiemunyi/Sign-Language-App

Training a model to recognize Sign Language then running inference on the Webcam

33 / 100 (Emerging)

This project helps people communicate by translating American Sign Language (ASL) gestures into text in real-time. It takes live video input from a webcam and identifies the ASL signs being made, outputting the corresponding letters or words. This tool is for educators, interpreters, or individuals learning or interacting with ASL.

No commits in the last 6 months.

Use this if you need a real-time tool to interpret ASL gestures captured via webcam, for educational or communication purposes.

Not ideal if you require a comprehensive ASL translation system that understands complex sentences, facial expressions, or nuanced grammar beyond individual signs.

sign-language communication-aid ASL-learning gesture-recognition accessibility
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 10 / 25


Stars: 27
Forks: 3
Language: Jupyter Notebook
License: MIT
Last pushed: Nov 21, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/jimmiemunyi/Sign-Language-App"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
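The same endpoint can also be queried from Python. Below is a minimal standard-library sketch; the response is assumed to be JSON (the field names are not documented here, so `fetch_quality` simply returns the parsed payload), and only the URL construction mirrors the curl example above:

```python
import json
import urllib.request

# Base path of the quality API, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given project."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch a project's quality report (requires network access).

    Assumption: the endpoint returns a JSON object; its schema is
    not documented on this page, so the raw payload is returned.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same URL the curl example requests.
    print(quality_url("ml-frameworks", "jimmiemunyi", "Sign-Language-App"))
```

Without an API key this counts against the 100-requests/day shared limit, so results are worth caching locally.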