susilnem/American-sign-Language

A CNN-based human-computer interface for American Sign Language recognition for hearing-impaired individuals

Score: 39 / 100 (Emerging)

This project helps hearing-impaired individuals communicate by translating American Sign Language (ASL) gestures into written text or speech. You perform ASL signs in front of a webcam, and the system processes the video frames to output understandable text or spoken words. It is designed for anyone looking to bridge the communication gap between the deaf and hearing communities.
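The final step of such a pipeline, mapping a CNN's softmax output over the 26 ASL letter classes to a letter and a confidence value, can be sketched as below. The names, shapes, and label set are illustrative assumptions, not taken from this repository's code.

```python
# Hypothetical decoding step for a letter-level ASL classifier.
# Assumes the CNN outputs one probability per letter A-Z.
ASL_LABELS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def decode_prediction(probs, labels=ASL_LABELS):
    """Return (label, confidence) for the highest-scoring class."""
    idx = max(range(len(probs)), key=probs.__getitem__)
    return labels[idx], probs[idx]

# Dummy probability vector standing in for a model.predict(frame) call
probs = [0.01] * 26
probs[7] = 0.75  # index 7 corresponds to "H"
letter, confidence = decode_prediction(probs)
print(letter, confidence)  # → H 0.75
```

In a live system this function would run once per captured frame, with the decoded letters accumulated into text or passed to a text-to-speech engine.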

No commits in the last 6 months.

Use this if you need a way to translate live ASL gestures into text or speech for improved communication.

Not ideal if you need a system for translating full ASL sentences or complex grammatical structures beyond individual signs.

Tags: ASL, communication, deaf-hearing, accessibility, sign language, translation, assistive technology, language interpretation
Status: Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 17 / 25


Stars: 22
Forks: 10
Language: Python
License: Apache-2.0
Last pushed: Sep 14, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/voice-ai/susilnem/American-sign-Language"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
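The same endpoint shown in the curl command can be called from Python, the project's own language. A minimal standard-library sketch is below; the response shape is left unspecified since it is not documented here.

```python
# Build the per-repository quality endpoint URL for this API.
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Return the quality endpoint URL for a given repository."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("voice-ai", "susilnem", "American-sign-Language")
print(url)
# To fetch (network required):
#   import json, urllib.request
#   data = json.loads(urllib.request.urlopen(url).read())
```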