Sign-Language-Interpreter-using-Deep-Learning vs. Sign-Language-Translator

                  Sign-Language-Interpreter-    Sign-Language-
                  using-Deep-Learning           Translator
Maintenance       2/25                          0/25
Adoption          10/25                         9/25
Maturity          16/25                         16/25
Community         25/25                         19/25
Stars             740                           90
Forks             251                           21
Downloads
Commits (30d)     0                             0
Language          Python                        Python
License           MIT                           MIT
Flags             Stale 6m, No Package,         Stale 6m, No Package,
                  No Dependents                 No Dependents

About Sign-Language-Interpreter-using-Deep-Learning

harshbg/Sign-Language-Interpreter-using-Deep-Learning

A sign language interpreter using live video feed from the camera.

This project translates American Sign Language (ASL) gestures into text in real time. It takes a live video feed from a camera, identifies the hand signs, and outputs the corresponding letters or words, giving deaf users a personal, always-available translator for daily communication without the need for a human interpreter.

assistive-technology accessibility deaf-community sign-language daily-communication
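The pipeline described above (camera frames in, recognized letters out) can be sketched in a self-contained way. The repository trains a CNN on hand-gesture images; here a nearest-centroid classifier over flattened grayscale frames stands in for that trained model, and a simple debounce suppresses camera noise. All names, sizes, and data below are illustrative, not the project's actual API.

```python
import numpy as np

LETTERS = ["A", "B", "C"]  # hypothetical subset of the ASL alphabet

def make_templates(rng: np.random.Generator, size: int = 16) -> dict:
    """One synthetic 16x16 grayscale 'template' frame per letter."""
    return {letter: rng.random((size, size)) for letter in LETTERS}

def classify_frame(frame: np.ndarray, templates: dict) -> str:
    """Return the letter whose template is closest to the frame (L2 distance)."""
    flat = frame.ravel()
    return min(templates, key=lambda k: np.linalg.norm(templates[k].ravel() - flat))

def interpret_stream(frames, templates, min_run: int = 3) -> str:
    """Emit a letter only after it appears in `min_run` consecutive frames,
    a simple debounce so single noisy frames do not produce output."""
    out, run_letter, run_len = [], None, 0
    for frame in frames:
        letter = classify_frame(frame, templates)
        if letter == run_letter:
            run_len += 1
        else:
            run_letter, run_len = letter, 1
        if run_len == min_run:
            out.append(letter)
    return "".join(out)

rng = np.random.default_rng(0)
templates = make_templates(rng)
# Simulate a short "video": 4 noisy frames of 'B', then 4 noisy frames of 'A'.
frames = [templates["B"] + rng.normal(0, 0.05, (16, 16)) for _ in range(4)]
frames += [templates["A"] + rng.normal(0, 0.05, (16, 16)) for _ in range(4)]
print(interpret_stream(frames, templates))  # prints "BA"
```

In the real project the `classify_frame` step is a trained convolutional network and the frames come from `cv2.VideoCapture`, but the surrounding loop has the same shape: classify each frame, then stabilize predictions over time before emitting text.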

About Sign-Language-Translator

dgovor/Sign-Language-Translator

Neural Network that is able to translate any sign language into text.

This project lets you build a customized translation tool that turns sign language gestures into written text. You supply video recordings of your specific hand signs, the system learns to recognize them, and it then outputs real-time predictions of signed sentences, complete with grammar correction. It is aimed at anyone who needs to translate a particular sign language into text.

sign-language-interpretation assistive-communication human-computer-interaction custom-gesture-recognition
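The personalized workflow described above can be sketched as follows: the user records labelled examples of their own signs (represented here by stand-in feature vectors such as hand-landmark coordinates), a 1-nearest-neighbour model learns them, and per-sign predictions over a clip are joined into a sentence with a minimal grammar pass. The class name, methods, and data are all hypothetical stand-ins, not the repository's actual interface.

```python
import numpy as np

class SignTranslator:
    """Toy personalised sign-to-text model: stores user-provided examples
    and classifies new gestures by nearest neighbour."""

    def __init__(self):
        self.features, self.labels = [], []

    def add_sample(self, word: str, vector) -> None:
        """Record one labelled training example for a user-defined sign."""
        self.features.append(np.asarray(vector, dtype=float))
        self.labels.append(word)

    def predict_word(self, vector) -> str:
        """1-NN: return the label of the closest stored sample."""
        v = np.asarray(vector, dtype=float)
        dists = [np.linalg.norm(f - v) for f in self.features]
        return self.labels[int(np.argmin(dists))]

    def translate(self, vectors) -> str:
        """Predict each sign in a clip, then apply a toy grammar pass:
        capitalise the first word and end with a full stop."""
        words = [self.predict_word(v) for v in vectors]
        sentence = " ".join(words)
        return sentence[:1].upper() + sentence[1:] + "."

t = SignTranslator()
t.add_sample("i", [0.0, 0.0])
t.add_sample("need", [1.0, 0.0])
t.add_sample("help", [0.0, 1.0])
# A "clip" of three slightly noisy gesture vectors:
clip = [[0.1, -0.1], [0.9, 0.1], [0.1, 1.1]]
print(t.translate(clip))  # prints "I need help."
```

The real project trains a neural network rather than using 1-NN, but the workflow is the same: collect your own labelled recordings, fit a model, then map a sequence of recognized signs to a corrected sentence.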

Scores updated daily from GitHub, PyPI, and npm data.