Sign-Language-Interpreter-using-Deep-Learning and Sign-Language-Recognition

Sign-Language-Interpreter-using-Deep-Learning
  Maintenance: 2/25 · Adoption: 10/25 · Maturity: 16/25 · Community: 25/25
  Stars: 740 · Forks: 251 · Downloads: n/a · Commits (30d): 0
  Language: Python · License: MIT
  Status: Stale (6 months) · No Package · No Dependents

Sign-Language-Recognition
  Maintenance: 2/25 · Adoption: 6/25 · Maturity: 16/25 · Community: 16/25
  Stars: 15 · Forks: 7 · Downloads: n/a · Commits (30d): 0
  Language: Python · License: MIT
  Status: Stale (6 months) · No Package · No Dependents

About Sign-Language-Interpreter-using-Deep-Learning

harshbg/Sign-Language-Interpreter-using-Deep-Learning

A sign language interpreter using live video feed from the camera.

This project helps deaf individuals communicate more easily by translating American Sign Language (ASL) gestures into text in real time. It takes a live video feed from a camera, identifies the hand signs, and outputs the corresponding letters or words. The tool is aimed at deaf people who want a personal, always-available translator for daily communication, without needing a human interpreter.

assistive-technology accessibility deaf-community sign-language daily-communication
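The camera-to-text pipeline described above (capture a frame, normalize it, classify the hand sign, emit a letter) can be sketched roughly as below. This is a minimal illustration, not the repository's actual code: the label set, input size, and `classify` stub are assumptions, and a NumPy array stands in for a live `cv2.VideoCapture` frame so the sketch is self-contained.

```python
import numpy as np

# Hypothetical label set; the repo's real model and classes may differ.
LABELS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def preprocess_frame(frame: np.ndarray, size: int = 64) -> np.ndarray:
    """Turn a BGR camera frame into the small grayscale, normalized
    square input a gesture CNN would typically expect."""
    gray = frame.mean(axis=2)                       # naive grayscale
    h, w = gray.shape
    side = min(h, w)                                # center-crop to a square
    top, left = (h - side) // 2, (w - side) // 2
    crop = gray[top:top + side, left:left + side]
    ys = np.arange(size) * side // size             # nearest-neighbor resize
    xs = np.arange(size) * side // size
    small = crop[np.ix_(ys, xs)]
    return small.astype(np.float32) / 255.0         # scale to [0, 1]

def classify(inputs: np.ndarray) -> str:
    """Stand-in for the trained model: a real system would run a CNN
    forward pass over `inputs` and take the argmax over class scores."""
    scores = np.zeros(len(LABELS))
    scores[0] = 1.0                                 # pretend the model saw 'A'
    return LABELS[int(np.argmax(scores))]

# One simulated 480x640 BGR frame in place of a live webcam feed.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
x = preprocess_frame(frame)
letter = classify(x)                                # → "A" with this stub
```

In a real deployment the loop would read frames continuously, and the preprocessing would usually include hand segmentation (e.g. skin-color thresholding or background subtraction) before classification.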

About Sign-Language-Recognition

CodingSamrat/Sign-Language-Recognition

A machine learning model that classifies the hand gestures used for finger spelling in sign language.

This project helps bridge communication gaps by recognizing hand gestures in real-time, specifically finger spelling. You can input live video of hand signs, and the system outputs the corresponding text or speech. This tool is designed for anyone needing to interpret sign language or custom hand gestures.

sign-language-interpretation assistive-technology real-time-communication human-computer-interaction gesture-recognition
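Finger-spelling recognizers like this emit one letter prediction per video frame, so turning the stream into readable text requires collapsing repeated predictions and dropping glitches. The sketch below shows one common post-processing approach; it is an assumed decoding step, not code from the repository, and the `"_"` blank label and 3-frame persistence threshold are illustrative choices.

```python
from itertools import groupby

def decode_predictions(per_frame: list[str],
                       min_frames: int = 3,
                       blank: str = "_") -> str:
    """Collapse runs of identical per-frame letter predictions into text.
    A letter must persist for `min_frames` consecutive frames to count,
    which filters out single-frame misclassifications."""
    out = []
    for letter, run in groupby(per_frame):
        if letter != blank and sum(1 for _ in run) >= min_frames:
            out.append(letter)
    return "".join(out)

# Simulated stream spelling "HI": the lone "K" glitch is discarded.
frames = ["_"] * 5 + ["H"] * 8 + ["K"] + ["_"] * 4 + ["I"] * 6 + ["_"] * 3
text = decode_predictions(frames)   # → "HI"
```

One known limitation of this scheme: doubled letters ("LL") need a blank or hand reset between them to be counted twice, which is why the blank label matters.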

Scores updated daily from GitHub, PyPI, and npm data.