VQA-Team/Visual-Question-Answering

This project is an Android application that helps visually impaired users by letting them take a picture and ask questions about it; the application answers those questions using machine-learning techniques and tools.

Score: 27 / 100 (Experimental)

This Android application helps visually impaired individuals understand their surroundings better. Users can take a picture and verbally ask questions about its content. The app processes the image and question to provide spoken answers, acting as a personal visual assistant.

No commits in the last 6 months.

Use this if you are visually impaired and need a convenient way to get spoken descriptions and answers about objects and scenes in your immediate environment.

Not ideal if you need detailed, nuanced descriptions or if the environment involves highly complex or abstract visual information.

visual-assistance accessibility-tech sight-impairment daily-living-aids environmental-awareness
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 8 / 25
Community 15 / 25


Stars: 7
Forks: 4
Language: Jupyter Notebook
License: None
Last pushed: May 28, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/VQA-Team/Visual-Question-Answering"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
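For programmatic use, the curl command above can be wrapped in a small script. The sketch below builds the same endpoint URL and fetches the quality record with Python's standard library; the shape of the returned JSON is an assumption here, since the response fields are not documented on this page.

```python
# Sketch: fetch a repository's quality record from the pt-edge API.
# The endpoint path mirrors the curl example above; the structure of
# the JSON body is an assumption (only the URL format is documented).
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a given category and repository."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the quality record and parse the JSON response body."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("ml-frameworks", "VQA-Team", "Visual-Question-Answering")
    print(json.dumps(data, indent=2))
```

Unauthenticated calls are limited to 100 requests/day, so a batch job over many repositories would need the free API key mentioned above.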