abhi-patil20/HRI-AI
This project develops and showcases techniques and algorithms for effective human-robot interaction.
This project helps roboticists and automation engineers create robots that understand and respond to human input more naturally. It takes human gestures and voice commands as input, translating them into actions the robot can perform, and provides immediate visual or auditory feedback from the robot. This enables better collaboration between humans and robots in various operational settings.
No commits in the last 6 months.
Use this if you need to program robots to interpret human gestures and voice commands, allowing for more intuitive control and interaction.
Not ideal if your primary goal is robot path planning or low-level motor control rather than human-robot communication.
Stars: 7
Forks: 3
Language: Jupyter Notebook
License: GPL-2.0
Category:
Last pushed: Jun 23, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/abhi-patil20/HRI-AI"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
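The curl command above can also be wrapped in a few lines of Python. A minimal sketch, assuming only what the listing states (the endpoint URL pattern `owner/repo` and that it returns JSON; the response fields are not documented here, so the payload is parsed as generic JSON):

```python
import json
import urllib.request

# Base endpoint as shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given GitHub repo."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality data as a dict.

    Anonymous access is rate-limited to 100 requests/day per the
    listing; a key (header/parameter name not documented here) raises
    that to 1,000/day.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("abhi-patil20", "HRI-AI"))
```

The network call is kept in its own function so the URL construction can be reused or tested without hitting the rate limit.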
Higher-rated alternatives
SAP-samples/codejam-cap-llm
This repository contains the content for the CAP and Generative AI Hub CodeJam. It includes...
qxresearch/qxresearch-event-1
Python hands-on tutorial with 50+ Python applications (10 lines of code) by @xiaowuc2
diegopacheco/ai-playground
AI POCS: ML, NLP, LLM, Vision, Classification, clustering, GenAI, Transformers, PyTorch, Keras,...
simranjeet97/75DayHard_GenAI_LLM_Challenge
This repository contains my 75-Day Hard Generative AI and LLM Learning Challenge.
jedi4ever/learning-llms-and-genai-for-dev-sec-ops
A set of lessons aimed at anyone learning LLM and generative AI concepts, with sections on...