andyzeng/visual-pushing-grasping
Train robotic agents to learn to plan pushing and grasping actions for manipulation with deep reinforcement learning.
This project helps roboticists teach industrial robot arms to pick up objects efficiently, even when items are tightly packed or hard to reach. Given visual input (camera images of the scene), the agent learns through trial and error to combine pushing and grasping actions, clearing clutter so it can successfully retrieve items. It's aimed at engineers, researchers, and technicians working on robotic manipulation in manufacturing or logistics.
1,087 stars. No commits in the last 6 months.
Use this if you need to train a robot arm for robust pick-and-place tasks involving cluttered environments or objects that require strategic pushing before grasping.
Not ideal if your robot arm only performs simple, unobstructed grasping or if you require a pre-built, non-learning-based solution.
Stars: 1,087
Forks: 329
Language: Python
License: BSD-2-Clause
Category: ml-frameworks
Last pushed: May 11, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/andyzeng/visual-pushing-grasping"
Open to everyone: 100 requests/day with no key. Get a free key for 1,000 requests/day.
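The same endpoint can also be queried from Python using only the standard library. A minimal sketch; note that the header name used to pass an API key is an assumption, since the page doesn't document how the key is sent:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str, api_key: str = "") -> dict:
    """Fetch the quality record for a repository as a dict.

    Passing api_key enables the keyed 1,000/day tier; the "X-API-Key"
    header name is an assumption, not documented on this page.
    """
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key:
        req.add_header("X-API-Key", api_key)  # assumed header name
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Example (performs a live network request):
# data = fetch_quality("ml-frameworks", "andyzeng", "visual-pushing-grasping")
# print(json.dumps(data, indent=2))
```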
Related frameworks
skumra/robotic-grasping: Antipodal Robotic Grasping using GR-ConvNet (IROS 2020).
BerkeleyAutomation/gqcnn: Python module for GQ-CNN training and deployment with ROS integration.
google-research/ravens: Train robotic agents to learn pick and place with deep learning for vision-based manipulation in...
shadow-robot/smart_grasping_sandbox: A public sandbox for Shadow's Smart Grasping System.
huangwl18/geometry-dex: PyTorch Code for "Generalization in Dexterous Manipulation via Geometry-Aware Multi-Task Learning".