oakink/OakInk2
🌴[CVPR 2024] OakInk2: A Dataset of Bimanual Hands-Object Manipulation in Complex Task Completion
This project provides a comprehensive dataset for understanding how people use both hands to manipulate objects while completing complex tasks. It includes high-resolution images of bimanual interactions, along with detailed annotations of hand and object movements. This is ideal for researchers and engineers developing robots, virtual reality applications, or AI models that need to accurately perceive and interact with the physical world.
No commits in the last 6 months.
Use this if you are developing computer vision models, robotic systems, or haptic interfaces that require realistic data on human bimanual manipulation of objects in various task scenarios.
Not ideal if you need data on single-hand interactions or highly specialized industrial tasks, or if you don't need detailed 3D pose and object affordance annotations.
Stars: 92
Forks: 3
Language: Python
License: —
Category: —
Last pushed: Aug 11, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/oakink/OakInk2"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
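The curl call above can also be scripted. Below is a minimal Python sketch that builds the same endpoint URL and decodes the response; the URL path comes from the snippet above, but the JSON response schema is an assumption, so inspect the payload before relying on specific fields.

```python
# Sketch of querying the pt-edge quality API for a repo.
# Assumption: the endpoint returns JSON; its field names are not
# documented here, so we only decode and print the payload.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_url(category: str, owner: str, repo: str) -> str:
    """Construct the per-repo quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (100 requests/day without a key)."""
    with urllib.request.urlopen(build_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("computer-vision", "oakink", "OakInk2")
    print(json.dumps(data, indent=2))
```

With a free API key (1,000 requests/day), you would presumably attach it as a header or query parameter; the exact mechanism is not stated on this page.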
Higher-rated alternatives
DeepLabCut/DeepLabCut
Official implementation of DeepLabCut: Markerless pose estimation of user-defined features with...
openpifpaf/openpifpaf
Official implementation of "OpenPifPaf: Composite Fields for Semantic Keypoint Detection and...
lambdaloop/anipose
🐜🐀🐒🚶 A toolkit for robust markerless 3D pose estimation
DIYer22/bpycv
Computer vision utils for Blender (generate instance annotation, depth and 6D pose with one line of code)
NeLy-EPFL/DeepFly3D
Motion capture (markerless 3D pose estimation) pipeline and helper GUI for tethered Drosophila.