oakink/OakInk

[CVPR 2022] OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction

Score: 34 / 100 (Emerging)

This project offers a comprehensive repository of human hand and common object interactions, providing 3D models of objects, their parts, and attributes, alongside videos and 3D data of hands grasping and manipulating these objects. It helps researchers and engineers create more realistic and intelligent computer vision and robotics applications. The data includes detailed information about how humans naturally interact with everyday items, such as dynamic grasping and handover motions. It's intended for those working on tasks like generating human-like grasping poses or teaching robots to interact with objects more effectively.

134 stars. No commits in the last 6 months.

Use this if you are developing AI models or robotic systems that need to understand, predict, or generate realistic human-object interactions, especially for grasping, manipulation, and handover scenarios.

Not ideal if your primary focus is on general object recognition or scene understanding without a specific need for detailed hand-object interaction data.

Topics: robotics, computer-vision, human-computer-interaction, animation, product-design
Flags: Stale (6 months), no package published, no dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 8 / 25
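The four subscores listed above appear to sum to the overall score. A minimal sketch of that arithmetic (the sum-to-total rule is an assumption inferred from the numbers shown, not documented by the service):

```python
# Subscores as listed above, each out of 25.
# Assumption: the overall 0-100 score is the plain sum of the four subscores.
subscores = {"Maintenance": 0, "Adoption": 10, "Maturity": 16, "Community": 8}
total = sum(subscores.values())
print(total)  # 34 out of 100, matching the overall "Emerging" score shown above
```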


Stars: 134
Forks: 6
Language: Python
License: MIT
Last pushed: Dec 01, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/computer-vision/oakink/OakInk"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
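The same endpoint can be called from Python with only the standard library. A minimal sketch, assuming the URL path shape shown in the curl example above (the `quality_url` helper name is illustrative, not part of the service):

```python
def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the endpoint URL with the same path shape as the curl example above.
    base = "https://pt-edge.onrender.com/api/v1/quality"
    return f"{base}/{category}/{owner}/{repo}"

url = quality_url("computer-vision", "oakink", "OakInk")
print(url)

# To actually fetch (hits the live API, subject to the daily limits above):
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```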