TimSchneider42/tactile-mnist
The Tactile-MNIST Active Perception Benchmark
This project provides a benchmark and datasets for developing active tactile perception algorithms. Using simulated or real tactile sensor data, it helps researchers train robots or intelligent systems to classify and count objects, or to estimate properties such as pose and volume, through touch. It is aimed at robotics researchers and AI developers working on robotic manipulation and sensory perception.
Available on PyPI.
Use this if you are developing or evaluating algorithms for robots to understand objects by actively touching them, rather than just seeing them.
Not ideal if your focus is visual object recognition, or if you need physical hardware rather than a simulated environment for initial development.
Stars
13
Forks
1
Language
Python
License
MIT
Category
Last pushed
Feb 13, 2026
Commits (30d)
0
Dependencies
11
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TimSchneider42/tactile-mnist"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
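The curl command above can also be reproduced in Python. A minimal sketch using only the standard library, assuming the endpoint returns JSON (the response schema is not documented here, so no field names are assumed):

```python
import json
import urllib.request

# Base endpoint as shown in the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"


def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a given GitHub repository."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record; assumes a JSON response body."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Prints the same URL the curl example requests.
    print(quality_url("TimSchneider42", "tactile-mnist"))
```

Note the free tier allows 100 requests per day without a key, so cache responses rather than polling.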
Higher-rated alternatives
enactic/openarm
A fully open-source humanoid arm for physical AI research and deployment in contact-rich environments.
Shuijing725/awesome-robot-social-navigation
A curated list of robot social navigation.
thomashiemstra/fred
This is my 3D-printed robot arm project
jstmn/ikflow
Open-source implementation of the paper "IKFlow: Generating Diverse Inverse Kinematics Solutions"
sizhe-li/neural-jacobian-field
Controlling diverse robots by inferring Jacobian fields with deep networks! Let's make robots...