TimSchneider42/tactile-mnist

The Tactile-MNIST Active Perception Benchmark

Overall score: 46 / 100 (Emerging)
This project provides a benchmark and datasets for developing active tactile perception algorithms. Using simulated or real tactile sensor data, it helps researchers train robots or intelligent systems to classify objects, count features, and estimate properties such as position and volume through touch. It is aimed at robotics researchers and AI developers working on robotic manipulation and sensory perception.

Available on PyPI.

Use this if you are developing or evaluating algorithms for robots to understand objects by actively touching them, rather than just seeing them.

Not ideal if your focus is on visual object recognition or if you require physical hardware beyond a simulated environment for initial development.

Tags: robotics, tactile sensing, robot learning, object classification, active perception
Maintenance 10 / 25
Adoption 5 / 25
Maturity 25 / 25
Community 6 / 25
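The four category scores above add up exactly to the listed overall score of 46 / 100. The scoring formula is not published on this page, so treating the overall score as a plain sum of the four categories (each out of 25) is an assumption, but the arithmetic checks out:

```python
# Category scores as listed above. The simple-sum model is an
# assumption; the page does not document the scoring formula.
scores = {"Maintenance": 10, "Adoption": 5, "Maturity": 25, "Community": 6}

overall = sum(scores.values())
print(overall)  # 46, matching the listed 46 / 100
```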


Stars: 13
Forks: 1
Language: Python
License: MIT
Last pushed: Feb 13, 2026
Commits (30d): 0
Dependencies: 11

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TimSchneider42/tactile-mnist"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
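The same endpoint can be called programmatically with the Python standard library. A minimal sketch, assuming only the URL shown above; the JSON field names in the response are not documented here, so the result is returned as a plain dict:

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"


def quality_url(repo: str) -> str:
    """Build the quality-API URL for an 'owner/name' repo slug."""
    return f"{BASE}/{repo}"


def fetch_quality(repo: str) -> dict:
    # Unauthenticated calls are limited to 100 requests/day.
    with urlopen(quality_url(repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Print the URL only; call fetch_quality(...) to hit the live API.
    print(quality_url("TimSchneider42/tactile-mnist"))
```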