Publication
Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation
N. Funk; E. Helmut; G. Chalvatzaki; R. Calandra; J. Peters
In: IEEE Transactions on Robotics (T-RO), Vol. 40, Pages 3812-3832, Institute of Electrical and Electronics Engineers (IEEE), 2024.
Abstract
Optical tactile sensors have recently become popular. They provide high spatial resolution but struggle to offer fine temporal resolution. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new event-based optical tactile sensor called Evetac. Along with the hardware design, we develop touch processing algorithms to process its measurements online at 1000 Hz. We devise an efficient algorithm to track the elastomer’s deformation through the imprinted markers despite the sensor’s sparse output. Benchmarking experiments demonstrate Evetac’s capabilities of sensing vibrations up to 498 Hz, reconstructing shear forces, and significantly reducing data rates compared to RGB optical tactile sensors. Moreover, Evetac’s output and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics. The sensor design and additional materials are open-sourced at https://sites.google.com/view/evetac.
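The abstract only names the marker-tracking idea; the paper itself gives the actual algorithm. Purely as an illustration of tracking imprinted markers from sparse event-camera output in fixed processing windows (e.g. 1 ms for 1000 Hz updates), here is a minimal Python sketch. It is not the authors' method: the function name update_markers and the parameters radius and alpha are hypothetical, and the nearest-marker moving-average update is just one plausible way to realize such a tracker.

    import numpy as np

    def update_markers(markers, events, radius=5.0, alpha=0.05):
        """Nudge each tracked marker toward nearby events.

        markers: (M, 2) float array of current marker (x, y) estimates.
        events:  (N, 2) array of event pixel coordinates accumulated over
                 one processing window (e.g. 1 ms for 1000 Hz output).
        Events farther than `radius` pixels from every marker are ignored.
        """
        for ev in events:
            d = np.linalg.norm(markers - ev, axis=1)  # distance to each marker
            i = np.argmin(d)                          # index of nearest marker
            if d[i] < radius:
                # exponential moving average toward the event location
                markers[i] = (1.0 - alpha) * markers[i] + alpha * ev
        return markers

In such a sketch, the per-window marker displacements relative to their rest positions could serve as input features for the kind of data-driven slip detection and prediction models mentioned in the abstract; again, this is only an illustration of the general pipeline, not the paper's implementation.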