

Autonomous Learning of Page Flipping Movements via Tactile Feedback

Yi Zheng; Filipe Veiga; Jan Peters; Veronica J. Santos
In: IEEE Transactions on Robotics (T-RO), Vol. 38, No. 5, Pages 2734-2749, IEEE, 2022.


Robotic manipulation is challenging when both the objects being manipulated and the tactile sensors are deformable. In this article, we addressed the interplay between the manipulation of deformable objects, tactile sensing, and model-free reinforcement learning on a real robot. We showed how a real robot can learn to manipulate a deformable, thin-shell object via feedback from deformable, multimodal tactile sensors. We addressed the learning of a page flipping task using a two-stage approach. In the first stage, we learned nominal page flipping trajectories for two page sizes by constructing a reward function that quantifies functional task performance from the perspective of tactile sensing. In the second stage, we learned adapted trajectories using tactile-driven perceptual coupling, under the intuitive assumption that, while the page flipping trajectories for different task contexts (page sizes) might differ, similar tactile feedback should be expected from functional trajectories in each context. We also investigated the quality of information encoded by two different representations of tactile sensing data: one based on the artificial apical tuft of bio-inspired tactile sensors, and another based on principal component analysis eigenvalues. The results and effectiveness of our learning framework were demonstrated on a real seven-degree-of-freedom robot arm and gripper outfitted with tactile sensors.
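As a rough illustration of the second tactile representation mentioned above, a PCA-eigenvalue feature can be computed from a time window of multichannel tactile readings: the eigenvalues of the channel covariance matrix summarize how contact-induced deformation is distributed across the sensor. The sketch below is a generic, hypothetical implementation (the window length, channel count, and function name are illustrative assumptions, not the authors' actual pipeline):

```python
import numpy as np

def pca_eigenvalue_features(tactile_window: np.ndarray, k: int = 3) -> np.ndarray:
    """Summarize a window of multichannel tactile data by the top-k
    eigenvalues of its channel covariance matrix (a PCA-based feature).

    tactile_window: shape (T, C) -- T time samples, C tactile channels.
    Returns the k largest covariance eigenvalues, sorted in descending order.

    NOTE: a generic sketch; the paper's actual feature extraction may differ.
    """
    centered = tactile_window - tactile_window.mean(axis=0)
    cov = centered.T @ centered / max(len(tactile_window) - 1, 1)
    eigvals = np.linalg.eigvalsh(cov)  # ascending, real (symmetric matrix)
    return eigvals[::-1][:k]           # keep the k largest, descending

# Example: a window of 100 samples from 19 hypothetical tactile channels
rng = np.random.default_rng(0)
features = pca_eigenvalue_features(rng.normal(size=(100, 19)))
```

Such low-dimensional features are a common way to feed high-dimensional tactile streams into a model-free reinforcement learning loop without hand-designing per-channel heuristics.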
