Développement d’une peau artificielle pour l’apprentissage d’interactions physiques et sociales sur un robot humanoïde
Ganna Pugach. Développement d’une peau artificielle pour l’apprentissage d’interactions physiques et sociales sur un robot humanoïde. Robotique [cs.RO]. Université de Cergy Pontoise, 2017. Français
Abstract: Touch is considered one of the crucial senses to recreate in a robot so that it can behave more flexibly and agilely, for instance when grasping an object or when touching and being touched by a person. Although modern touch sensors are still very limited compared to human skin, the development of new skin-like sensors, combined with vision and proprioception, could multiply a robot's capacity to interact directly and safely with a person and to share his or her physical and social environment.

Unlike human skin, the main touch sensors used in modern robotics can only detect pressure and weight variations over small patches of surface. Moreover, they are often quite stiff and lack the elastic deformation capacity intrinsic to human skin. The purpose of this thesis is to develop a touch interface close to an "artificial skin" in terms of the area covered (which can reach several square decimeters), the localization of contact points, and the sensitivity to contact forces (a few tens of millinewtons). Two main aspects have been developed: (i) an engineering aspect, the development of an artificial-skin prototype designed to give a humanoid robot tactile perception, and (ii) a cognitive aspect, the integration of multiple sensory feedbacks (tactile, visual, proprioceptive) to conceive a robot that can physically interact with people.

The tactile prototype developed is based on reconstructing the electric field on the surface of a conductive material, following the principle of Electrical Impedance Tomography (EIT). Our main innovation was to apply neural-network learning techniques to reconstruct this information without the analytical inverse-matrix techniques, which require time-consuming computation.
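The learned-inverse idea can be sketched in a few lines: instead of inverting the EIT forward model analytically, a small neural network is trained to map boundary voltage measurements directly to a contact-pressure map, so that run-time reconstruction is a single forward pass. Everything below (the electrode and grid dimensions, the random linear forward model standing in for real EIT physics, the one-hidden-layer network) is an illustrative assumption, not the prototype's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the thesis's values):
# 16 boundary electrodes -> 208 voltage measurements per frame,
# reconstructed onto an 8x8 grid of contact-pressure values.
N_MEAS, N_PIX = 208, 64

# Synthetic data standing in for EIT measurements: a random linear
# "forward model" A maps a sparse contact map to boundary voltages.
A = rng.normal(size=(N_MEAS, N_PIX))

def make_batch(n):
    press = np.maximum(rng.normal(size=(n, N_PIX)) - 1.5, 0.0)  # sparse touches
    volts = press @ A.T + 0.01 * rng.normal(size=(n, N_MEAS))   # noisy forward pass
    return volts, press

# One-hidden-layer MLP, voltages -> contact map: the learned inverse.
W1 = rng.normal(size=(N_MEAS, 128)) * 0.05; b1 = np.zeros(128)
W2 = rng.normal(size=(128, N_PIX)) * 0.01;  b2 = np.zeros(N_PIX)

lr = 0.01
for step in range(500):
    x, y = make_batch(64)
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                                  # squared-error gradient
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)                  # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# At run time, reconstruction is one forward pass -- no matrix inversion.
x_test, y_test = make_batch(256)
recon = np.tanh(x_test @ W1 + b1) @ W2 + b2
print("reconstruction MSE:", float(np.mean((recon - y_test) ** 2)))
```

The point of the sketch is the cost structure: the expensive part (training) happens offline, while each tactile frame costs only two matrix products, which is what makes the approach attractive against iterative inverse-problem solvers.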
Moreover, we show that using artificial neural networks yields a much more biomimetic system, essential for understanding human touch perception.

We then addressed the integration of tactile and motor information. After covering a manipulator arm with the artificial skin, we had a neural network learn the arm's body schema, enabling it to adjust its compliance using tactile feedback. Motor control is based on admittance control of the robot arm. Experiments show that neural networks can govern an adaptive interaction between the robot arm and a human being by estimating the perceived torque from the position at which the touch force was applied during the learning phase.

Finally, we turned our attention to body representation at the neuronal level, namely how human beings perceive their own body through all their senses (visual, tactile, and proprioceptive). We proposed a biologically inspired model of the parietal cortex based on the integration of multiple sensory feedbacks from the robot's body (its arm) and on the synchronization of visual and proprioceptive feedback. Our results show the capacity to perceive the body image, with the emergence of neurons that encode spatial visuo-tactile information about the arm movement, centered either on the robotic arm or on the object.
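The admittance scheme mentioned above maps a sensed external force into a compliant motion reference: the arm yields when touched and returns to its setpoint when released. A minimal one-degree-of-freedom sketch follows; the virtual inertia, damping, and stiffness values are illustrative assumptions, not the thesis's gains.

```python
def admittance_step(x, v, f_ext, x_ref, dt, M=0.5, D=10.0, K=50.0):
    """One Euler step of the admittance law  M*a + D*v + K*(x - x_ref) = f_ext:
    a sensed external force f_ext is turned into a position command for the
    low-level controller, yielding compliant behavior."""
    a = (f_ext - D * v - K * (x - x_ref)) / M
    v = v + a * dt
    x = x + v * dt
    return x, v

# Simulate one joint: a constant 5 N push for 1 s, then release.
x, v, x_ref, dt = 0.0, 0.0, 0.0, 0.002
traj = []
for k in range(2000):                        # 4 s total
    f_ext = 5.0 if k * dt < 1.0 else 0.0     # force estimate from the skin
    x, v = admittance_step(x, v, f_ext, x_ref, dt)
    traj.append(x)

print("deflection under push:", round(traj[499], 3))   # ~ f/K = 0.1
print("after release:", round(traj[-1], 3))            # ~ back to x_ref = 0
```

In the thesis's setting, the external torque is not read from a wrist sensor but estimated by the network from where on the skin the contact occurred, which is what the learned body schema provides.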
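One classic way to model the parietal-style multisensory neurons described above is gain-field coding, where a neuron's visual tuning is multiplicatively modulated by proprioceptive input. The sketch below is a generic illustration of that idea, not the model proposed in the thesis; tuning centers and widths are arbitrary assumptions.

```python
import numpy as np

def gaussian_tuning(x, centers, sigma):
    # Population of Gaussian-tuned units over one sensory variable.
    return np.exp(-((x - centers) ** 2) / (2 * sigma ** 2))

vis_centers = np.linspace(-1.0, 1.0, 11)     # preferred visual hand positions
prop_centers = np.linspace(0.0, np.pi, 11)   # preferred joint angles

def gain_field_response(vis_x, joint_angle):
    """11x11 map of units, each tuned to one (proprioceptive, visual) pair;
    the multiplicative combination yields arm-centered selectivity."""
    v = gaussian_tuning(vis_x, vis_centers, 0.2)
    p = gaussian_tuning(joint_angle, prop_centers, 0.3)
    return np.outer(p, v)                    # gain modulation: product of tunings

resp = gain_field_response(vis_x=0.2, joint_angle=np.pi / 2)
i, j = np.unravel_index(resp.argmax(), resp.shape)
print("most active unit prefers angle %.2f rad, visual position %.2f"
      % (prop_centers[i], vis_centers[j]))
```

The peak of the map shifts coherently when either modality changes, which is the behavior the abstract reports for the emergent arm-centered and object-centered neurons.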