Article ID: 413349
Journal: Robotics and Autonomous Systems
Published Year: 2015
Pages: 10
File Type: PDF
Abstract

•Efficient coding principle is used as a criterion for learning smooth pursuit eye movements.
•A multi-scale approach allows the perception of a large range of motions (sketched below).
•The model is fully self-calibrating and autonomously recovers from perturbations in the perception–action link.
•Experiments in simulation and on the iCub robot demonstrate the approach.
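The first two highlights, efficient coding as the learning criterion and the multi-scale encoding, can be illustrated with a short sketch. The Python snippet below is not the authors' code: the patch sizes, dictionary sizes, and the greedy matching-pursuit coder are illustrative assumptions. It extracts a fine central patch and a coarse subsampled patch from two consecutive frames, codes each spatio-temporal patch against a random dictionary standing in for the learned basis functions, and reports the reconstruction error that serves here as the inverse measure of coding efficiency.

```python
# Minimal sketch (not the paper's code): encode spatio-temporal patches at two
# spatial resolutions and measure coding efficiency as reconstruction error
# under a sparse code. Patch sizes, dictionary sizes and the matching-pursuit
# routine are illustrative assumptions.
import numpy as np

def extract_two_scales(frame_prev, frame_curr, fine=8, coarse_factor=4):
    """Return a fine-scale (central crop) and a coarse-scale (subsampled)
    spatio-temporal patch, each a concatenation of the two frames."""
    h, w = frame_curr.shape
    cy, cx = h // 2, w // 2
    f = fine // 2
    fine_patch = np.concatenate([
        frame_prev[cy - f:cy + f, cx - f:cx + f].ravel(),
        frame_curr[cy - f:cy + f, cx - f:cx + f].ravel(),
    ])
    coarse_patch = np.concatenate([
        frame_prev[::coarse_factor, ::coarse_factor].ravel(),
        frame_curr[::coarse_factor, ::coarse_factor].ravel(),
    ])
    return fine_patch, coarse_patch

def matching_pursuit(x, D, n_active=10):
    """Greedy sparse code: select the n_active best-matching basis functions."""
    residual = x.astype(float).copy()
    code = np.zeros(D.shape[1])
    for _ in range(n_active):
        proj = D.T @ residual
        k = np.argmax(np.abs(proj))
        code[k] += proj[k]
        residual -= proj[k] * D[:, k]
    return code, residual

# Random unit-norm dictionaries stand in for the learned basis functions.
rng = np.random.default_rng(0)
frame_prev, frame_curr = rng.random((2, 64, 64))
fine_patch, coarse_patch = extract_two_scales(frame_prev, frame_curr)
for patch, name in ((fine_patch, "fine"), (coarse_patch, "coarse")):
    D = rng.standard_normal((patch.size, 100))
    D /= np.linalg.norm(D, axis=0)
    _, residual = matching_pursuit(patch, D)
    print(f"{name}-scale reconstruction error:", np.linalg.norm(residual))
```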

This paper presents a model for the autonomous learning of smooth pursuit eye movements based on an efficient coding criterion for active perception. The model accounts for the joint development of visual encoding and eye control. Sparse coding models encode the incoming data at two different spatial resolutions and capture the statistics of the input in spatio-temporal basis functions. A reinforcement learner controls eye velocity so as to maximize a reward signal based on the efficiency of the encoding. We consider the embodiment of the approach in the iCub simulator and on the real robot. Motion perception and smooth pursuit control are not explicitly expressed as tasks for the robot to achieve but emerge as the result of the system’s active attempt to efficiently encode its sensory inputs. Experiments demonstrate that the proposed approach is self-calibrating and robust to strong perturbations of the perception–action link.
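The closed perception–action loop described above can be pictured with a toy sketch. The snippet below is not the paper's implementation: a simple epsilon-greedy bandit over discrete velocities stands in for the reinforcement learner, and grab_frames() and encode() are hypothetical placeholders for the eye/camera interface and the two-scale sparse coder. Its only purpose is to show the reward structure stated in the abstract: the reward is the efficiency of the encoding (here, negative reconstruction error), so the velocity that best tracks the stimulus is preferred because it makes the retinal input easier to encode.

```python
# Hedged sketch of the perception-action loop: an eye-velocity policy is
# reinforced by the efficiency of the sparse encoding. The epsilon-greedy
# bandit below is a stand-in for the paper's reinforcement learner, and
# grab_frames() / encode() are hypothetical placeholders for the iCub
# camera/eye interface and the two-scale sparse coder.
import numpy as np

rng = np.random.default_rng(1)
eye_velocities = np.linspace(-1.0, 1.0, 9)   # candidate pursuit commands (illustrative units)
q_values = np.zeros(len(eye_velocities))     # running estimate of expected reward per action
counts = np.zeros(len(eye_velocities))
epsilon = 0.1

def grab_frames(velocity):
    """Placeholder: move the eye at `velocity` and return two consecutive frames."""
    target_velocity = 0.5                    # hidden stimulus motion (simulation only)
    blur = abs(velocity - target_velocity)   # pursuit error smears the retinal input
    return rng.random((2, 16, 16)) * (1.0 + blur)

def encode(frames):
    """Placeholder for the two-scale sparse coder; returns a reconstruction error."""
    return 0.01 * np.linalg.norm(frames.ravel())

for step in range(500):
    if rng.random() < epsilon:
        a = int(rng.integers(len(eye_velocities)))
    else:
        a = int(np.argmax(q_values))
    frames = grab_frames(eye_velocities[a])
    reward = -encode(frames)                 # efficient coding criterion as reward
    counts[a] += 1
    q_values[a] += (reward - q_values[a]) / counts[a]

print("preferred eye velocity:", eye_velocities[int(np.argmax(q_values))])
```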

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence