Moving the head and eye position to gaze at a moving target allows the iCub to keep focus on an object that might otherwise leave the field of view. The sparse, low-latency signal of the event camera can be exploited to achieve high-frequency (> 1 kHz) tracking without a GPU. The low-latency signal tightens the control loop, resulting in more responsive and accurate robot motion. A particle-filter tracking method dynamically estimates the temporal parameters needed to cope with variation in both the speed of the target (a ball) and the robot's own movement. A tailored latency-control feedback mechanism ensures real-time operation despite variation in the camera's event rate.
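The particle-filter update described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the event-association radius, noise scales, and resampling rule are assumptions, and the adaptive temporal parameters are reduced here to a fixed `motion_std`.

```python
import numpy as np

def particle_filter_step(particles, weights, events, motion_std, rng):
    """One predict/update/resample cycle of a minimal 2-D particle filter.

    particles : (N, 2) candidate target positions in pixels
    weights   : (N,) normalised particle weights
    events    : (M, 2) event coordinates from the camera
    motion_std: process-noise scale; in the actual system this is
                adapted on-line to target and robot speed (fixed here)
    """
    # Predict: diffuse particles according to the motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)

    # Update: score each particle by how well nearby events support it
    # (Gaussian kernel over particle-to-event distances; sigma assumed).
    d = np.linalg.norm(particles[:, None, :] - events[None, :, :], axis=2)
    likelihood = np.exp(-(d ** 2) / (2.0 * 5.0 ** 2)).sum(axis=1)
    weights = weights * (likelihood + 1e-12)
    weights = weights / weights.sum()

    # Resample when the effective sample size collapses below N/2.
    if 1.0 / np.sum(weights ** 2) < len(weights) / 2:
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights
```

The target estimate at any instant is the weighted mean of the particles, `(particles * weights[:, None]).sum(axis=0)`, which can be queried at whatever rate the gaze controller requires.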
Successful tracking was achieved at over 1 kHz, improving on the state of the art by enabling tracking across a wide range of velocities. During periods of slow or little motion, the required computation is inherently reduced, as the camera produces less data to process. Tracking was maintained even with little data through on-line adaptation of the dynamic parameters.
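The latency-control idea can be illustrated with a simple sketch. This is a hypothetical simplification, not the controlled-delay framework itself: when the backlog of unprocessed events represents more than the target delay's worth of data, the oldest events are discarded so the tracker always operates on recent input.

```python
def controlled_delay_packet(queue, event_rate, target_delay):
    """Return the next packet of events to process, bounding latency.

    queue        : list of pending events, oldest first
    event_rate   : measured camera event rate (events/s)
    target_delay : maximum tolerated processing delay (s)

    If the backlog exceeds event_rate * target_delay events (i.e. more
    than target_delay seconds of data), the oldest events are dropped.
    """
    max_backlog = int(event_rate * target_delay)
    if len(queue) > max_backlog:
        # Discard the oldest events so the packet stays recent.
        del queue[:len(queue) - max_backlog]
    packet = list(queue)
    queue.clear()
    return packet
```

For example, with a backlog of 1000 events, a measured rate of 10 000 events/s, and a 50 ms delay target, only the most recent 500 events are handed to the tracker; at low event rates the whole backlog fits within the budget and nothing is dropped.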
Glover A. and Bartolozzi C., "Robust Visual Tracking with a Freely-Moving Event Camera," 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3769-3776.
Glover A. and Bartolozzi C., "Event-Driven Ball Detection and Gaze Fixation in Clutter," 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2203-2208.
Glover A., Vasco V., and Bartolozzi C., "A Controlled-Delay Event Camera Framework for On-Line Robotics," 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 2178-2183.