Moving the head and eye position to gaze at a moving target allows the iCub to keep focus on an object that may otherwise leave the field of view. The sparse, low-latency sensor signal of the event camera can be exploited to achieve high-frequency (> 1 kHz) tracking without a GPU. The low-latency signal tightens the control loop, resulting in more responsive and accurate robot motion. A particle-filter tracking method dynamically estimates the temporal parameters needed to handle variation in both the speed of the target (a ball) and the movement of the robot. A tailored latency-control feedback mechanism ensures real-time operation despite variation in the camera event rate.
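The paper's filter runs on event streams at kHz rates; as a simplified, hypothetical illustration of the idea only (not the authors' implementation), the following 1-D bootstrap particle filter adapts its process-noise parameter on-line from the estimated target speed. The function name, the adaptation rule, and all noise values here are assumptions chosen for the sketch:

```python
import bisect
import math
import random

def track(observations, n_particles=200, obs_noise=1.0, seed=0):
    """1-D bootstrap particle filter with on-line adaptation of the
    process-noise parameter (hypothetical adaptation rule)."""
    rng = random.Random(seed)
    particles = [rng.gauss(observations[0], obs_noise) for _ in range(n_particles)]
    process_noise = 1.0            # dynamic parameter, adapted every step
    prev_est = observations[0]
    estimates = []
    for z in observations:
        # Predict: diffuse particles according to the current motion spread.
        particles = [p + rng.gauss(0.0, process_noise) for p in particles]
        # Update: weight each particle by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((p - z) / obs_noise) ** 2) for p in particles]
        total = sum(weights) or 1e-300
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle set.
        est = sum(p * w for p, w in zip(particles, weights))
        estimates.append(est)
        # Adapt: widen the motion spread when the target appears to move fast.
        process_noise = max(0.2, 1.5 * abs(est - prev_est))
        prev_est = est
        # Resample: systematic resampling to avoid weight degeneracy.
        cum = []
        acc = 0.0
        for w in weights:
            acc += w
            cum.append(acc)
        u0 = rng.random() / n_particles
        particles = [particles[min(bisect.bisect_left(cum, u0 + k / n_particles),
                                   n_particles - 1)]
                     for k in range(n_particles)]
    return estimates

# Usage: follow a constant-velocity target through noisy position readings.
true_path = [0.1 * t for t in range(100)]
noise_rng = random.Random(1)
obs = [x + noise_rng.gauss(0.0, 1.0) for x in true_path]
est = track(obs)
```

Because the process noise grows with the apparent target speed, the same filter can follow both slow and fast motion without hand-tuning, which mirrors the role of the dynamically estimated temporal parameters described above.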
Successful tracking was achieved at over 1 kHz, improving on the state of the art by enabling tracking over a wide range of velocities. During periods of slow or little motion, the required computation is inherently reduced, as the camera produces less data to process. Tracking was maintained even with little data through on-line adaptation of the dynamic parameters.
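One way the latency-control feedback could work, sketched purely as an assumption rather than the paper's actual controller: each control cycle drains at most a bounded batch of events, and the bound is grown or shrunk so that the measured cycle time tracks the 1 kHz budget even as the event rate varies. The names and gain factors below are illustrative:

```python
import time
from collections import deque

def latency_controlled_loop(event_queue, process_events, target_period=0.001):
    """Drain events in bounded batches, adapting the batch size so each
    cycle fits the real-time budget (hypothetical feedback rule)."""
    batch_limit = 100              # events processed per cycle, adapted below
    while event_queue:
        start = time.perf_counter()
        n = min(batch_limit, len(event_queue))
        batch = [event_queue.popleft() for _ in range(n)]
        process_events(batch)
        elapsed = time.perf_counter() - start
        # Feedback: shrink the batch when the cycle overran, grow it otherwise.
        if elapsed > target_period:
            batch_limit = max(10, int(batch_limit * 0.8))
        else:
            batch_limit = min(10000, int(batch_limit * 1.2) + 1)

# Usage: drain a synthetic event stream with a trivial processing callback.
queue = deque(range(5000))
seen = []
latency_controlled_loop(queue, seen.extend)
```

This ties in with the workload reduction noted above: when the camera emits few events, cycles finish early and the controller simply has less to do, while bursts of events are bounded per cycle instead of stalling the loop.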