Ball Tracking and Gaze Following
Moving the head and eyes to gaze at a moving target allows the iCub to maintain focus on an object that might otherwise leave the field of view. The sparse, low-latency signal of the event camera can be exploited to achieve high-frequency (> 1 kHz) tracking without a GPU. The low-latency signal tightens the control loop, resulting in more responsive and accurate robot motion. A particle-filter tracking method dynamically estimates the temporal parameters needed to handle variation in both the speed of the target (a ball) and the robot's own movement. A tailored latency-control feedback mechanism ensures real-time operation despite variation in the camera event rate.
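To make the latency-control idea concrete, here is a minimal sketch (not the authors' implementation; the class name and gains are illustrative) of a feedback loop that adapts how many events are consumed per update, so the tracker holds a target loop period even as the camera event rate varies:

```python
# Hypothetical latency-control feedback: adapt the per-update event batch
# size so the processing loop stays near a target period (~1 kHz here).
class LatencyController:
    def __init__(self, target_period_s=0.001, batch_size=500):
        self.target = target_period_s   # desired loop period (1 ms = 1 kHz)
        self.batch = batch_size         # events processed per update

    def update(self, elapsed_s):
        # Proportional feedback: shrink the batch when the last loop ran
        # slow, grow it when there is headroom, keeping latency bounded.
        error = self.target - elapsed_s
        self.batch = max(50, int(self.batch * (1.0 + 0.5 * error / self.target)))
        return self.batch
```

In use, `elapsed_s` would come from timing the previous iteration (e.g. with `time.perf_counter()`); when the event rate spikes and the loop slows down, the batch size drops so the update rate recovers.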
Successful tracking was achieved at over 1 kHz, improving on the previous state of the art by enabling tracking over a wide range of velocities. During periods of slow or little motion, the required compute is inherently reduced, as the camera produces less data to process. Tracking was maintained even with little data through the on-line adaptation of the dynamic parameters.
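The particle-filter machinery behind this can be sketched as follows. This is a generic 2D particle filter over event coordinates, with Gaussian motion noise and an event-count likelihood; the class, parameters, and likelihood model are illustrative assumptions, not the project's actual code:

```python
# Minimal 2D particle filter for tracking a ball from event-camera data.
# The motion-noise scale (sigma) is the kind of temporal/dynamic parameter
# that the project adapts on-line to the target speed.
import numpy as np

class ParticleTracker:
    def __init__(self, n=1000, img_size=(240, 304), radius=10.0, seed=0):
        self.rng = np.random.default_rng(seed)
        h, w = img_size
        # State per particle: (x, y) centre of the ball, initialised uniformly.
        self.particles = self.rng.uniform([0, 0], [w, h], size=(n, 2))
        self.weights = np.full(n, 1.0 / n)
        self.radius = radius

    def predict(self, sigma):
        # Diffuse particles; sigma can be adapted on-line to target speed.
        self.particles += self.rng.normal(0.0, sigma, self.particles.shape)

    def update(self, events):
        # events: (m, 2) array of (x, y) pixel coordinates.
        # Likelihood: fraction of events within `radius` of each particle.
        d = np.linalg.norm(self.particles[:, None, :] - events[None, :, :], axis=2)
        score = (d < self.radius).mean(axis=1) + 1e-9
        self.weights = score / score.sum()

    def resample(self):
        idx = self.rng.choice(len(self.weights), len(self.weights), p=self.weights)
        self.particles = self.particles[idx]
        self.weights[:] = 1.0 / len(self.weights)

    def estimate(self):
        # Weighted mean of the particle cloud = current ball position estimate.
        return self.weights @ self.particles
```

Because the update cost scales with the number of events, the filter naturally does less work when the scene is quiet, which matches the behaviour described above.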
Methods: particle filter
People: Arren Glover
Reaching objects with very precise timing is still a challenging problem for robots, due to both perception and control latencies. Our project aims to enable the iCub robot to interact smoothly with fast-moving objects in dynamic environments, relying only on its vision sensors.
To do so, we use the air hockey task as a test bench: its constrained 2D environment is ideal for testing high-speed motion planning, is simple to realise in a normal-size laboratory, and is characterised by high uncertainty, highly variable trajectories, and the presence of disturbances.
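One reason the 2D constraint helps is that a puck's path under constant velocity with mirror-like wall bounces has a closed form. The function below is an illustrative sketch under those assumptions (name and signature are hypothetical, not from the project):

```python
# Illustrative: predict where a puck crosses a defence line on a 2D air
# hockey table, assuming constant velocity and mirror-like wall bounces
# at y = 0 and y = width ("unfolding" the reflections).
def predict_intercept_y(x, y, vx, vy, x_goal, width):
    """Return the y coordinate at which the puck reaches x = x_goal."""
    if vx == 0:
        raise ValueError("puck is not moving along x")
    t = (x_goal - x) / vx
    if t < 0:
        raise ValueError("puck is moving away from the goal line")
    # Straight-line position in the unfolded (reflection-free) plane.
    y_unfolded = y + vy * t
    # Fold back into [0, width]: reflections have period 2 * width.
    period = 2.0 * width
    y_mod = y_unfolded % period
    return y_mod if y_mod <= width else period - y_mod
```

In practice the real task adds exactly the difficulties the text names (uncertainty, variable trajectories, disturbances), so such a prediction would only seed a planner that is continuously corrected by event-based tracking.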
Methods & Tools: Event-based tracking, Hand-eye Coordination, Ego-motion, Robot Control
People: Luna Gava, Marco Monforte