[MP-FT]-Banner

Face Tracking

[MP-FT]-Tab

For human-robot interaction it is very important to detect the faces of the people interacting with the robot. This helps the robot identify the subject that is speaking to it and/or recognize the emotions of the subjects interacting with it. Event cameras promise high temporal resolution (and low latency) tracking of facial expressions; however, the task becomes more complex when the subject does not move, since the facial features (eyes, nose, mouth, ...) needed to localize the face are not easy to detect.

To this end, the temporal pattern of eye blinks can be used to detect and localize the eyes, and the face can then be detected from the distance between the eyes. Although previous works have used eye blinks to detect and track the face, we are working on a method that is robust to different lighting conditions (artificial and natural) and to different scales.
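As a rough illustration of this idea (not the actual implementation), the sketch below flags blink candidates from an event stream and pairs two horizontally aligned candidates into a face hypothesis. The event format, grid size, and thresholds are hypothetical placeholders.

```python
import numpy as np
from collections import deque

# Illustrative sketch: events are assumed to arrive as (t, x, y, polarity)
# tuples with t in seconds and polarity in {+1, -1}. Thresholds are placeholders.
GRID = 16            # spatial cell size in pixels
WINDOW = 0.15        # temporal window (s), roughly the duration of a blink
RATE_THRESH = 200    # events per window that mark a cell as "active"

class BlinkDetector:
    def __init__(self):
        self.buffers = {}  # cell -> deque of (t, polarity)

    def push(self, t, x, y, polarity):
        cell = (y // GRID, x // GRID)
        buf = self.buffers.setdefault(cell, deque())
        buf.append((t, polarity))
        # drop events older than the temporal window
        while buf and t - buf[0][0] > WINDOW:
            buf.popleft()

    def blink_candidates(self):
        """Cells whose recent activity shows an ON burst followed by an OFF burst."""
        candidates = []
        for cell, buf in self.buffers.items():
            if len(buf) < RATE_THRESH:
                continue
            times = np.array([t for t, _ in buf])
            pols = np.array([p for _, p in buf])
            on, off = times[pols > 0], times[pols < 0]
            # the eyelid closing (ON burst) should precede the re-opening (OFF burst)
            if len(on) and len(off) and on.mean() < off.mean():
                candidates.append(cell)
        return candidates

def face_from_eyes(candidates, min_dist=3, max_dist=10):
    """Pair two roughly horizontally aligned blink candidates into a face hypothesis."""
    for i, (r1, c1) in enumerate(candidates):
        for r2, c2 in candidates[i + 1:]:
            if abs(r1 - r2) <= 1 and min_dist <= abs(c1 - c2) <= max_dist:
                return ((r1 + r2) / 2, (c1 + c2) / 2), abs(c1 - c2)
    return None
```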

The proposed method achieves good results across different human actions. We divided the dataset into three actions: staying static in front of the camera, approaching and moving away from the camera, and moving from left to right in front of the camera. We selected these actions as representative movements that a human subject can perform in front of a robot.

In all cases, the results exceed 50%, which is sufficient for the tracker to follow the eyes and the head.
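For the tracking stage, a minimal sketch is given below, assuming a constant-velocity state [x, y, vx, vy] and a position-only measurement of the eye (or head) centre obtained from blink detections. With these linear motion and measurement models the extended Kalman filter reduces to the standard Kalman filter equations; all noise values are illustrative.

```python
import numpy as np

class EyeTracker:
    """Constant-velocity filter over the eye/head centre (illustrative values)."""
    def __init__(self, x0, y0, dt=0.01):
        self.x = np.array([x0, y0, 0.0, 0.0], dtype=float)   # state [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                             # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)        # motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)        # measure position only
        self.Q = np.eye(4) * 0.1                              # process noise
        self.R = np.eye(2) * 2.0                              # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """z: measured (x, y) of the eye centre from a blink detection."""
        y = np.asarray(z, dtype=float) - self.H @ self.x      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In use, `predict()` is called at every step so the estimate coasts between blinks, and `update()` is called only when a new blink-based eye detection is available.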

Keywords: Dynamic Vision Sensor, pattern recognition, blink detection, Extended Kalman Filter.

[MP-FT]-Selected_Publications