Arren Glover

Post Doc
Vision Algorithms and Software for the Event-driven iCub Robot

Research Lines



Via Morego 30, Genova, Italy
+39 010 2898 221


Arren received his Bachelor of Engineering (Mechatronics, Honours I) from the University of Queensland (UQ), Australia, in 2008, during which he was also a junior research scholar at the Commonwealth Scientific and Industrial Research Organisation (CSIRO). He completed his PhD at the Queensland University of Technology (QUT) in 2013, working on affordance learning and sensorimotor coordination using reinforcement learning techniques and Markov decision processes. During 2014 Arren held a post-doctoral position at QUT, where he also lectured the Robotics Team Project course. His research focused on visual place recognition and visual odometry for SLAM under extreme environmental change.

Arren has been a post-doctoral researcher at the Istituto Italiano di Tecnologia since September 2014, working on visual algorithms and software architectures for the integration of event cameras on the iCub robot.


Robust Visual Tracking with Event Cameras

Event-driven cameras are a new technology that enables low-latency visual sensing in dynamic environments, a step towards faster robotic vision: they respond only to changes in the scene, with very high temporal resolution (< 1 microsecond). Moving targets produce dense spatio-temporal streams of events that do not suffer from the information loss "between frames" which can occur when traditional cameras are used to track fast-moving targets. Event-based tracking algorithms need to follow the target position within this spatio-temporal data while rejecting the clutter events that arise as a robot moves in a typical office setting. We introduce a particle-filter algorithm designed to be robust to the temporal variation that occurs as the camera and the target move with different relative velocities, which can otherwise lead to a loss of visual information and missed detections. The proposed system provides more persistent tracking than the prior state of the art, especially when the robot is actively following a target with its gaze. Experiments are performed on the iCub humanoid robot performing ball tracking and gaze following.
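The idea of particle filtering over an event stream can be sketched in a few lines. This is a minimal toy illustration, not the algorithm used on the iCub: the event tuple format, the image size, the circle-contour likelihood, and all parameter values are assumptions chosen for the example.

```python
import random
import math

# Hypothetical event format (timestamp_us, x, y, polarity) -- a common
# event-camera representation, not the exact format used on the iCub.
def make_circle_events(cx, cy, r=6.0, n=60, t0=0):
    """Synthesise events along a circle's contour, as a moving ball might."""
    return [(t0 + i,
             cx + r * math.cos(2 * math.pi * i / n),
             cy + r * math.sin(2 * math.pi * i / n), 1)
            for i in range(n)]

def track(events, n_particles=200, r=6.0, iters=8, seed=0):
    """Minimal particle filter: particles are candidate (x, y) ball centres.
    A particle's weight counts how many events fall near the circle contour
    it predicts; particles are resampled proportionally to weight, then
    diffused by a simple Gaussian motion/noise model."""
    rng = random.Random(seed)
    particles = [(rng.uniform(0, 64), rng.uniform(0, 64))
                 for _ in range(n_particles)]
    for _ in range(iters):
        weights = []
        for (px, py) in particles:
            score = sum(1 for (_, ex, ey, _) in events
                        if abs(math.hypot(ex - px, ey - py) - r) < 1.5)
            weights.append(score + 1e-6)  # avoid all-zero weights
        particles = [
            (x + rng.gauss(0, 0.8), y + rng.gauss(0, 0.8))
            for (x, y) in rng.choices(particles, weights=weights,
                                      k=n_particles)
        ]
    # Estimate the target position as the mean of the final particle cloud
    xs = sum(p[0] for p in particles) / n_particles
    ys = sum(p[1] for p in particles) / n_particles
    return xs, ys

events = make_circle_events(cx=30.0, cy=22.0)
est = track(events)  # converges towards the true centre (30, 22)
```

In the real setting the likelihood must also reject clutter events generated by ego-motion, and the motion model must cope with the changing relative velocity of camera and target; this sketch shows only the resample-and-diffuse skeleton.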


iCub-integrated Software Libraries for Event Cameras

The icub-event-driven libraries are open source and can be found at: The libraries include interfaces between event-camera sensors and the YARP robot middleware, as well as interfaces between event-driven algorithms and the behaviour control of the iCub robot. They provide encoding and decoding of event packets for distributed processing across multiple machines (including the robot) using YARP; data structures for managing the asynchronous event stream; and many processing algorithms, including filters, camera undistortion, visualisation, auto-saccading, circle detection and tracking, cluster tracking, optical flow, and corner detection.
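Encoding events for transport typically means packing each event's fields into a fixed-width word. The following is a hypothetical bit-packing scheme for illustration only: the field widths and layout are invented and are not the library's actual wire format.

```python
# Hypothetical 32-bit address-event encoding (illustrative only, not the
# icub-event-driven wire format): 14-bit wrapping timestamp, 9-bit x,
# 8-bit y, 1-bit polarity.
POL_BITS, Y_BITS, X_BITS, TS_BITS = 1, 8, 9, 14

def encode(ts, x, y, pol):
    """Pack one event into a single integer word."""
    assert 0 <= x < (1 << X_BITS) and 0 <= y < (1 << Y_BITS)
    word = ts & ((1 << TS_BITS) - 1)       # timestamp wraps at 2**14
    word = (word << X_BITS) | x
    word = (word << Y_BITS) | y
    word = (word << POL_BITS) | (pol & 1)
    return word

def decode(word):
    """Unpack a word back into (ts, x, y, pol)."""
    pol = word & 1
    word >>= POL_BITS
    y = word & ((1 << Y_BITS) - 1)
    word >>= Y_BITS
    x = word & ((1 << X_BITS) - 1)
    word >>= X_BITS
    ts = word & ((1 << TS_BITS) - 1)
    return ts, x, y, pol

# A "packet" is then just a list of words that can be sent over the network
packet = [encode(t, 10 + t, 20, 1) for t in range(5)]
decoded = [decode(w) for w in packet]
```

Packing events into words like this keeps packets compact and lets a receiver on another machine reconstruct the asynchronous stream; the real library handles batching and timestamp wrap-around across packets as well.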

OpenFABMAP is an open-source implementation of the state-of-the-art visual place recognition algorithm FAB-MAP, which has served as the baseline for algorithm comparison for the past five years.
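The core idea behind appearance-based place recognition can be shown with a toy bag-of-words comparison. Note that FAB-MAP itself uses a probabilistic model over visual-word co-occurrences (a Chow-Liu tree), not raw cosine similarity; the place names, vocabulary size, and histogram counts below are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two visual-word histograms."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical visual-word histograms (vocabulary of 5 words) for three
# mapped places; the counts are invented for this example.
places = {
    "corridor": [9, 0, 1, 0, 2],
    "office":   [1, 7, 0, 3, 0],
    "kitchen":  [0, 2, 8, 0, 1],
}

# A query image: a revisit of the corridor with some observation noise
query = [8, 1, 0, 0, 3]

# Loop closure hypothesis = the mapped place most similar in appearance
best = max(places, key=lambda name: cosine(places[name], query))
# best == "corridor"
```

Comparing histograms of quantised local features is what makes the matching robust to viewpoint change; FAB-MAP's contribution is reasoning probabilistically about which word combinations are distinctive rather than merely common.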

Selected Publications

Glover, A., and Bartolozzi C. (2016) Event-driven ball detection and gaze fixation in clutter. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2016, Daejeon, Korea. Finalist for RoboCup Best Paper Award

Vasco V., Glover A., and Bartolozzi C. (2016) Fast event-based harris corner detection exploiting the advantages of event-driven cameras. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2016, Daejeon, Korea.

Glover, A., and Wyeth, G. (2016) Towards Lifelong Affordance Learning using a Distributed Markov Model. In IEEE Transactions on Cognitive and Developmental Systems.

Vasco, V., Glover, A., Tirupachuri, Y., Solari, F., Chessa, M., and Bartolozzi, C. (2016) Vergence control with a neuromorphic iCub. In IEEE-RAS International Conference on Humanoid Robots (Humanoids), November 2016, Mexico.

Glover, A., Maddern, W., Milford, M., and Wyeth, G. (2010) FAB-MAP + RatSLAM: appearance-based SLAM for multiple times of day. In IEEE International Conference on Robotics and Automation (ICRA), May 2010, Anchorage, Alaska, USA.

Schulz, R., Glover, A., Milford, M., Wyeth, G., and Wiles, J. (2011) Lingodroids: Studies in Spatial Cognition and Language. In IEEE International Conference on Robotics and Automation (ICRA), May 2011, Shanghai, China.

Glover, A., Maddern, W., Warren M., Reid S., Milford M., & Wyeth, G. (2012) OpenFABMAP: An Open Source Toolbox for Appearance-based Loop Closure Detection. In IEEE International Conference on Robotics and Automation (ICRA), May 2012, St. Paul, USA.

Milford, M., Scheirer, W., Vig, E., Glover, A., Baumann, O., Mattingley, J., and Cox, D. (2014) Condition-Invariant, Top-Down Visual Place Recognition. In IEEE International Conference on Robotics and Automation (ICRA), June 2014, Hong Kong.

Glover, A. (2014) Developing Grounded Representations for Robots through the Principles of Sensorimotor Coordination. PhD thesis, Queensland University of Technology, Brisbane, Australia.