EyeScout

While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user’s lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze interaction modes for a single user: in “Walk then Interact” the user can walk up to an arbitrary position in front of the display and interact, while in “Walk and Interact” the user can interact even while on the move. We report on a user study that shows that EyeScout is well perceived by users, extends a public display’s sweet spot into a sweet line, and reduces gaze interaction kick-off time to 3.5 seconds – a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.
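The core idea of keeping a rail-mounted eye tracker aligned with a laterally moving user can be illustrated with a simple control loop. The sketch below is purely hypothetical: the function names, the proportional-control scheme, the gains, and the speed cap are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch: keep a rail-mounted eye tracker aligned with a
# user's lateral position along a large display. The controller, gains,
# and units are illustrative assumptions, not the paper's method.

def align_step(tracker_x: float, user_x: float,
               max_speed: float = 0.5, gain: float = 2.0,
               dt: float = 0.05) -> float:
    """Return the tracker carriage's new rail position (in meters)
    after one control step.

    A simple proportional controller: move toward the user's detected
    lateral position, with velocity capped at the carriage's maximum
    speed so the tracker follows smoothly rather than jumping.
    """
    error = user_x - tracker_x
    velocity = max(-max_speed, min(max_speed, gain * error))
    return tracker_x + velocity * dt

# Example: carriage starts at 0.0 m; a user is detected standing
# (or walking) at 1.0 m along the display. Repeated control steps
# drive the tracker toward the user's position.
x = 0.0
for _ in range(100):
    x = align_step(x, 1.0)
```

In a real system, `user_x` would come from a body-tracking sensor and the position update would be sent to the rail's motor controller; the loop structure itself stays the same for both the stationary ("Walk then Interact") and moving ("Walk and Interact") cases.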

Publication

Mohamed Khamis, Alexander Klimczak, Martin Reiss, Florian Alt and Andreas Bulling. EyeScout: Active Eye Tracking for Position and Movement-Independent Gaze Interaction with Large Public Displays. In Proceedings of the 30th Annual ACM Symposium on User Interface Software & Technology. UIST '17. ACM, New York, NY, USA. [Download Bibtex]