In this paper we show how reading text on large displays can be used to enable gaze interaction in public spaces. Our research is motivated by the fact that much of the content on public displays includes text. Hence, researchers and practitioners could greatly benefit from users being able to spontaneously interact, as well as to implicitly calibrate an eye tracker, while simply reading this text. In particular, we adapt Pursuits, a technique that correlates users' eye movements with those of moving on-screen targets. While prior work used abstract objects or dots as targets, we explore the use of Pursuits with text (read-and-pursue). We thereby address the challenge that the eye movements performed while reading interfere with the pursuit movements. Results from two user studies (N=37) show that Pursuits with text is feasible and can achieve accuracy similar to that of non-text-based pursuit approaches. While calibration is less accurate, it integrates smoothly with reading and allows the areas of the display at which the user is looking to be identified.
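To make the underlying idea concrete: Pursuits-style selection correlates gaze samples over a short time window with the trajectory of each moving on-screen target and selects the target whose movement best matches the gaze. The sketch below is a minimal, hypothetical illustration of this correlation idea, not the authors' implementation; the `select_target` helper, the 0.8 threshold, and the rule of requiring both axes to correlate are illustrative assumptions.

```python
import numpy as np

def pearson(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two 1-D signals."""
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def select_target(gaze, targets, threshold=0.8):
    """Return the index of the moving target whose trajectory best
    correlates with the gaze samples, or None if no target exceeds
    the threshold.

    gaze:    (n, 2) array of gaze samples (x, y) over a time window
    targets: list of (n, 2) arrays, one trajectory per on-screen target
    """
    best_idx, best_corr = None, threshold
    for i, traj in enumerate(targets):
        # Correlate the x and y components separately and require
        # both to be high before accepting a match.
        cx = pearson(gaze[:, 0], traj[:, 0])
        cy = pearson(gaze[:, 1], traj[:, 1])
        corr = min(cx, cy)
        if corr > best_corr:
            best_idx, best_corr = i, corr
    return best_idx
```

The challenge addressed in the paper is that reading produces its own characteristic eye movements (saccades and fixations along lines of text), which a naive correlation of this kind would conflate with smooth pursuit of a moving target.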

Publication

Mohamed Khamis, Ozan Saltuk, Alina Hang, Katharina Stolz, Andreas Bulling and Florian Alt. TextPursuits: Using Text for Pursuits-based Interaction and Calibration on Public Displays. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. UbiComp '16. ACM, New York, NY, USA.