TextPursuits

In this paper, we show how reading text on large displays can be used to enable gaze interaction in public space. Our research is motivated by the fact that much of the content on public displays includes text. Hence, researchers and practitioners could greatly benefit from users being able to spontaneously interact, as well as to implicitly calibrate an eye tracker, while simply reading this text. In particular, we adapt Pursuits, a technique that correlates users' eye movements with moving on-screen targets. While prior work used abstract objects or dots as targets, we explore the use of Pursuits with text (read-and-pursue). We thereby address the challenge that eye movements performed while reading interfere with the pursuit movements. Results from two user studies (N=37) show that Pursuits with text is feasible and can achieve accuracy similar to non-text-based pursuit approaches. While calibration is less accurate, it integrates smoothly with reading and allows the areas of the display the user is looking at to be identified.

Publications

Mohamed Khamis, Ozan Saltuk, Alina Hang, Katharina Stolz, Andreas Bulling and Florian Alt. TextPursuits: Using Text for Pursuits-based Interaction and Calibration on Public Displays. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16). ACM, New York, NY, USA.