Modern communication technologies allow sensitive information to be accessed anywhere and anytime - for example, on personal devices or in the cloud. At the same time, protecting such information remains a major challenge. One of the main reasons is that many security mechanisms lack usability: they do not blend with how users currently interact and thus interrupt the users' ongoing task, they are often time-consuming, and they put a significant cognitive load on the user. As a result, users generally do not embrace security but rather try to minimize the required effort (e.g., by reusing passwords, choosing easy-to-remember passwords, or writing passwords down), which adversely affects security.

Recently, technologies have emerged that allow the physiological state of users to be assessed, creating the potential to address many of the challenges prevalent in state-of-the-art security mechanisms. One such technology is eye tracking. Today, many laptops, tablets, and smartphones come equipped with infrared cameras that can accurately estimate the user's gaze behavior, enabling novel security concepts. For example, so-called implicit authentication mechanisms can leverage gaze characteristics to verify a user's identity seamlessly in the background while they perform their current task, without interruption. Existing secure interfaces can benefit as well: by monitoring eye gaze while a password is being registered, a system could infer the complexity of the password or whether the user chose a password previously used for another account. A system may also be able to identify phishing emails from how users perceive their content: phishing emails often demand immediate action, increasing the reader's stress level and, ultimately, their pupil dilation; likewise, users may repeatedly scan suspicious URLs, resulting in regressions in the gaze data. Based on such cues, a system could initiate appropriate interventions.
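To make the two phishing-related gaze cues mentioned above concrete, the following is a minimal sketch in Python. It is purely illustrative: the `GazeSample` format, the bounding-box representation of a URL region, and all thresholds are assumptions for this example, not methods or values from the cited papers.

```python
# Toy heuristics for two gaze-based phishing cues: elevated pupil dilation
# relative to a per-user baseline, and repeated regressions (leftward
# re-reading saccades) over the region showing a URL.
# All field names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float          # horizontal gaze position in pixels
    y: float          # vertical gaze position in pixels
    pupil_mm: float   # pupil diameter in millimetres

def pupil_dilation_cue(samples, baseline_mm, ratio_threshold=1.15):
    """Flag if the mean pupil diameter exceeds the user's baseline by a margin."""
    mean_mm = sum(s.pupil_mm for s in samples) / len(samples)
    return mean_mm > baseline_mm * ratio_threshold

def url_regression_cue(samples, url_box, min_regressions=3):
    """Count leftward gaze movements (regressions) while the gaze stays inside
    the URL's bounding box; many regressions suggest repeated re-scanning."""
    x0, y0, x1, y1 = url_box
    regressions = 0
    prev = None
    for s in samples:
        inside = x0 <= s.x <= x1 and y0 <= s.y <= y1
        if inside and prev is not None and s.x < prev.x:
            regressions += 1
        prev = s if inside else None
    return regressions >= min_regressions

# Toy usage: a reader with dilated pupils who re-scans a URL three times.
samples = [GazeSample(100 + 10 * i, 50, 4.6) for i in range(5)] + \
          [GazeSample(110, 50, 4.7), GazeSample(160, 50, 4.7),
           GazeSample(105, 50, 4.8), GazeSample(150, 50, 4.8),
           GazeSample(100, 50, 4.8)]
print(pupil_dilation_cue(samples, baseline_mm=3.8))           # -> True
print(url_regression_cue(samples, url_box=(0, 0, 300, 100)))  # -> True
```

In a real system, both signals would of course need calibration per user and context (lighting affects pupil size, reading style affects regressions); the sketch only shows how such cues could be derived from raw gaze samples.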

In our research, we focus on how such novel concepts based on physiological data can be designed. In particular, we investigate challenges such as (1) understanding people's physiological responses in security-critical situations; (2) creating methods that enable physiological data to be collected in security-critical situations; and (3) developing mechanisms that protect users' privacy while their physiological state is assessed, ultimately fostering acceptance.


Christina Katsini, Yasmeen Abdrabou, George E. Raptis, Mohamed Khamis and Florian Alt. The Role of Eye Gaze in Security and Privacy Applications: Survey and Future HCI Research Directions. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. CHI '20. ACM, New York, NY, USA. [Download Bibtex] [Video]
Yasmeen Abdrabou, Ken Pfeuffer, Mohamed Khamis and Florian Alt. GazeLockPatterns: Comparing Authentication Using Gaze and Touch for Entering Lock Patterns. In Proceedings of the 2020 ACM Symposium on Eye Tracking Research & Applications. ETRA '20. ACM, New York, NY, USA. [Download Bibtex] [Video]
Mohamed Khamis, Florian Alt and Andreas Bulling. The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services. MobileHCI '18. ACM, New York, NY, USA. [Download Bibtex]
Mohamed Khamis, Mariam Hassib, Emanuel von Zezschwitz, Andreas Bulling and Florian Alt. GazeTouchPIN: Protecting Sensitive Data on Mobile Devices using Secure Multimodal Authentication. In Proceedings of the 19th ACM International Conference on Multimodal Interaction. ICMI '17. ACM, New York, NY, USA. [Download Bibtex]
Mohamed Khamis, Daniel Buschek, Tobias Thieron, Florian Alt and Andreas Bulling. EyePACT: Eye-Based Parallax Correction on Touch-Enabled Interactive Displays. In Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 1, 2018, 146:1--146:18. [Download Bibtex]