Project Description

The ubihave project aims to investigate how ubiquitous computing devices can benefit from behavioral models. This research is motivated by the fact that computers permeate our lives, as everyday companions (smartphones, tablets, life-logging wristbands, smart clothes) and as sensors embedded in the environment (WiFi hotspots, NFC, cameras, depth sensors, eye trackers). These devices provide rich streams of user-specific data, opening new avenues for applications using behavioral models. By exploiting and combining ubiquitous data sources to model user behavior, the Human-Computer Interaction community now has an unprecedented chance to realize long-held visions of intelligent interfaces, smart devices, and reactive environments that adapt appropriately to individual users and contexts.

Many current UIs and devices can react to simple sensor properties. However, interfaces and interactions are rarely adapted to the individual user and context, since this requires dedicated inference tools to process uncertain user-specific sensor data. To render user-specific information more accessible and useful to applications and users, this project aims to build and apply models that can describe, analyze, and predict user behavior based on data from mobile devices and ubiquitous sensors. Particular application areas that we expect to benefit strongly from such models are usable privacy and security, touch interaction, text input, and context-aware adaptive interfaces.

This project integrates HCI and user modeling perspectives to address a number of guiding questions: In which applications and contexts can users benefit from behavioral models? How can models improve interactions? Do users notice and like adaptations, and do these match their expectations? Which features, models, and algorithms are suitable for capturing and utilizing user behavior? How do we define performance metrics for user actions with behavior-aware interfaces? Which interactions provoke characteristic and consistent behavior?

The contribution of the project is threefold. Firstly, we chart a holistic design space to understand and investigate user modeling comprehensively across tasks and beyond the desktop. This reveals future opportunities and a goal unique to this project: we aim to identify common ground for diverse applications based on the same user representation, leading to efficient data handling across applications. Secondly, we use deployment-based research to identify scenarios in which behavioral biometrics help to optimize, personalize, and secure interactions. Examples include novel usable security mechanisms, efficient and usable mobile text entry, activity-aware applications, and novel mobile services that can adapt to user behavior. Thirdly, to allow applications to consider user-specific interaction characteristics and behavior, this project also develops inference tools that can process uncertain sensor data with respect to the targeted user contexts and goals.
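To give a flavor of what such inference over uncertain sensor data can look like, the following is a minimal, hypothetical sketch (not the project's actual tools): it uses sequential Bayesian updates to infer which of two illustrative users most likely produced a stream of noisy touch-pressure readings, assuming each user's readings follow a Gaussian with known per-user parameters. The user names and parameter values are invented for illustration.

```python
import math

# Hypothetical per-user sensor models: (mean, std dev) of touch pressure.
USERS = {
    "alice": (0.30, 0.05),
    "bob":   (0.55, 0.08),
}

def gaussian_pdf(x, mean, std):
    """Likelihood of observation x under a Gaussian sensor model."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def infer_user(observations, prior=None):
    """Return the posterior P(user | observations) via sequential Bayesian updates."""
    posterior = dict(prior or {u: 1 / len(USERS) for u in USERS})
    for x in observations:
        # Multiply in the likelihood of each observation under every user model...
        for user, (mean, std) in USERS.items():
            posterior[user] *= gaussian_pdf(x, mean, std)
        # ...and renormalize so the posterior stays a probability distribution.
        total = sum(posterior.values())
        posterior = {u: p / total for u, p in posterior.items()}
    return posterior

readings = [0.31, 0.28, 0.33, 0.29]   # noisy readings close to "alice"'s profile
posterior = infer_user(readings)
print(max(posterior, key=posterior.get))
```

A deployed system would of course use richer features, learned rather than hand-set parameters, and more robust models, but the same pattern of accumulating probabilistic evidence from uncertain observations underlies many behavioral-biometric and context-inference approaches.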