Welcome to the website of the Usable Security and Privacy Group!

Research

Behavioral Biometrics

State-of-the-art authentication does little to account for the way in which users interact. For example, knowledge-based authentication schemes such as PINs, lock patterns, or passwords were invented at a time when people authenticated with very few devices and only a few times per day. Today, as a result of cloud services and smart devices, users access sensitive information through around 100 accounts on average and authenticate several hundred times a day. This leads to significant authentication overhead, and users are forced to manage more passwords than they can remember.

In contrast, implicit authentication schemes hold a lot of promise, as they can operate seamlessly in the background. In particular, we look at how knowledge about users' behavior can be leveraged to build authentication schemes that are both more usable and more secure. We are interested in the technical foundations of such schemes (How can we build suitable authentication models? How can we use them across applications and contexts?) as well as in the users' perspective (How can implicit mechanisms be built in a way that preserves users' privacy? How can user acceptance be increased?). The sketch below illustrates the basic idea.
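To make the idea of a behavioral authentication model more concrete, the following Python sketch enrolls a user from a few samples of inter-keystroke timings and later verifies new samples against the enrolled profile. It is a minimal illustration only; the features, data, and threshold are assumptions made up for the example and do not describe any specific system of ours.

# Minimal sketch of implicit authentication from behavioral features.
# All data, features, and thresholds are illustrative assumptions.
from statistics import mean, stdev

def enroll(samples):
    """Build a simple per-feature profile (mean, std) from enrollment samples.
    Each sample is a list of inter-keystroke intervals in milliseconds."""
    features = list(zip(*samples))  # group the i-th interval across samples
    return [(mean(f), stdev(f)) for f in features]

def verify(profile, sample, threshold=2.0):
    """Accept the sample if its average z-score distance to the profile
    stays below the threshold (lower = closer to enrolled behavior)."""
    distances = [abs(x - m) / s if s > 0 else 0.0
                 for x, (m, s) in zip(sample, profile)]
    return sum(distances) / len(distances) < threshold

# Illustrative enrollment data: one user's typing rhythm (ms between keys).
enrollment = [
    [112, 143, 98, 171],
    [108, 150, 103, 165],
    [117, 139, 95, 176],
]
profile = enroll(enrollment)

print(verify(profile, [110, 146, 100, 170]))  # similar rhythm -> True
print(verify(profile, [310, 60, 240, 45]))    # very different rhythm -> False

Real systems replace the hand-built profile with learned models and combine many behavioral signals, but the core loop of enrollment, continuous sampling, and verification stays the same.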

Physiology-enhanced Secure Systems

Physiological information about users is becoming available at a rapidly accelerating pace. In particular, wearables such as wristbands, eyewear, and smart garments allow information to be obtained on users' physiological state, including but not limited to eye gaze, EEG, heart rate, and skin conductance. We are particularly interested in how such information can be leveraged in the context of building secure systems.

We are interested both in how novel secure interaction techniques and mechanisms can be built using physiological data, and in how existing systems can benefit from knowledge of users' current state, as sketched below.
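As a purely illustrative example of how an existing system might use such knowledge, the following sketch adapts an authentication policy to a physiological signal (here, heart rate from a hypothetical wearable). The thresholds and the policy itself are assumptions made for the example.

# Illustrative sketch only: adapt an authentication policy to a physiological
# signal. Thresholds and the step-up policy are assumptions for the example.
def required_factors(resting_heart_rate, current_heart_rate):
    """Return how many authentication factors to request.
    The idea: unusual physiological state -> ask for stronger authentication."""
    deviation = abs(current_heart_rate - resting_heart_rate) / resting_heart_rate
    if deviation < 0.15:      # close to the user's baseline
        return 1              # an implicit/behavioral factor is enough
    if deviation < 0.40:      # moderately unusual state
        return 2              # additionally request e.g. a PIN
    return 3                  # strongly unusual: full step-up authentication

print(required_factors(62, 66))   # -> 1
print(required_factors(62, 85))   # -> 2
print(required_factors(62, 120))  # -> 3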

Social Engineering - The Voice of Wisdom

Humans are in many cases the weakest element in security-critical systems. Consequently, attackers employ social engineering to compromise such systems. Common examples include phishing (fraudulently obtaining private information through emails or deepfakes), watering hole attacks (capitalizing on the trust users place in frequently visited websites), and baiting (infecting a system with malware by exploiting the curiosity of victims). We are particularly interested in developing a comprehensive understanding of social engineering attacks and investigate computer-assisted approaches to mitigate them; a toy example of such an approach follows.
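To give a flavor of what a computer-assisted mitigation might look like, the following Python sketch flags URLs whose surface features are commonly associated with phishing. The chosen signals and their equal weighting are illustrative assumptions, not a production detector.

# Hypothetical sketch of a computer-assisted phishing check: score URLs by
# simple surface features often associated with phishing. Illustrative only.
from urllib.parse import urlparse

SUSPICIOUS_KEYWORDS = ("login", "verify", "update", "secure", "account")

def phishing_score(url):
    """Return a rough 0..1 score; higher means more suspicious."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    signals = [
        parsed.scheme != "https",                      # no TLS
        host.replace(".", "").isdigit(),               # raw IP address as host
        host.count(".") > 3,                           # deeply nested subdomains
        "@" in url,                                    # userinfo trick in URL
        any(k in url.lower() for k in SUSPICIOUS_KEYWORDS),
    ]
    return sum(signals) / len(signals)

for url in ("https://example.org/research",
            "http://192.168.0.1/secure-login/verify?account=1"):
    print(f"{phishing_score(url):.2f}  {url}")

In practice, such heuristics would only be one ingredient; the interesting research questions lie in combining them with user-facing warnings and training so that people can recognize and resist social engineering attempts themselves.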