INTERACT 2021

31 August 2021

Our group contributes the following six papers to the 18th IFIP TC 13 International Conference on Human-Computer Interaction (INTERACT 2021) in Bari, Italy.


Exploring Emotions and Emotion Elicitation Techniques in Virtual Reality

In this paper, we explore how state-of-the-art emotion elicitation methods can be adapted to virtual reality (VR). We envision that emotion research could be conducted in VR for various benefits, such as switching study conditions and settings on the fly, and using stimuli that are not easily accessible in the real world, for example, to induce fear. To this end, we conducted a user study (N=39) in which we measured how different emotion elicitation methods (audio, video, image, autobiographical memory recall) perform in VR compared to the real world. We found that the elicitation methods produce largely comparable results in the virtual and the real world, but that, overall, participants experience slightly stronger valence and arousal in VR. Emotions faded over time following the same pattern in both worlds. Our findings are beneficial to researchers and practitioners studying or using emotional user interfaces in VR.

Radiah Rivu, Ruoyu Jiang, Ville Mäkelä, Mariam Hassib and Florian Alt. Exploring Emotions and Emotion Elicitation Techniques in Virtual Reality. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany. [Download Bibtex]


Exploring how Saliency Affects Attention in Virtual Reality

We investigate how changes in the saliency of a virtual environment (VE) affect users' visual attention during different tasks. In particular, we investigate whether users are drawn to the most salient regions of the VE. This knowledge will help researchers design optimal VR environments, purposefully direct users' attention, and avoid unintentional distractions. We conducted a user study (N=30) in which participants performed tasks (video watching, object stacking, visual search, waiting) under two different saliency conditions in the virtual environment. Our findings suggest that while participants notice the differences in saliency, their visual attention is not diverted toward the salient regions while they are performing tasks.

Radiah Rivu, Ville Mäkelä, Mariam Hassib, Yomna Abdelrahman and Florian Alt. Exploring how Saliency Affects Attention in Virtual Reality. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany. [Download Bibtex]


When Friends become Strangers: Understanding the Influence of Avatar Gender On Interpersonal Distance Between Friends in Virtual Reality

In this paper, we investigate how mismatches between biological gender and avatar gender affect interpersonal distance (IPD) in virtual reality (VR). An increasing number of VR experiences and online platforms, such as Rec Room and VRChat, allow users to assume other genders through customized avatars. While the effects of acquaintanceship and gender on proxemic behavior have been studied, the effect of changed genders remains largely unexplored. We conducted a user study (N = 40; friends = 20, strangers = 20) in which users played a two-player collaborative game in Rec Room using both male and female avatars. We found that with swapped avatar genders, the preferred distance increased between friends but not between strangers. We discuss how our results can inform researchers and designers in the domain of multi-user VR.

Radiah Rivu, Yumeng Zhou, Robin Welsch, Ville Mäkelä and Florian Alt. When Friends become Strangers: Understanding the Influence of Avatar Gender On Interpersonal Distance Between Friends in Virtual Reality. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany. [Download Bibtex]


Gaze-adaptive Information Access in AR: Empirical Study and Field-Deployment

This paper presents the results of an empirical study and a real-world deployment of a gaze-adaptive UI for Augmented Reality (AR). AR introduces an attention dilemma between focusing on reality and focusing on AR content. Past work has suggested eye gaze as a technique for opening information interfaces; however, there is little empirical work on it. We present an empirical study comparing a gaze-adaptive interface to an always-on interface in tasks that vary the focus between reality and virtual content. Across tasks, we find that most participants prefer the gaze-adaptive UI and find it less distracting. When focusing on reality, the gaze UI is faster and perceived as easier and more intuitive. When focusing on virtual content, always-on is faster, but user preferences are split. We conclude with the design and deployment of an interactive application in a public museum, demonstrating its promising potential in the real world.

Robin Piening, Ken Pfeuffer, Augusto Esteves, Tim Mittermeier, Sarah Prange, Philippe Schroeder and Florian Alt. Gaze-adaptive Information Access in AR: Empirical Study and Field-Deployment. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany. [Download Bibtex]


Investigating User Perceptions Towards Wearable Mobile Electromyography

Wearables capture physiological user data, enabling novel user interfaces that can identify users, adapt to the user's state, and contribute to the quantified self. At the same time, little is known about users' perception of this new technology. In this paper, we present findings from a user study (N=36) in which participants used an electromyography (EMG) wearable and a visualization of data collected from EMG wearables. We found that participants are largely unaware of what EMG data can reveal about them, and that allowing them to explore their physiological data makes them more reluctant to share it. We conclude by deriving guidelines to help designers of user interfaces based on physiological data (a) protect users' privacy, (b) better inform users, and (c) ultimately support the uptake of this technology.

Sarah Prange, Sven Mayer, Maria-Lena Bittl, Mariam Hassib and Florian Alt. Investigating User Perceptions Towards Wearable Mobile Electromyography. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany. [Download Bibtex]


Passphrases Beat Thermal Attacks: Evaluating Text Input Characteristics Against Thermal Attacks on Laptops and Smartphones

We investigate the effectiveness of thermal attacks against text input with different characteristics, studying text entry on a smartphone touchscreen and a laptop keyboard. First, we ran a study (N=25) to collect a dataset of thermal images of short words, websites, complex strings (special characters, numbers, letters), passphrases, and words with duplicate characters. Afterwards, 20 different participants visually inspected the thermal images and attempted to identify the entered text. We found that long and complex strings are less vulnerable to thermal attacks; that visual inspection of thermal images reveals parts of the entered text (36% on average and up to 82%) even when the attack is not fully successful; and that entering text on laptops is more vulnerable to thermal attacks than on smartphones. We conclude with three lessons learned and recommendations for resisting thermal attacks.

Yasmeen Abdrabou, Reem Hatem, Yomna Abdelrahman, Amr Elmougy and Mohamed Khamis. Passphrases Beat Thermal Attacks: Evaluating Text Input Characteristics Against Thermal Attacks on Laptops and Smartphones. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany. [Download Bibtex]