INTERACT 2021

31 August 2021

Our group contributes the following six papers to the 18th International Conference on Human-Computer Interaction (INTERACT 2021) in Bari, Italy.

Exploring Emotions and Emotion Elicitation Techniques in Virtual Reality

In this paper, we explore how state-of-the-art methods of emotion elicitation can be adapted in virtual reality (VR). We envision that emotion research could be conducted in VR for various benefits, such as switching study conditions and settings on the fly, and conducting studies using stimuli that are not easily accessible in the real world, such as to induce fear. To this end, we conducted a user study (N=39) where we measured how different emotion elicitation methods (audio, video, image, autobiographical memory recall) perform in VR compared to the real world. We found that elicitation methods produce largely comparable results between the virtual and real world, but overall participants experience slightly stronger valence and arousal in VR. Emotions faded over time following the same pattern in both worlds. Our findings are beneficial to researchers and practitioners studying or using emotional user interfaces in VR.

Radiah Rivu, Ruoyu Jiang, Ville Mäkelä, Mariam Hassib and Florian Alt. Exploring Emotions and Emotion Elicitation Techniques in Virtual Reality. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany.

Exploring how Saliency Affects Attention in Virtual Reality

We investigate how changes in the saliency of a Virtual Environment (VE) affect our visual attention during different tasks. In particular, we investigate if users are attracted to the most salient regions in the VE. This knowledge will help researchers design optimal VR environments, purposefully direct the attention of users, and avoid unintentional distractions. We conducted a user study (N=30) where participants performed tasks (video watching, object stacking, visual search, waiting) with two different saliency conditions in the virtual environment. Our findings suggest that while participants notice the differences in saliency, their visual attention is not diverted towards the salient regions when they are performing tasks.

Radiah Rivu, Ville Mäkelä, Mariam Hassib, Yomna Abdelrahman and Florian Alt. Exploring how Saliency Affects Attention in Virtual Reality. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany.

When Friends become Strangers: Understanding the Influence of Avatar Gender On Interpersonal Distance Between Friends in Virtual Reality

In this paper, we investigate how mismatches between biological gender and avatar gender affect interpersonal distance (IPD) in virtual reality (VR). An increasing number of VR experiences and online platforms like Rec Room and VRChat allow users to assume other genders through customized avatars. While the effects of acquaintanceship and gender have been studied with regard to proxemic behavior, the effect of changed genders remains largely unexplored. We conducted a user study (N = 40, friends = 20, strangers = 20) where users played a two-player collaborative game in Rec Room using both male and female avatars. We found that with swapped avatar genders, the preferred distance increased between friends but not between strangers. We discuss how our results can inform researchers and designers in the domain of multi-user VR.

Radiah Rivu, Yumeng Zhou, Robin Welsch, Ville Mäkelä and Florian Alt. When Friends become Strangers: Understanding the Influence of Avatar Gender On Interpersonal Distance Between Friends in Virtual Reality. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany.

Gaze-adaptive Information Access in AR: Empirical Study and Field-Deployment

This paper presents the results of an empirical study and a real-world deployment of a gaze-adaptive UI for Augmented Reality (AR). AR introduces an attention dilemma between focusing on reality vs. on AR content. Past work suggested eye gaze as a technique to open information interfaces; however, there is only little empirical work. We present an empirical study comparing a gaze-adaptive to an always-on interface in tasks that vary focus between reality and virtual content. Across tasks, we find most participants prefer the gaze-adaptive UI and find it less distracting. When focusing on reality, the gaze UI is faster and perceived as easier and more intuitive. When focusing on virtual content, always-on is faster but user preferences are split. We conclude with the design and deployment of an interactive application in a public museum, demonstrating the promising potential in the real world.

Robin Piening, Ken Pfeuffer, Augusto Esteves, Tim Mittermeier, Sarah Prange, Philippe Schroeder and Florian Alt. Gaze-adaptive Information Access in AR: Empirical Study and Field-Deployment. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany.

Investigating User Perceptions Towards Wearable Mobile Electromyography

Wearables capture physiological user data, enabling novel user interfaces that can identify users, adapt to the user state, and contribute to the quantified self. At the same time, little is known about users' perception of this new technology. In this paper, we present findings from a user study (N=36) in which participants used an electromyography (EMG) wearable and a visualization of data collected from EMG wearables. We found that participants are highly unaware of what EMG data can reveal about them. Allowing them to explore their physiological data makes them more reluctant to share this data. We conclude by deriving guidelines that help designers of physiological-data-based user interfaces (a) protect users' privacy, (b) better inform them, and (c) ultimately support the uptake of this technology.

Sarah Prange, Sven Mayer, Maria-Lena Bittl, Mariam Hassib and Florian Alt. Investigating User Perceptions Towards Wearable Mobile Electromyography. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany.

Passphrases Beat Thermal Attacks: Evaluating Text Input Characteristics Against Thermal Attacks on Laptops and Smartphones

We investigate the effectiveness of thermal attacks against input of text with different characteristics; we study text entry on a smartphone touchscreen and a laptop keyboard. First, we ran a study (N=25) to collect a dataset of thermal images of short words, websites, complex strings (special characters, numbers, letters), passphrases, and words with duplicate characters. Afterwards, 20 different participants visually inspected the thermal images to attempt to identify the text input. We found that long and complex strings are less vulnerable to thermal attacks, that visual inspection of thermal images reveals different parts of the entered text (36% on average and up to 82%) even if the attack is not fully successful, and that entering text on laptops is more vulnerable to thermal attacks than on smartphones. We conclude with three learned lessons and recommendations to resist thermal attacks.

Yasmeen Abdrabou, Reem Hatem, Yomna Abdelrahman, Amr Elmougy and Mohamed Khamis. Passphrases Beat Thermal Attacks: Evaluating Text Input Characteristics Against Thermal Attacks on Laptops and Smartphones. In Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction. INTERACT '21. Springer, Berlin-Heidelberg, Germany.