Conference Agenda

Overview and details of the sessions of this conference.

 
 
Session Overview
Session: MCI-Paper01: Physiological Sensing and Virtual Reality
Time: Monday, 02/Sept/2024, 11:00am - 12:30pm
Session Chair: Florian Michahelles
Location: 30.95 Audimax


Presentations

HiveFive360: Extending the VR Gaze Guidance Technique HiveFive to Highlight Out-Of-FOV Targets

Sophie Kergaßner¹, Nina Doerr², Markus Wieland², Martin Fuchs¹, Michael Sedlmair²

¹Hochschule der Medien; ²University of Stuttgart

Modern display technologies, particularly those supporting 360° content, are increasingly used for immersive experiences in a variety of domains. However, information outside of the user’s field of view (FOV) may be easily overlooked. To address this, guiding cues can be provided to effectively direct attention. Subtle and diegetic cues are particularly effective at preserving the coherence and immersion of the presented content. HiveFive is one of the few diegetic highlighting techniques. It effectively highlights objects by attracting the user’s attention with swarm-like motion. However, HiveFive is restricted to in-FOV target highlighting. This work presents the novel technique HiveFive360, an extension of HiveFive that enables it to guide users to out-of-FOV targets. HiveFive360 is evaluated in a user study against FlyingARrow and Subtle Gaze Direction VR regarding completion time, sense of presence, and task load. HiveFive360 was found to effectively guide users in various environments without excessive distraction or task load.
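
The core precondition for switching between in-FOV highlighting and out-of-FOV guidance is deciding whether a target currently lies inside the user's field of view. The abstract does not describe the implementation; the following Python lines are a minimal, hypothetical sketch of such a check based on the angle between the head's forward vector and the direction to the target (the function name and the 100° default FOV are illustrative assumptions, not taken from the paper).

# Hypothetical sketch (not from the paper): test whether a target lies
# inside the user's field of view; if not, out-of-FOV guidance would apply.
import numpy as np

def is_in_fov(head_forward, head_position, target_position, fov_degrees=100.0):
    """Return True if the target lies within the given FOV cone around the head's forward vector."""
    to_target = np.asarray(target_position, dtype=float) - np.asarray(head_position, dtype=float)
    to_target /= np.linalg.norm(to_target)
    forward = np.asarray(head_forward, dtype=float)
    forward /= np.linalg.norm(forward)
    angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_target), -1.0, 1.0)))
    return angle <= fov_degrees / 2.0

# Example: a target behind the user is out of FOV, so guidance would be triggered.
print(is_in_fov([0, 0, 1], [0, 0, 0], [0, 0, -5]))  # False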



Multimodal Detection of External and Internal Attention in Virtual Reality using EEG and Eye Tracking Features

Xingyu Long, Sven Mayer, Francesco Chiossi

LMU Munich

Future VR environments will sense users' context, enabling a wide range of intelligent interactions and thereby supporting diverse applications and improving usability through attention-aware VR systems. However, attention-aware VR systems based on EEG data suffer from long training periods, hindering generalizability and widespread adoption. At the same time, there remains a gap in research regarding which physiological features (EEG and eye tracking) are most effective for decoding attention direction in the VR paradigm. We addressed this issue by evaluating several classification models using EEG and eye tracking data. We recorded training data simultaneously during tasks that required internal attention (an N-Back task) or external attention allocation (a visual monitoring task). We used linear and deep learning models to compare classification performance under several uni- and multimodal feature sets alongside different window sizes. Our results indicate that multimodal features improve prediction for both classical and modern classification models. We discuss approaches to assess the importance of physiological features and to achieve automatic, robust, and individualized feature selection.
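
As a rough illustration of the kind of comparison described (uni- versus multimodal feature sets fed to classification models), the Python sketch below uses synthetic placeholder data; the feature dimensions, window handling, and choice of a logistic-regression classifier are assumptions for illustration, not the authors' actual pipeline.

# Illustrative sketch only: comparing uni- vs. multimodal feature sets with a
# linear classifier. Features and labels are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows = 200  # one row per analysis window
eeg = rng.normal(size=(n_windows, 8))    # e.g., band-power features per channel
eye = rng.normal(size=(n_windows, 4))    # e.g., fixation, saccade, and pupil features
y = rng.integers(0, 2, size=n_windows)   # 0 = external attention, 1 = internal attention

feature_sets = {
    "EEG only": eeg,
    "Eye tracking only": eye,
    "Multimodal": np.hstack([eeg, eye]),
}
for name, X in feature_sets.items():
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")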



Physiological and Perceptual Effects of Avatars' Muscularity while Rowing in Virtual Reality

Martin Kocur¹, Thomas Noack³, Valentin Schwind²

¹University of Applied Sciences Upper Austria; ²Frankfurt University of Applied Sciences; ³University of Regensburg

Virtual reality enables embodying different avatars. In what has been coined the Proteus effect, previous work found that the visual characteristics of an avatar can cause behavioral, attitudinal, and perceptual effects. Recent work suggests that avatars' muscularity can even have physiological effects while cycling in virtual reality. As these effects have not been replicated, it is, however, unclear how robust they are and whether they are limited to specific activities, such as cycling. Therefore, we conducted a study to understand whether avatars' muscularity also causes physiological and perceptual effects for other tasks and whether the effects can be replicated. 16 participants embodied a muscular and a non-muscular avatar while rowing on an indoor rower. We found that over time participants' heart rates increased significantly more slowly when embodying a muscular avatar compared to a non-muscular avatar. While not significant, descriptive statistics suggest the same trend for perceived exertion. Overall, the results confirm previous findings and support the conclusion that avatars can cause physiological effects for a range of physical activities.



UnitEye: Introducing a User-Friendly Plugin to Democratize Eye Tracking Technology in Unity Environments

Tobias Wagner, Mark Colley, Daniel Breckel, Michael Kösel, Enrico Rukzio

Ulm University

Eye tracking is a powerful tool for analyzing visual attention, for use as an input technique, and for diagnosing disorders. However, eye tracking hardware is expensive and not accessible to everyone, thus considerably limiting real-world usage and at-home evaluations. Although webcam-based eye tracking is feasible due to advances in computer vision, an open-source implementation as an easy-to-use tool is lacking. We implemented UnitEye, a Unity plugin enabling eye tracking on desktop and laptop computers. In a technical evaluation (N=12), we tested the precision and accuracy of our system compared to a state-of-the-art eye tracker. We also evaluated the usability of UnitEye with N=5 developers. The results confirm that our system provides reliable eye tracking performance for a webcam-based system and offers well-integrated features that contribute to ease of use.
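
For readers unfamiliar with the two metrics named in the technical evaluation, the sketch below shows one common way to compute accuracy (mean angular offset from the target) and precision (sample-to-sample RMS) for gaze data in degrees of visual angle. It is generic Python, not UnitEye code (UnitEye itself is a Unity plugin), and the function names and synthetic data are illustrative assumptions.

# Generic sketch of the evaluation metrics "accuracy" and "precision" for gaze
# samples expressed as (x, y) angles in degrees; not UnitEye's implementation.
import numpy as np

def accuracy_deg(gaze_angles, target_angles):
    """Accuracy: mean angular offset between gaze samples and the true target."""
    return float(np.mean(np.linalg.norm(gaze_angles - target_angles, axis=1)))

def precision_rms_deg(gaze_angles):
    """Precision: RMS of successive sample-to-sample angular distances."""
    diffs = np.diff(gaze_angles, axis=0)
    return float(np.sqrt(np.mean(np.sum(diffs ** 2, axis=1))))

# Example with synthetic samples scattered around a target at (0, 0) degrees.
samples = np.random.default_rng(1).normal(loc=[0.5, -0.2], scale=0.3, size=(100, 2))
print(accuracy_deg(samples, np.zeros((100, 2))), precision_rms_deg(samples))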



[Invited Talk] MobileGravity: Mobile Simulation of a High Range of Weight in Virtual Reality

Alexander Kalus, Johannes Klein, Tien-Julian Ho, Lee-Ann Seegets, Niels Henze

University of Regensburg, Germany

Simulating accurate weight forces in Virtual Reality (VR) is an unsolved challenge. Providing real weight sensations by transferring liquid mass has therefore emerged as a promising approach. However, key objectives conceptually interfere with each other: in particular, previous designs that support a high range of weight or a high flow rate lack mobility. In this work, we present MobileGravity, a system that decouples the weight-changing object from the liquid supply and the pump. It enables weight changes of up to 1 kg at a rate of 235 g/s and allows the user to walk around freely. Through a study with 30 participants, we show that the system enables users to perceive the weight of different virtual objects and enhances realism as well as enjoyment.
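
A quick back-of-the-envelope reading of the reported figures, assuming the 1 kg range and the 235 g/s rate refer to the same liquid-transfer mechanism: spanning the full weight range takes roughly 4.3 seconds, which is the illustrative point of the snippet below.

# Assumed reading of the reported figures: time to span the full weight range.
max_change_g = 1000      # maximum weight change in grams (1 kg)
flow_rate_g_per_s = 235  # reported transfer rate
print(max_change_g / flow_rate_g_per_s)  # ~4.26 s for the largest change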



[Remote] Statistical Analysis of Eye Movement Data for Beginners

Lisa Grabinger, Jürgen Horst Mottok

OTH Regensburg

Data processing and (statistical) data analysis are important tasks in empirical research. However, they present a particular hurdle for beginners. For one, they require knowledge of statistical methods, their prerequisites, and their use cases; for another, one needs either programming skills or a software system to carry out the analyses efficiently. Empirical eye tracking research poses a further hurdle: data from an eye tracker is processed more elaborately and usually merged with data from other sources (e.g., questionnaires). In this article, we take a closer look at the possibilities that prospective eye tracking researchers have on their way from data collection to publication-ready analysis. We show that there is currently no software system that allows valid statistical analyses of eye tracking data to be performed without prior knowledge, which means that prospective eye tracking researchers need to learn or be taught the basics before performing actual analyses. As a solution, we present a novel tool: eyenalyzer. It guides users through the analysis process even without prior knowledge and is therefore suitable for beginners. In the article, after highlighting the need for the tool, we discuss its development, give a glimpse of the user interface, and point out its contribution and future work.
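
To make the described workflow concrete, the Python sketch below shows the kind of step such a tool automates: merging per-participant fixation data with questionnaire data and running a standard paired test across conditions. The column names, data, and test choice are illustrative assumptions and do not reflect eyenalyzer's actual interface.

# Hedged sketch of a typical eye tracking analysis step: merge fixation data
# with questionnaire data, then compare conditions with a paired t-test.
import pandas as pd
from scipy import stats

fixations = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3],
    "condition": ["A", "B"] * 3,
    "fixation_duration_ms": [220, 310, 240, 290, 205, 330],
})
questionnaire = pd.DataFrame({"participant": [1, 2, 3], "expertise": [2, 5, 3]})

merged = fixations.merge(questionnaire, on="participant")
wide = merged.pivot(index="participant", columns="condition", values="fixation_duration_ms")
t, p = stats.ttest_rel(wide["A"], wide["B"])  # paired comparison across conditions
print(f"t = {t:.2f}, p = {p:.3f}")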



 