Ground Control: Leveraging the User's Spatial Position as an Input Modality in an Embodied Immersive Analysis Use Case

Abstract

Extended Reality (XR) has already enabled sophisticated implementations of immersive visualizations, providing a more intuitive and engaging way of analyzing data. Yet, user interaction with such immersive visualizations remains challenging, often relying on hand tracking or additional devices. We introduce a novel XR prototype that leverages the concept of embodied exploration, allowing users to interact with an exemplary visualization directly through their spatial position in the room relative to the displayed data. This approach eliminates the need for handheld controllers, offering a more intuitive engagement with the visualization. Our preliminary evaluation with twelve participants reveals a general preference for XR-based immersive visualizations over PC- and paper-based versions. We suggest further research into non-standard interaction and exploration modalities for XR data-analysis applications, which may open up new possibilities for engaging interactions with data.
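The core interaction idea, selecting data by where the user stands relative to it, could be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation; the anchor labels, coordinates, and activation radius are hypothetical.

```python
import math

def select_by_position(user_xy, data_anchors, radius=0.75):
    """Return the label of the data anchor the user stands closest to,
    or None if no anchor lies within the activation radius (metres).

    user_xy      -- the user's tracked (x, y) floor position
    data_anchors -- mapping of label -> (x, y) room position of a data region
    radius       -- hypothetical activation distance
    """
    best_label, best_dist = None, math.inf
    for label, (ax, ay) in data_anchors.items():
        d = math.hypot(user_xy[0] - ax, user_xy[1] - ay)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= radius else None

# Hypothetical room layout: two data regions anchored on the floor.
anchors = {"cluster_a": (0.0, 0.0), "cluster_b": (2.0, 1.0)}
print(select_by_position((0.2, 0.1), anchors))  # → cluster_a
print(select_by_position((5.0, 5.0), anchors))  # → None (too far from any anchor)
```

In a real prototype the position would come from the headset's tracking each frame, and the selection would drive which subset of the visualization is highlighted or detailed.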

Publication
In Proceedings of Mensch und Computer 2024 (MuC '24)
Katja Chen