Collaborating Across Realities: Analytical Lenses for Understanding Dyadic Collaboration in Transitional Interfaces
Best Paper
General
Publication type: Conference Paper
Published in: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023
Year: 2023/04/19
Place of publication: New York, NY, USA
Publisher: ACM
DOI: https://doi.org/10.1145/3544548.3580879
Authors
Niklas Peper
Anita Marie Hamurculu
Abstract
Transitional Interfaces are a yet underexplored, emerging class of cross-reality user interfaces that enable users to freely move along the reality-virtuality continuum during collaboration. To analyze and understand how such collaboration unfolds, we propose four analytical lenses derived from an exploratory study of transitional collaboration with 15 dyads. While solving a complex spatial optimization task, participants could freely switch between three contexts, each with different displays (desktop screens, tablet-based augmented reality, head-mounted virtual reality), input techniques (mouse, touch, handheld controllers), and visual representations (monoscopic and allocentric 2D/3D maps, stereoscopic egocentric views). Using the rich qualitative and quantitative data from our study, we evaluated participants’ perceptions of transitional collaboration and identified commonalities and differences between dyads. We then derived four lenses including metrics and visualizations to analyze key aspects of transitional collaboration: (1) place and distance, (2) temporal patterns, (3) group use of contexts, and (4) individual use of contexts.