DashSpace: A Live Collaborative Platform for Immersive and Ubiquitous Analytics
Research output: Contribution to journal › Article › peer-review
Electronic versions
Documents
- Borowsky-et-al-TVCG-2025
Accepted author manuscript, 4.39 MB, PDF document
Licence: CC BY
Abstract
We introduce DashSpace, a live collaborative immersive and ubiquitous analytics (IA/UA) platform designed for handheld and head-mounted Augmented/Extended Reality (AR/XR), implemented using WebXR and open standards. To bridge the gap between existing web-based visualizations and the immersive analytics setting, DashSpace supports rendering legacy D3 and Vega-Lite visualizations on 2D planes as well as extruding Vega-Lite specifications into 2.5D. It also supports fully 3D visual representations using the Optomancy grammar. To facilitate authoring new visualizations in immersive XR, the platform provides a visual authoring mechanism in which the user groups specification snippets to construct visualizations dynamically. The approach is fully persistent and collaborative, allowing multiple participants, whose presence is shown using 3D avatars and webcam feeds, to interact with the shared space synchronously, both co-located and remote. We present three examples of DashSpace in action: immersive data analysis in 3D space, synchronous collaboration, and immersive data presentations.
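The abstract describes driving immersive views from unmodified Vega-Lite specifications, either placed on a 2D plane or extruded into 2.5D. As a rough illustration only, the sketch below shows a standard Vega-Lite specification of the kind such a platform could consume; the commented-out `dashspace.addVisualization` call and its options are hypothetical and are not taken from the paper or its published API.

```typescript
// A minimal, standard Vega-Lite bar-chart specification. The spec itself
// follows the public Vega-Lite schema; everything DashSpace-specific below
// is an assumption for illustration.
const barChartSpec = {
  $schema: "https://vega.github.io/schema/vega-lite/v5.json",
  description: "Monthly sales shown as a simple bar chart",
  data: {
    values: [
      { month: "Jan", sales: 28 },
      { month: "Feb", sales: 55 },
      { month: "Mar", sales: 43 },
    ],
  },
  mark: "bar",
  encoding: {
    x: { field: "month", type: "ordinal" },
    y: { field: "sales", type: "quantitative" },
  },
};

// Hypothetical: hand the spec to the shared XR scene, either flat on a
// plane or extruded into 2.5D. The real DashSpace interface may differ;
// this only illustrates the idea of reusing a web-based specification
// inside an immersive, collaborative space.
// dashspace.addVisualization(barChartSpec, { placement: "plane", extrude: false });
```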
Keywords
- web-based technologies, collaborative visualization, augmented reality, extended reality
| Original language | English |
| --- | --- |
| Journal | IEEE Transactions on Visualization and Computer Graphics |
| Publication status | Accepted/In press - 22 Jan 2025 |