Immersive VR for Geospatial Visualization: A Spatial Data Explorer for Meta Quest

Screenshot preview of Immersive VR for Geospatial Visualization, a Spatial Data Explorer for Meta Quest

Introduction

A VR application for Meta Quest that lets urban planners and analysts explore layered geospatial data in 3D with hand-based controls, interactive UI, and real-world datasets.

Role

As the Lead UX & Interaction Designer and VR Developer on this project, I led the design and development of the user interface. I also built the Unity prototype's hand-based interactions and integrated the ArcGIS Maps SDK to ensure smooth transitions between scenes.

Course Name:
CMPT 496: Final Project
Timeline:
Jan 2025 – Apr 2025
Collaborator:
  • Keegan Vanstone

Skills:
  • UX Research & Testing

  • Interaction Design for VR

  • Visual Design

  • VR Development

  • Unity Scripting

Tech Stack:
  • C#

  • Unity

  • Unity DevOps

  • Meta XR Interaction Toolkit

  • ArcGIS Maps SDK for Unity

  • Figma

The Problem Space

Working with geospatial data often means parsing complex, layered information in tools designed primarily for flat-screen viewing. These tools are powerful but can be dense and cognitively demanding, and even simple tasks often require juggling several applications. For professionals like urban planners and analysts, making sense of spatial relationships across disconnected maps and dashboards isn't just inefficient; it can obscure critical insights. This environment presents unique challenges for clarity, context, and communication.

Challenges

  1. 2D views limit spatial understanding

    Traditional map and dashboard tools force users to interpret complex geospatial data on flat screens, limiting depth perception and spatial reasoning.

  2. Multiple tools required for a single workflow

    Analysts frequently switch between multiple applications or browser tabs to access different datasets, leading to disjointed analysis.

  3. Difficulty interpreting layers without context

    Zoning, demographics, and infrastructure layers are typically viewed in isolation, so professionals like urban planners, data scientists, and policymakers lack a shared 3D context for interpreting multi-layered geospatial information.

  4. Underutilization of Immersive Technologies

    While 3D and VR technologies have been explored in other domains, a gap exists in applying immersive VR environments specifically to geospatial datasets.

Problem Areas


The Solution

Rather than relying on static, screen-based tools, we introduced a fully immersive VR experience tailored for spatial data exploration. Built in Unity and optimized for the Meta Quest 3, our solution emphasizes intuitive interaction and deeper comprehension through:

  1. Immersive 3D Environment with Natural Navigation

    Our solution immerses users in interactive 3D environments with intuitive hand-based navigation, making spatial data exploration more natural and insightful.

  2. Interactive Layer Management

    Users can toggle visibility, adjust opacity, and enable or disable specific data overlays through an accessible, in-world UI panel.

  3. Reduced Cognitive Load Through Immersive Exploration

    By placing users directly inside a three-dimensional map, the system leverages kinesthetic and spatial learning strategies to make complex spatial relationships easier to understand.

  4. Integration with ArcGIS for Real-World Data Fidelity

    The app uses the ArcGIS Maps SDK to bring in high-accuracy geographic data, enabling stakeholders to analyze real-world information in a realistic and interactive context.

Demo (Coming Soon)


Research Process

My teammate and I reviewed existing research and defined user needs through persona development. This initial research clarified where traditional geospatial tools fall short, especially compared to more immersive technologies like VR.

User Statement

"As an urban planner, I want to intuitively explore layered geospatial data in a virtual environment to quickly understand spatial relationships and make informed planning decisions."

Personas

Portrait of Maya Rodriguez
Age:
35
Location:
Vancouver, BC
Occupation:
Municipal GIS Analyst

"I need to see how all the layers interact, such as elevation, zoning, and infrastructure, without flipping between ten different maps."

Goals:

  • Evaluate potential development zones with complete spatial context.
  • Compare zoning boundaries and building footprints in a single environment.
  • Communicate spatial insights clearly to non-technical stakeholders.

Pain Points:

  • Wastes time toggling between flat maps and layers in disconnected tools.
  • Struggles to convey spatial relationships to decision-makers using 2D diagrams.
  • Finds it difficult to explain spatial decisions to team members without visual aids.

Ideation & Design

Informed by our early research, technical constraints, and problem definition, I led the ideation and interface design process, focusing on translating spatial data interactions into an intuitive, immersive VR experience. My work centred on creating a flow that would feel natural to both VR newcomers and experienced users.

UI Flow

Idea Exploration

Wireframing


Implementation

To bring our concept to life, we transformed our early designs into a functional VR prototype using Unity and the ArcGIS Maps SDK. While my teammate focused on locomotion, SDK integration, and deployment, I designed and implemented the interactive UI and interaction scripting, ensuring a smooth and immersive user experience.

Development Process

  1. Interactive VR UI System

    I implemented the user interface, including toggles for data layer visibility, opacity sliders, and a toggleable legend panel. These tools allowed users to explore and manipulate geospatial data in real time.
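The UI wiring described above can be sketched as a single Unity component. This is a hypothetical reconstruction, not the project's actual code: the `layerRoot`, `legendPanel`, and material-alpha approach are assumptions about how a layer's visibility and opacity might be driven from standard UI events.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: binds a Toggle, a Slider, and a legend panel to one data layer.
public class LayerControlPanel : MonoBehaviour
{
    [SerializeField] private Toggle visibilityToggle;   // layer on/off
    [SerializeField] private Slider opacitySlider;      // 0..1 alpha
    [SerializeField] private GameObject layerRoot;      // root object of the layer's visuals
    [SerializeField] private CanvasGroup legendPanel;   // toggleable legend

    private void Awake()
    {
        visibilityToggle.onValueChanged.AddListener(visible => layerRoot.SetActive(visible));
        opacitySlider.onValueChanged.AddListener(SetLayerOpacity);
    }

    private void SetLayerOpacity(float alpha)
    {
        // Assumes the layer is rendered with materials whose color alpha is respected.
        foreach (var r in layerRoot.GetComponentsInChildren<Renderer>())
        {
            var c = r.material.color;
            c.a = alpha;
            r.material.color = c;
        }
    }

    public void ToggleLegend()
    {
        bool show = legendPanel.alpha < 0.5f;
        legendPanel.alpha = show ? 1f : 0f;
        legendPanel.interactable = show;
        legendPanel.blocksRaycasts = show;
    }
}
```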

  2. Interaction Scripting

    I contributed to scripting behaviours such as data layer switching and legend updates. I also played a role in adapting inherited code to fit our new data visualization goals.
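A legend that follows the active layer can be expressed as a small event handler. This is an illustrative sketch only; the `legendSprites` array and `OnLayerSwitched` callback are assumed names, not the project's actual API.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: swaps the legend image whenever the active data layer changes.
public class LegendUpdater : MonoBehaviour
{
    [SerializeField] private Image legendImage;
    [SerializeField] private Sprite[] legendSprites; // one legend per layer, same order as the layer list

    // Hooked up to whatever raises a layer-switch event (e.g. a UI dropdown or toggle group).
    public void OnLayerSwitched(int layerIndex)
    {
        if (layerIndex >= 0 && layerIndex < legendSprites.Length)
            legendImage.sprite = legendSprites[layerIndex];
    }
}
```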

  3. Scene-Based Data Exploration

    My teammate created the base environments using public datasets, while I ensured each city scene (Edmonton, Paris, Tokyo) supported seamless UI interaction and consistent user experience across locations.
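Switching between city scenes in Unity typically goes through `SceneManager`. A minimal sketch, assuming each city is a separately built scene named after the city:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hedged sketch: a button handler for jumping between city scenes.
// Scene names ("Edmonton", "Paris", "Tokyo") are assumptions about the build settings.
public class CitySwitcher : MonoBehaviour
{
    public void LoadCity(string sceneName)
    {
        // Async load avoids a long frame hitch inside the headset.
        SceneManager.LoadSceneAsync(sceneName, LoadSceneMode.Single);
    }
}
```

Loading asynchronously matters more in VR than on desktop, since a blocking load freezes head tracking and can cause discomfort.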

Legend Development

Challenges & Solutions

  1. ArcGIS Data Constraints

    The free ArcGIS account limited us to publicly available tile and scene layers; attempts to upload custom datasets failed due to hosting restrictions. Solution: We adapted by sourcing high-quality public datasets for our prototype, especially for cities like New York and Tokyo.

  2. UI Scripting & Layer Control

    Implementing runtime control of data layers via the UI required reverse-engineering some ArcGIS API code due to the lack of documentation. Solution: By analyzing the SDK scripts, we created custom functions to toggle visibility, adjust opacity, and switch data legends interactively.
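The resulting layer-control functions might look like the following. This is a hedged reconstruction: the `ArcGISMapComponent`, `Map.Layers`, `IsVisible`, and `Opacity` members reflect my understanding of the ArcGIS Maps SDK for Unity and may differ by SDK version, so treat the exact names as assumptions.

```csharp
using UnityEngine;
using Esri.ArcGISMapsSDK.Components;

// Hedged sketch: runtime visibility/opacity control over the map's layer collection.
public class ArcGISLayerController : MonoBehaviour
{
    [SerializeField] private ArcGISMapComponent mapComponent;

    public void SetLayerVisible(int index, bool visible)
    {
        // Layers is assumed to be an indexable collection on the map object.
        mapComponent.Map.Layers.At((ulong)index).IsVisible = visible;
    }

    public void SetLayerOpacity(int index, float opacity)
    {
        mapComponent.Map.Layers.At((ulong)index).Opacity = Mathf.Clamp01(opacity);
    }
}
```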

  3. Version Control Limitations

    Unity's large file sizes quickly exceeded GitHub's storage limits, complicating collaboration. Solution: We adopted Unity DevOps, which handled the project's large files and allowed smooth collaboration.

SDK Code Extracting


Testing & Feedback

To evaluate the usability and effectiveness of our VR prototype, we designed and conducted a structured user testing session. We aimed to assess how easily participants could navigate and understand spatial data in VR and whether the interactive controls were intuitive for all users.

Key Findings

  • Layer toggles and opacity controls were effective for exploring spatial data.
  • Scene switching between cities was smooth and well received.
  • Participants requested continuous turning instead of snap-turning for smoother navigation.
  • Ascend/descend controls were missing, limiting full vertical movement.
  • Scene boundaries were unclear, and some users accidentally flew below the map.

Final Prototype

Scenes (Edmonton, Paris, Tokyo, Rendered Buildings)

Data Layers

UI Components


Reflection

While we couldn't implement the changes suggested by our testing, we gained insight into how to improve both our design and development skills for VR. By embracing the creative and practical challenges of VR development, we delivered a functional prototype that pushed our skills and demonstrated the viability of spatial data exploration in virtual environments.

  1. Shifting focus early led to a stronger product direction

    Pivoting from abstract datasets to geospatial data mid-project allowed us to ground the experience in real-world use cases and better align with user expectations for spatial understanding.

  2. Working within technical constraints taught resourceful problem-solving

    Limitations from the ArcGIS SDK, Unity version control, and VR build compatibility forced us to explore alternate tools and workflows, building our confidence in navigating platform-specific challenges.

  3. Collaboration and clear task division accelerated progress

    Splitting responsibilities between UI/UX design and development helped us iterate faster and maintain a steady development pace over the semester.


Other Case Studies