XR Development Tools


Internal tools at Moving Labs
UX + interaction designer
Twelve weeks (part-time)


This project was part of an internal effort at Moving Labs to build our own tools for XR development. We hoped to address our own needs first and then release them for others to build on top of.

The Camera

Although the design was never produced, these high-fidelity mockups explore styling, typography, and overall proportions.

Requirements from the future

Since we were a new team and hadn't worked together on XR development projects before, we couldn't look to an existing workflow to help us define our requirements. Creating three rough proto-personas helped us group and prioritize features from our exploration according to audiences and their potential motivations.

  • Designers on our team who want to communicate with developers about UI implementation issues
  • Designers on our team who want to document an overall experience to present to the wider company or clients
  • Other XR creators using our tools to help build their own experiences

We focused on the first persona, where designers would pick up a GitHub ticket, evaluate the UI implementation in a build, and respond with notes and annotations. For an MVP, we scoped down to a camera that could save out images for manual posting to tickets, rather than a full pipeline in and out of GitHub:

  • Begins in first-person POV mode
  • Supports both POV and floating/tripod view modes
  • Lets the user review photos and videos on the device
  • Records both video and stills
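To make the MVP scope concrete, the requirements above could be sketched as a small state model. Everything here (the class and method names, the string stand-in for a saved image) is hypothetical and illustrative, not code from our actual build:

```python
from enum import Enum, auto

class ViewMode(Enum):
    POV = auto()        # first-person view, the default on startup
    FLOATING = auto()   # detached/tripod view

class MVPCamera:
    def __init__(self):
        self.view = ViewMode.POV   # begins in first-person POV mode
        self.saved = []            # images saved for manual posting to tickets

    def capture_still(self):
        self.saved.append("still")  # stand-in for writing an image file

    def review(self):
        return list(self.saved)     # user can review captures on the device
```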

Gathering references

Playing with existing cameras in VR applications helped us visualize possibilities and problems. One feature that stood out was the prevalence of a floating preview screen, which provided excellent feedback about the camera's state and made the object immediately legible as a camera. However, it could also get in the way.


I realized that our bug-reporting workflow would require my hands to be engaged with whatever UI elements I was documenting. I created a design that could collapse to a floating button, allowing for unobtrusive start/stop of video capture while operating the UI.

Given its simplicity, the minimized design also lent itself to being quickly grabbed and squeezed for hand-tracking input, much like a shutter release button. This was great for taking multiple stills in quick succession.
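The squeeze-to-capture behavior needs a small amount of debouncing so one sustained squeeze fires a single shot rather than one per frame. A minimal sketch of that logic, assuming a normalized 0–1 grip strength from the hand-tracking runtime (the thresholds and names are made up for illustration):

```python
SQUEEZE_THRESHOLD = 0.8   # normalized grip strength that triggers a capture
RELEASE_THRESHOLD = 0.5   # must relax below this before the next shot

class SqueezeShutter:
    def __init__(self):
        self.armed = True  # ready to fire on the next squeeze

    def update(self, grip_strength, capture_still):
        """Call once per frame with the current 0..1 grip strength."""
        if self.armed and grip_strength >= SQUEEZE_THRESHOLD:
            capture_still()
            self.armed = False  # wait for release before re-arming
        elif not self.armed and grip_strength <= RELEASE_THRESHOLD:
            self.armed = True
```

The two-threshold hysteresis is what makes a series of stills feel deliberate: each shot requires a distinct squeeze-and-release.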

We anticipated mostly needing a first-person view of the scene but also thought of some instances where walking through trigger volumes or other larger-scale interactions might need to be documented from an external view. I sketched a virtual drone (in the style of Snap's IRL Pixy product) that indicated the camera's position in 3D space and served as an affordance for repositioning it.

Gesture: pull closer to zoom

Creating and testing a Unity prototype of the camera interface led to the realization that users naturally bring the camera closer when trying to inspect recent captures. I did some quick motion sketches to illustrate how we might recognize this interaction and resize the preview pane within the interface to better suit that intent.
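The pull-closer recognition itself can be sketched as a distance check with hysteresis, assuming we can read the camera-to-headset distance each frame. The thresholds and scale values below are illustrative guesses, not measured ones:

```python
ENLARGE_DIST = 0.30   # meters: closer than this -> show the large preview
SHRINK_DIST = 0.40    # meters: farther than this -> return to normal size

def preview_scale(distance, currently_enlarged):
    """Return (enlarged, scale); the gap between thresholds avoids flicker."""
    if currently_enlarged:
        enlarged = distance < SHRINK_DIST
    else:
        enlarged = distance < ENLARGE_DIST
    return enlarged, (1.6 if enlarged else 1.0)
```

Using separate enlarge and shrink distances keeps the pane from rapidly toggling when the user holds the camera right at the boundary.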

I created this prototype with rough versions of the image capture and floating camera view features.

VRChat as a testing platform

I ran our first tests using VRChat as a quick, shareable prototyping platform. With VRChat, we could easily generate a link to share between team members and join a prototype space alone or together.

Having VRChat's prefabs and pickup system ready to go also helped us uncover new ideas through playful interaction, like the one below, where a follow camera tracks an avatar as they run around drawing lines with a pen.
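A follow camera like the one in that experiment typically eases toward a point offset from its target each frame rather than snapping to it. A minimal sketch of that smoothing step, with a made-up smoothing constant and offset (this is a generic exponential-smoothing approach, not VRChat's actual implementation):

```python
import math

def follow_step(cam_pos, target_pos, offset, smoothing=4.0, dt=1/60):
    """Move cam_pos a frame-rate-independent fraction toward target + offset."""
    alpha = 1.0 - math.exp(-smoothing * dt)  # blend factor scaled by frame time
    goal = [t + o for t, o in zip(target_pos, offset)]
    return [c + (g - c) * alpha for c, g in zip(cam_pos, goal)]
```

Called once per frame, this converges on the avatar's position plus the offset, which gives the trailing, slightly laggy feel that makes a follow camera read as playful rather than rigid.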