From Eagle Flight at Ubisoft to Meta Quest indie titles — I've been building immersive VR and MR experiences since the first consumer headsets shipped.
I have been building VR experiences professionally since 2016, when I worked on Eagle Flight — one of the first AAA launch titles for PlayStation VR. That project taught me the fundamentals of comfort-first VR design, high-performance rendering for headsets, and the unique challenges of testing and iterating on experiences that can only be evaluated with a headset on your head.
My VR development work covers the full stack: interaction system design (direct manipulation, ray casting, gaze-based UI), locomotion systems (teleportation, arm-swing, smooth locomotion with comfort vignette), physics-based interactions, spatial audio integration, and performance optimisation for the strict framerate requirements of VR (72, 90, or 120 fps targets depending on platform).
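The comfort vignette mentioned above is one of the simplest and most effective comfort techniques: narrow the player's field of view while they move smoothly, then fade it back out. A minimal Unity sketch of the idea, assuming a full-screen vignette material with a hypothetical `_Intensity` shader property:

```csharp
using UnityEngine;

// Sketch of a comfort vignette: narrow the visible field of view during
// smooth locomotion to reduce vection-induced discomfort. Assumes a
// full-screen vignette material exposing a hypothetical "_Intensity" float.
public class ComfortVignette : MonoBehaviour
{
    [SerializeField] private Material vignetteMaterial;   // hypothetical shader
    [SerializeField] private float maxIntensity = 0.7f;
    [SerializeField] private float fadeSpeed = 4f;        // intensity units/sec
    [SerializeField] private float speedThreshold = 0.5f; // m/s

    private Vector3 lastPosition;
    private float intensity;

    private void LateUpdate()
    {
        // Estimate rig speed from frame-to-frame displacement.
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        // Fade the vignette in while moving, out while stationary.
        float target = speed > speedThreshold ? maxIntensity : 0f;
        intensity = Mathf.MoveTowards(intensity, target, fadeSpeed * Time.deltaTime);
        vignetteMaterial.SetFloat("_Intensity", intensity);
    }
}
```

The thresholds and fade speed here are illustrative; in practice they are tuned per project through playtesting.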
I build for Meta Quest 2, 3, and Pro (standalone Android), PC VR via OpenXR (Steam, Meta Link), and PlayStation VR2. I use the Meta XR SDK (formerly Oculus Integration) and Unity's XR Interaction Toolkit depending on project requirements.
Meta Quest 3's full-colour passthrough opened Mixed Reality as a serious game design platform. I've worked with the Meta XR Scene Understanding API to build experiences that recognise and respond to the player's physical environment — spawning content on detected surfaces, implementing environment depth occlusion, and persisting spatial anchors across sessions.
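The surface-spawning pattern can be sketched in engine-agnostic terms. The `DetectedSurface` struct below is a hypothetical stand-in for the plane data the Scene Understanding API provides (the real types vary by Meta XR SDK version), but the placement logic is representative:

```csharp
using UnityEngine;

// Sketch: spawn content on a detected real-world surface. "DetectedSurface"
// is a hypothetical stand-in for the Scene Understanding API's plane data.
public struct DetectedSurface
{
    public Pose Pose;        // centre and orientation of the surface
    public Vector2 Extents;  // half-size in metres
}

public class SurfaceSpawner : MonoBehaviour
{
    [SerializeField] private GameObject contentPrefab;

    public void OnSurfaceDetected(DetectedSurface surface)
    {
        // Skip surfaces too small to hold the content.
        if (surface.Extents.x < 0.3f || surface.Extents.y < 0.3f)
            return;

        // Align content with the surface while keeping it upright.
        Instantiate(contentPrefab, surface.Pose.position,
                    Quaternion.LookRotation(surface.Pose.forward, Vector3.up));
    }
}
```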
MR design requires a different mindset than pure VR. You're designing for an environment you don't control, with real-world objects that interact with virtual content in ways you must anticipate. My approach to MR projects begins with a comfort and safety audit of the interaction design before a line of code is written, followed by extensive playtesting with users in diverse physical environments.
I'm experienced with passthrough rendering optimisation, dynamic lighting matching (adjusting virtual scene lighting to match detected real-world illumination), and the specific performance constraints of Quest 3's mixed reality layer.
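Lighting matching reduces the "pasted-on" look of virtual objects in passthrough. A minimal sketch, assuming some platform-specific source supplies a normalised brightness estimate (camera exposure, a sensor, or manual calibration — the input here is hypothetical):

```csharp
using UnityEngine;

// Sketch of dynamic lighting matching: drive the virtual scene's ambient
// and key light from an estimated real-world brightness value. The mapping
// ranges are illustrative, not calibrated values.
public class LightingMatcher : MonoBehaviour
{
    [SerializeField] private Light keyLight;
    [SerializeField] private float smoothing = 2f; // response speed

    // estimatedBrightness: 0 = dark room, 1 = bright daylight (hypothetical input).
    public void UpdateLighting(float estimatedBrightness)
    {
        float target = Mathf.Lerp(0.2f, 1.2f, estimatedBrightness);

        // Smooth the change so lighting never pops between frames.
        RenderSettings.ambientIntensity = Mathf.MoveTowards(
            RenderSettings.ambientIntensity, target, smoothing * Time.deltaTime);

        if (keyLight != null)
            keyLight.intensity = Mathf.Lerp(0.1f, 1.0f, estimatedBrightness);
    }
}
```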
VR has the most demanding performance requirements of any real-time platform. Missing your framerate target in a flat game causes dropped frames; missing it in VR causes nausea. The performance budget for a standalone Quest experience is brutally tight: roughly 13.9 ms of total frame time at 72 Hz (11.1 ms at 90 Hz) shared across all CPU and GPU work, tight draw call and triangle budgets per scene, and strict texture memory limits.
I offer dedicated VR performance audits for projects that are struggling to hit their target framerate. Using Unity's Frame Debugger, RenderDoc, and Meta's OVR Metrics Tool, I identify the specific bottlenecks — whether draw call overhead, shader complexity, overdraw, shadow rendering, or CPU-GPU synchronisation — and provide a prioritised remediation plan with effort and impact estimates for each item.
Common VR performance techniques I implement: GPU instancing, Single Pass Instanced rendering, fixed and eye-tracked foveated rendering (FFR/ETFR), dynamic resolution, occlusion culling tuning, and LOD system implementation for complex scenes.
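Dynamic resolution is a good example of how these techniques work in practice: shed render resolution when the frame runs over budget, recover it when there is headroom. A minimal Unity sketch using `XRSettings.renderViewportScale`; the 72 Hz budget and step sizes are illustrative, not tuned values:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of simple dynamic resolution: reduce the render viewport scale when
// frame time exceeds budget, recover it slowly when there is headroom.
public class DynamicResolution : MonoBehaviour
{
    private const float FrameBudgetMs = 1000f / 72f; // ~13.9 ms at 72 Hz
    private const float MinScale = 0.7f;

    private void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        float scale = XRSettings.renderViewportScale;

        if (frameMs > FrameBudgetMs * 1.05f)
            scale -= 0.05f;                  // over budget: shed resolution fast
        else if (frameMs < FrameBudgetMs * 0.85f)
            scale += 0.01f;                  // headroom: recover gradually

        XRSettings.renderViewportScale = Mathf.Clamp(scale, MinScale, 1f);
    }
}
```

Production systems usually drive this from GPU timing queries rather than raw delta time, and add hysteresis so the scale does not oscillate.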
Frequently Asked Questions
Do you work with the Meta XR SDK or the XR Interaction Toolkit?
Both, depending on project requirements. For Quest-specific projects where I want access to the full platform feature set (hand tracking, Scene Understanding, Spatial Anchors, passthrough controls), I use the Meta XR SDK. For projects targeting multiple OpenXR platforms, XR Interaction Toolkit provides better portability. I've shipped projects with both and can advise on the right choice for your scope.
Can you help with an existing VR project that has performance problems?
Yes. Performance audits are one of my most commonly requested engagements. I typically need access to the Unity project source and a Quest device for profiling. The audit deliverable is a written report with specific findings and a prioritised fix list, which you can then implement yourself or engage me to implement.
Do you have experience with hand tracking?
Yes. Hand tracking on Meta Quest is a powerful interaction modality but requires careful design — it's more latency-sensitive and less reliable than controller input in complex environments. I've built hand-tracking interaction systems for Quest and can advise on when to use it versus controllers, and how to design graceful fallbacks.
What is your experience with PC VR vs standalone?
Both. PC VR (via Steam, OpenXR) allows significantly higher graphical fidelity and more complex scenes. Standalone Quest development requires aggressive optimisation but reaches a much larger audience. Many projects benefit from a single codebase with two build targets — I can architect that from the start to avoid duplication.
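The single-codebase approach typically gates expensive features per platform at compile time. A minimal sketch using Unity's platform defines; the specific quality values are illustrative:

```csharp
using UnityEngine;

// Sketch of per-platform quality tiers in a single codebase: the same scenes
// ship on PC VR and standalone Quest, with costly settings gated by the
// Android platform define at compile time.
public static class PlatformQuality
{
    public static void Apply()
    {
#if UNITY_ANDROID
        // Standalone Quest: aggressive settings for mobile GPU.
        QualitySettings.shadowDistance = 15f;
        QualitySettings.pixelLightCount = 1;
#else
        // PC VR: fidelity headroom.
        QualitySettings.shadowDistance = 60f;
        QualitySettings.pixelLightCount = 4;
#endif
    }
}
```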
Last updated: March 2026