← Back

December 2025

Horizon Events — VR Experiences at Meta

Building the game engine and orchestration layer for immersive VR concerts on Meta's Horizon platform.

Horizon Events lets up to 1,000 people attend a live concert together in VR. The experience needs to feel seamless — lighting reacts to the music, effects fire on cue, and all of it has to stay synchronized across a thousand headsets with different network conditions.

The Rendering Engine

The core of the experience runs on a C++ game engine. I worked on the real-time lighting and visual effects systems — the parts that make a VR concert feel like a concert and not a tech demo. This meant writing rendering code that had to hit frame-rate targets on Quest hardware while supporting dynamic lighting changes driven by live audio.
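Audio-reactive lighting of the kind described above can be sketched in a few lines. This is an illustrative sketch only, not the engine's actual API: the `AudioFrame`, `LightState`, and `updateLighting` names are hypothetical, and the real system runs in C++ inside the render loop. The core idea is smoothing a raw audio level into a light intensity so the rig pulses with the music instead of flickering on every sample.

```typescript
// Hypothetical sketch of audio-driven lighting; names are illustrative.

interface AudioFrame {
  rms: number; // root-mean-square audio level for this frame, 0..1
}

interface LightState {
  intensity: number; // current light intensity, 0..1
}

// Exponentially smooth the raw level toward a boosted target so the
// lights track the beat without per-frame flicker.
function updateLighting(
  prev: LightState,
  frame: AudioFrame,
  smoothing = 0.8,
): LightState {
  const target = Math.min(1, frame.rms * 1.5); // lift quiet passages a bit
  return {
    intensity: prev.intensity * smoothing + target * (1 - smoothing),
  };
}
```

The smoothing factor is the kind of knob a creative team would tune per show rather than a fixed engine constant.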

The constraint that makes VR rendering different from traditional games: you're rendering two views (one per eye) at 72–90 Hz with a hard latency budget. Miss a frame and the user feels it physically. There's no room for hitches.
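The arithmetic behind that budget is worth making concrete. At a given refresh rate, everything (simulation, lighting, both eye views) has to fit in one frame interval:

```typescript
// Per-frame time budget at a given refresh rate, in milliseconds.
// At 72 Hz that's ~13.9 ms; at 90 Hz only ~11.1 ms, for both eyes.
function frameBudgetMs(refreshHz: number): number {
  return 1000 / refreshHz;
}

frameBudgetMs(72); // ≈ 13.9 ms
frameBudgetMs(90); // ≈ 11.1 ms
```

Note the budget shrinks as refresh rate rises: moving from 72 Hz to 90 Hz takes away almost 3 ms per frame while the workload stays the same.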

The Orchestration Layer

On top of the engine sits a TypeScript orchestration layer that coordinates the show. Think of it as the stage manager: it knows the setlist, triggers lighting cues, manages transitions between songs, and handles the state machine of a live event (pre-show, live, intermission, encore, post-show).
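The event state machine above can be sketched as a transition table. This is a minimal illustration, not the production code; the phase names come from the text, while `transitions` and `advance` are assumed helpers:

```typescript
// Illustrative sketch of the live-event state machine.
type ShowPhase = "pre-show" | "live" | "intermission" | "encore" | "post-show";

// Which phases each phase may legally move to. An encore, for example,
// can only lead into post-show.
const transitions: Record<ShowPhase, ShowPhase[]> = {
  "pre-show": ["live"],
  live: ["intermission", "encore", "post-show"],
  intermission: ["live"],
  encore: ["post-show"],
  "post-show": [],
};

// Reject any transition not in the table, so a scripting mistake by a
// show author fails loudly instead of desynchronizing clients.
function advance(current: ShowPhase, next: ShowPhase): ShowPhase {
  if (!transitions[current].includes(next)) {
    throw new Error(`Illegal transition: ${current} -> ${next}`);
  }
  return next;
}
```

Encoding the legal transitions as data rather than scattered `if` checks is what makes the machine easy to audit and to expose safely to non-engineers.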

The challenge was making this scriptable enough that a creative team could author experiences without touching C++, while keeping it deterministic enough that 1,000 clients all see the same thing at the same time.
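One common way to get that determinism (a sketch under assumptions, not necessarily how Horizon Events does it) is to make cues pure data evaluated against a shared show clock: every client runs the same cue list against the same timeline, so what each headset sees depends only on the clock, not on when network messages happen to arrive.

```typescript
// Sketch: deterministic cue evaluation against a shared show clock.
// `Cue` and `dueCues` are hypothetical names for illustration.

interface Cue {
  timeMs: number; // offset from show start
  action: string; // e.g. "lighting:strobe" — an opaque action id
}

// Return the cues that fire in the half-open window (prevMs, nowMs].
// Every client calling this with the same clock values gets the same
// cues in the same order, which is the determinism property we want.
function dueCues(cues: Cue[], prevMs: number, nowMs: number): Cue[] {
  return cues
    .filter((c) => c.timeMs > prevMs && c.timeMs <= nowMs)
    .sort((a, b) => a.timeMs - b.timeMs);
}
```

The half-open window matters: it guarantees each cue fires exactly once even if clients poll the clock at different rates.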

What I Took Away

This was my first time working at the intersection of systems engineering and creative tooling. The technical problems (frame budgets, synchronization, state management) were familiar. What was new was designing APIs for non-engineers — making the system expressive enough for creative intent while keeping the implementation deterministic.