Magic Forest — The First LiDAR AR Experience on Snapchat

Designing for a technology that didn't exist in users' hands yet

The context

When Apple was preparing to announce the first iPhone with LiDAR, they reached out to Snap. I was assigned as the experience lead, partnering closely with the engineering team. The brief was simple in scope and brutal in timeline: design and ship a flagship AR experience that demonstrated what LiDAR could do, in approximately six weeks.

Two design challenges

The initialization window

On that first LiDAR-equipped iPhone, the sensor took 3–5 seconds to initialize: to scan the environment and return usable surface data. That's a long time to show a user nothing. The experience needed to be designed around that wait, not despite it.

The complexity

This was unlike anything we had built before. The experience used LiDAR surface detection, ARKit surface classification across multiple object classes, body tracking, tap interactions, procedural shader-based grass and sky, and a full cast of 3D assets — each with its own reveal animation — all packed into an 8MB file. We had never used that many distinct assets simultaneously, each requiring 3D modeling, texturing, shading, and animation.

The concept

We developed three directions and presented them to Apple. Magic Forest was my recommendation — and Apple's pick.

The concept solved the initialization problem elegantly: hummingbirds. During the 3–5 second LiDAR initialization window, virtual hummingbirds hover near the camera — alive, present, something to look at. As LiDAR data arrives and surfaces are detected, the birds fly to those surfaces and reveal AR plants where they land. The wait becomes part of the experience. The birds are the bridge.
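
A minimal sketch of that bridge, for readers who think in code. It is written against raw ARKit for legibility (the shipped Lens was built with Snap's own tooling, so treat this purely as a state-flow sketch), and Hummingbird and PlantSpawner are hypothetical stand-ins:

```swift
import ARKit

// Hummingbird and PlantSpawner are hypothetical stand-ins; the shipped Lens
// was not built on raw ARKit. This only illustrates the initialization flow.
protocol Hummingbird: AnyObject {
    var isIdle: Bool { get }
    func hoverNearCamera()
    func fly(to transform: simd_float4x4, arrival: @escaping () -> Void)
}

protocol PlantSpawner {
    func revealPlant(on plane: ARPlaneAnchor)
}

final class ForestSession: NSObject, ARSessionDelegate {
    private let birds: [Hummingbird]
    private let spawner: PlantSpawner

    init(birds: [Hummingbird], spawner: PlantSpawner) {
        self.birds = birds
        self.spawner = spawner
        super.init()
        // During the 3-5 second LiDAR warm-up there is no surface data yet,
        // so every bird hovers near the camera: something alive to watch.
        birds.forEach { $0.hoverNearCamera() }
    }

    // The first detected planes end the wait: an idle bird flies to each new
    // surface and reveals a plant where it lands.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            guard let bird = birds.first(where: { $0.isIdle }) else { return }
            bird.fly(to: plane.transform) { [weak self] in
                self?.spawner.revealPlant(on: plane)
            }
        }
    }
}
```

The point is the ordering: the birds exist before any surface data does, and the first batch of detected surfaces is what retargets them.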

What made this work beyond the visuals: we never needed to interrupt the experience with onboarding text or instructional dialogs telling users to look around and explore. The birds did that job naturally. Users followed the birds' flight with their camera, scanning more of the environment and spawning more content around them. The instruction was built into the design.

Surface classification from ARKit drove what appeared where: grass and lilies on floors, succulents on tables, different plants on chairs, vines climbing walls, a night sky covering ceilings. When a person entered the frame, a hummingbird would fly to them and hover beside their shoulder. Tapping any surface revealed a flower — surface-normal-aware, growing upward from horizontal surfaces, outward from vertical ones.
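
The mapping is easy to picture against ARKit's ARPlaneAnchor.Classification, which really does distinguish floors, tables, seats, walls, doors, windows, and ceilings. A hedged sketch with illustrative asset names, the tap interaction included:

```swift
import ARKit
import SceneKit
import UIKit

// Classification-to-content mapping, using ARKit's real
// ARPlaneAnchor.Classification cases. Asset names are illustrative.
func assetName(for plane: ARPlaneAnchor) -> String {
    switch plane.classification {
    case .floor:                return "grass_and_lilies"
    case .table:                return "succulents"
    case .seat:                 return "chair_plants"
    case .wall, .door, .window: return "climbing_vines"
    case .ceiling:              return "night_sky"
    default:                    return "generic_flower" // unclassified surfaces
    }
}

// Tap-to-reveal: growth direction comes from the hit transform, whose local
// y-axis follows the surface normal, so the same "grow along local up"
// animation works on floors (upward) and walls (outward).
func handleTap(at point: CGPoint, in view: ARSCNView) {
    guard let query = view.raycastQuery(from: point,
                                        allowing: .existingPlaneGeometry,
                                        alignment: .any),
          let hit = view.session.raycast(query).first else { return }

    let flower = SCNNode() // stand-in for the flower asset
    flower.simdWorldTransform = hit.worldTransform
    view.scene.rootNode.addChildNode(flower)
}
```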

Magic Forest experience — in use

How we built it

Given the timeline and complexity, I structured development into two parallel streams. The engineering team began building the spawning logic, bird behavior, and interaction systems using simple primitives — boxes standing in for the final assets. Simultaneously, the 3D and VFX teams crafted the actual assets. The two streams converged progressively as assets were integrated into the live build.
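
One way to see how the streams stayed independent: the spawn logic was written against a small asset interface, so a gray box and a finished model are interchangeable. A sketch in SceneKit terms, all names hypothetical:

```swift
import Foundation
import SceneKit

// The seam between the streams: spawn logic depends only on this protocol,
// so engineering could run against boxes while 3D/VFX assets were in flight.
protocol ForestAsset {
    func makeNode() -> SCNNode
    func playReveal(on node: SCNNode)
}

// Week-one stand-in: a plain box with an instant reveal.
struct PrimitiveAsset: ForestAsset {
    func makeNode() -> SCNNode {
        SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                 length: 0.1, chamferRadius: 0))
    }
    func playReveal(on node: SCNNode) { node.isHidden = false }
}

// Final asset: modeled, textured, with its authored reveal animation.
struct ModeledAsset: ForestAsset {
    let sceneURL: URL
    func makeNode() -> SCNNode {
        (try? SCNScene(url: sceneURL, options: nil))?.rootNode.clone() ?? SCNNode()
    }
    func playReveal(on node: SCNNode) {
        node.scale = SCNVector3(0.01, 0.01, 0.01)
        node.runAction(.scale(to: 1.0, duration: 0.6)) // grow-in reveal
    }
}
```

In this framing, integrating an asset means swapping one conformance for another, which is roughly how the two streams converged without rework.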

The grass and night sky were technically novel for us at the time: procedural, shader-based, and animated. Kurt Kaminski from the engineering team built the shaders, producing grass with realistic movement and a night sky whose feathered edges made it feel like part of the scene. The optimization work was equally impressive: the entire experience, all assets and all logic, shipped in an 8MB file.
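
To make the technique concrete without claiming this is the shipped shader: a vertex-sway grass modifier in SceneKit's shader-modifier form, under the assumptions stated in the comments:

```swift
import SceneKit

// Not the shipped shader: a minimal geometry shader-modifier showing the
// general vertex-sway idea behind procedural grass. u_time is SceneKit's
// built-in clock uniform; the texcoord-v mask (0 at a blade's root, 1 at
// its tip) is an assumed asset convention so tips sway more than roots.
let grassSway = """
float tip   = _geometry.texcoords[0].y;                    // root = 0, tip = 1
float phase = _geometry.position.x * 4.0 + u_time * 1.8;   // desync blades
_geometry.position.x += sin(phase) * 0.03 * tip;           // bend in the wind
"""

let grassMaterial = SCNMaterial()
grassMaterial.shaderModifiers = [.geometry: grassSway]
```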

Throughout the build, we had weekly syncs with the Apple team where we were expected to bring working builds. The experience evolved in public, week by week.

Parallel build streams — engineering primitives + 3D/VFX assets

The outcome

Magic Forest was featured in Apple's keynote at the iPhone 12 launch, seen by 61 million viewers on YouTube alone. It was a landmark moment for Snap's AR platform and for LiDAR as a consumer technology.

Years later, it still holds up. Snap has shipped many LiDAR experiences since — but none have pushed the technology further than this one did at launch. I'm still proud of it.

Apple keynote feature — iPhone 12 launch