Apple’s iPhone 12 Pro LiDAR

A small feature with big potential for customer experience

Stink Studios
4 min read · Oct 16, 2020

Matt Greenhalgh, Executive Technical Director

iPhone 12 Pro LiDAR Sensor [Image credit: Apple]

If you were following along with Apple’s recent announcements, you may not have picked up on the significance of one of the new features of the iPhone 12 Pro. Billed primarily as an enhancement to its photographic capabilities, LiDAR will also dramatically enhance Augmented Reality capabilities on the device.

LiDAR stands for ‘Light Detection And Ranging’. It is a sensor that measures the time it takes for light rays to be transmitted from the device and reflected back to the sensor. The tiny differences in round-trip time, resulting from rays travelling different distances, can then be used to calculate the depth of the scene around the device, effectively building a 3D model of the surrounding geometry in real time.
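The underlying arithmetic is simple: a pulse travels out to a surface and back, so the distance is half the round trip. A minimal sketch in Swift (the round-trip time here is an illustrative value, not real sensor output):

```swift
// Depth from time of flight: the pulse travels to the surface and back,
// so distance = (speed of light × round-trip time) / 2.
let speedOfLight = 299_792_458.0   // metres per second
let roundTripTime = 6.67e-9        // seconds; illustrative value (~1 m away)

let depth = speedOfLight * roundTripTime / 2.0
print(String(format: "Surface is %.3f m away", depth))  // ≈ 1.000 m
```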

LiDAR is not new technology. It first saw military use in the early 1960s for targeting. It came to public attention during the Apollo 15 mission, where it was used to map the surface of the Moon. Today it can be found in everything from self-driving cars to automated vacuum cleaners. It’s also not new to computing devices or smartphones. There is in fact a family of similar technologies that allow for depth-mapping a scene. Microsoft’s Kinect is perhaps the best known of these, but it, and similar devices such as Intel’s RealSense cameras, have existed for a decade now. Google has also played a part through its Project Tango initiative, which stalled and ultimately morphed into ARCore, its alternative to Apple’s ARKit. Companies like 6D.AI (now part of Niantic) have driven incredible advances in software-only world mapping.

At Stink Studios we’re also no strangers to the benefits of LiDAR. We used a professional LiDAR scanner to scan the interior of Abbey Road Studios for Google, faithfully capturing every nook and cranny of the famous building. The resulting point cloud (a vast collection of points in space marking the location and colour of each reflected laser ray) was used to generate a centimetre-accurate virtual recreation of each studio space. The LiDAR scanner we used stood shoulder-height to its operator. Apple has now managed to shrink this capability down to a format we can carry around in our pocket.

Our raw point cloud data generated from a LiDAR scan of Abbey Road Studios
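In code, a point cloud has a very simple shape. As an illustrative sketch (these type names are our own invention, not part of any scanner SDK):

```swift
import simd

// Each laser return becomes one sample: a position in space and the
// colour measured at that point. A scan is simply millions of these.
struct PointSample {
    var position: SIMD3<Float>  // x, y, z in metres
    var color: SIMD3<Float>     // r, g, b in the range 0...1
}

typealias PointCloud = [PointSample]
```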

And here’s the significance: everyone who can afford the iPhone 12 Pro’s price tag now has the ability to map their immediate surroundings with incredible accuracy. This dramatically increases the potential for Augmented Reality technologies to layer new content, in more realistic ways, into the world that surrounds us.

At Stink Studios we’ve been leading the field of Augmented Reality innovation through our partnership with Instagram as a Spark AR Beta launch partner. Instagram, like Snapchat, is best known for the thousands of face-filter effects that grace Instagram Stories. Fun, but often frivolous, these effects have already explored the limits of the ways in which we can adorn people’s faces. A lesser-known, and less-explored, usage scenario is back-camera-based filters that project AR content into the world that surrounds you.

There’s a reason these back-camera effects have been less popular. The front-camera, selfie-based effects have very accurate 3D model data of the user’s face to work with. This 3D mesh, which follows the changing expressions and movements of the user’s head, allows textures and effects to be applied accurately to the surface of the face. To date, native smartphone capabilities for back-camera information about the world have been limited to ‘I can see a flat horizontal plane’ and, more recently, ‘I can see a vertical plane’, and that’s about it. As a result, world-based effects have had to use a lot of guesswork when placing objects into the world.
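In ARKit terms, that limited world understanding boils down to a plane-detection option on the session configuration. A minimal sketch (the ARView here is a placeholder; in a real app it would live in your view hierarchy):

```swift
import ARKit
import RealityKit

let arView = ARView(frame: .zero)

// Pre-LiDAR world tracking: ask ARKit to find flat surfaces, and that's
// essentially all the geometry the back camera can report.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
arView.session.run(configuration)
```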

By way of example, in May this year we recreated Lara-Jean’s much-loved bedroom from the Netflix film ‘To All The Boys I’ve Loved Before’. We used this environment as a virtual dressing room in which to place the H&M clothing launched to tie in with the release of the film’s sequel: an opportunity to see the clothes in an inspiring environment just when getting to the shops was impossible. Our recreation was a faithful remodelling of the set from the film, but we faced a challenge launching it in people’s environments. We knew nothing about the space they were in, so we couldn’t allow them to walk around the room naturally; they might have collided with real-world walls or furniture. Instead we allowed them to move a virtual camera around the room, a natural-feeling but still less-than-ideal solution.

Our AR recreation of Lara-Jean’s bedroom from ‘To All The Boys I’ve Loved Before’ for H&M

With the new LiDAR capabilities we could not only have determined exactly how big the user’s environment was, but also where potential collision objects were, in order to guide placement. But we could have gone one step further still: rather than bringing an identical static model into the user’s world, we could have repainted their walls to match Lara-Jean’s, placed the clothes and props from the set around on their surfaces, even replaced their chairs with models from the set. An altogether more magical transformation.
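On LiDAR-equipped devices, ARKit exposes this as scene reconstruction: the sensor’s depth data becomes a live mesh of the room that virtual content can be occluded by and collide with. A sketch, under the same placeholder-ARView assumption as above:

```swift
import ARKit
import RealityKit

let arView = ARView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction is only supported on LiDAR-equipped devices.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}
arView.session.run(configuration)

// Let the real-world mesh hide virtual objects and act as a physics body,
// so a virtual chair rests against a real wall rather than passing through it.
arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics])
```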

We’re hugely excited about the potential of this new LiDAR technology, and as one of only a handful of agencies and creators enrolled in the new Spark Partner Network with Facebook, we look forward to continuing to lead the field of AR innovation. It will take some time for the premium capabilities of the iPhone 12 Pro’s LiDAR to filter down to mass-market adoption, but it is now a question of when, not if. With retail and experiential consumer engagements facing an uncertain future, the ability to bring your product to the user, albeit virtually, has never been more significant.

If you’d like to talk to us about creating an AR experience, you can contact us at sparkar@stinkstudios.com

