Augmenting Auxiliary

A technical deep-dive on how Stink Studios worked with premium sneaker brand Auxiliary in partnership with Selfridges to create a first-of-its-kind retail experience using Spark AR

Stink Studios
11 min read · Oct 11, 2019

Matt Greenhalgh, Technical Director

If you haven’t seen the experience, you can watch the film.

Just as the High Street seeks to respond to its much-publicised decline, so the new wave of AR technologies seeks usage models that will take it out of its current niche of fun, pervasive, but often frivolous face filters. The pairing represents a potentially lucrative marriage that could re-engage jaded high-street shoppers and Gen Z digital natives.

Augmented Reality, previously written off as a fun but high-barrier-to-entry novelty, is enjoying a renaissance. This is thanks, in no small part, to the Spark AR platform from Facebook. It is available on any device with either Instagram or Facebook installed, which makes it almost ubiquitous. Suddenly AR experiences can be initialised and experienced in a matter of seconds by pretty much anyone with a phone in their hand. Added to which, its IDE abstracts much of the complexity of face and target tracking, leaving developers to concentrate on the creative process rather than wrestling with engineering fundamentals.

As an official Spark AR production partner we were invited to this year’s F8 conference. With Facebook’s highlighted AR growth area of ‘Retail’ very much in mind, we were delighted to be approached by Auxiliary to create an AR experience in-store at one of London’s premium shopping destinations, Selfridges.

Auxiliary’s aesthetic inspiration lies in the analogue world of 20th-century hi-fi separates, Walkmans and Technics record decks. With a nod to the brand’s tagline — ‘Think inside the box’ — we imagined an AR experience that would allow customers to glimpse inside a box full of these inspirations and discover them in surprising, even unsettling, ways. We wanted the experience to transition seamlessly from the physical form of the display table, with its full complement of Auxiliary sneakers, into an expanded world that interacted with the surface as though the two were one continuous structure. To do so we needed to find a way to create the kind of animations you might associate with a native Unity or Unreal Engine app, but within the 2MB-per-platform asset file-size limit Spark AR imposes.

3D modelling and texturing

Work began by modelling Auxiliary’s full 2019 range of sneakers, eight in total. We wanted the 3D models to do justice to the craftsmanship of their physical counterparts, but also to work within the pared-back graphical styling we had in mind for the experience. We were fortunate inasmuch as the sneakers all shared the same sole unit, which meant we could share a single geometry instance across the range. There were also only four variations in upper shape, with individual character coming from the materials used. This meant we could use textures in place of geometry to convey much of each sneaker’s detail.

The fully textured Anti-Skate model

Texturing was undertaken in Substance Painter, with its material library used as a base to allow rapid texturing of each shoe. We intended to take advantage of Spark AR’s physically-based material and rendering capabilities, so we created custom exporters for the PBR textures, pre-packing Occlusion-Roughness-Metallic data into the RGB channels of a single texture, ready for ingestion by Spark AR.
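For illustration, here is a minimal sketch of that channel-packing step in Python using Pillow; the file names and texture size are ours for the example, not the production pipeline’s:

```python
from PIL import Image

def pack_orm(occlusion_path, roughness_path, metallic_path, out_path, size=(1024, 1024)):
    """Pack three grayscale PBR maps into one RGB texture:
    R = ambient occlusion, G = roughness, B = metallic."""
    occlusion = Image.open(occlusion_path).convert("L").resize(size)
    roughness = Image.open(roughness_path).convert("L").resize(size)
    metallic = Image.open(metallic_path).convert("L").resize(size)
    Image.merge("RGB", (occlusion, roughness, metallic)).save(out_path)

# Illustrative usage with hypothetical file names:
pack_orm("anti_skate_ao.png", "anti_skate_roughness.png",
         "anti_skate_metallic.png", "anti_skate_orm.png")
```

Packing three single-channel maps into one RGB image means one texture fetch in the shader and one asset against the file-size budget instead of three.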

In addition to the shoes, we knew we wanted to create a number of animated 3D elements to represent the inspirations within the AR scene. However, with the shoe models and textures consuming the lion’s share of the available file-size budget, we had to find inventive ways to animate the remaining 3D assets. Exporting baked animation was not going to be possible, so wherever we could we exported static assets only and animated them within Spark AR itself.

One example was the audio cable that snakes out of the gloom at the base of ‘the trench’ (as it came to be known) before looming up in front of people’s eyes and plugging into the side of the virtual walls. We baked the motion path of the jack element as an animated FBX, but the trailing cable was actually a static asset. We carefully unwrapped its UVs so they ran left to right with proportional spacing and then animated the texture in Spark AR Studio so that it moved from left to right across the UV map. By matching the animation easing curves in Spark AR Studio with the exported jack’s easing curves in Blender, we were able to create the effect of the cable animating through space, but at a fraction of the file-size cost.
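To make the technique concrete, here is a hedged sketch of the maths: a single easing curve drives both the baked jack animation and the cable texture’s horizontal UV offset, so the two stay in sync. The curve choice and duration below are illustrative, not the production values.

```python
def ease_in_out_cubic(t: float) -> float:
    """Cubic ease-in-out over t in [0, 1]."""
    return 4 * t**3 if t < 0.5 else 1 - (-2 * t + 2) ** 3 / 2

def cable_uv_offset(time_s: float, duration_s: float = 2.0) -> float:
    """Horizontal UV offset for the cable texture. Because the cable's UVs
    run left to right with proportional spacing, sliding the texture along
    U reads as the cable itself travelling through space."""
    t = min(max(time_s / duration_s, 0.0), 1.0)
    return ease_in_out_cubic(t)  # 0.0 = fully hidden, 1.0 = fully emerged

# The baked jack animation uses the same curve, so both elements
# arrive at their end positions together.
```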

Spark AR Physical Lighting

One challenge that came to light during early testing of the sneakers’ materials was a curious bug in Spark AR’s lighting implementation. Spark AR is in constant, ongoing development. It sees a new feature release on a fortnightly basis and, as such, is effectively in open beta. The pace of development is impressive, but bugs are an expected feature from time to time. This one was fairly significant, however…

PBR materials generally depend on environment maps to get the best results. These are images that capture surrounding lighting information, typically stored as cube maps or equirectangular projections. To account for the reflective behaviour of both metallic and non-metallic (dielectric) objects, these reflection maps need to be stored in two formats. The first is a Specular Radiance map, which accounts for the glossy specular reflections you see in metallic and polished non-metallic materials. It is usually stored at multiple levels of resolution and blurriness in a cascading chain of images called a mipmap, to allow for the appearance of differing levels of roughness in materials.

The second format is a Diffuse Irradiance map, which accounts for the total contribution of light hitting a surface point, based on a sampling hemisphere oriented about the surface’s normal (a vector perpendicular to the surface).
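In practice, a renderer picks a mip level of the Specular Radiance cascade from the material’s roughness. A common linear mapping (the exact constants any given engine uses are an assumption here) looks like this:

```python
def radiance_mip_for_roughness(roughness: float, mip_count: int) -> float:
    """Map a [0, 1] roughness value to a (fractional) mip level of the
    specular radiance cascade. Rougher surfaces sample blurrier mips;
    the renderer interpolates between the two nearest levels."""
    return max(0.0, min(1.0, roughness)) * (mip_count - 1)

print(radiance_mip_for_roughness(0.25, 8))  # 1.75: between mips 1 and 2
```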

OK, so: two textures that work together in a physically-based rendering context to simulate light’s interaction with a surface’s material properties. Except that, at the time of writing and for the duration of the project, they didn’t work together in Spark AR. The two textures’ orientations were misaligned internally, resulting in lighting that appeared to come from different directions depending on the object’s material!

Debugging Spark AR’s Physical Material Environment Reflection, note the Specular and Diffuse inversion

Fixing this shortcoming in Spark AR was no easy task, but the development environment — Spark AR Studio — provides enough low-level shader access to attempt a fix. We built a completely custom Image Based Lighting solution entirely in Spark AR’s Patch Graph to replace its internal lighting model. The result remained performant while correcting the inverted lighting, and meant we could take advantage of the PBR materials we had created for the project. We keenly await an official fix for the bug, however.

Our Custom Image Based Lighting Patch
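The patch graph itself is hard to convey in prose, but the logic it implements is standard image-based lighting. Here is a rough per-pixel sketch in Python; the Fresnel approximation and the sampling callbacks stand in for the equivalent patches, and the constants are illustrative rather than a transcription of our graph:

```python
def fresnel_schlick(cos_theta: float, f0: float) -> float:
    """Schlick's approximation to the Fresnel reflectance."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def ibl_shade(albedo, metallic, roughness, n_dot_v, irradiance, radiance_at):
    """Combine the two environment maps for one surface point (per channel).
    `irradiance` is the diffuse irradiance sample along the normal;
    `radiance_at(mip)` samples the specular cascade at a mip level."""
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic  # dielectric vs metal base reflectance
    ks = fresnel_schlick(n_dot_v, f0)                 # specular weight
    kd = (1.0 - ks) * (1.0 - metallic)                # energy left for diffuse
    diffuse = kd * albedo * irradiance
    specular = ks * radiance_at(roughness * 7)        # assuming an 8-level cascade
    return diffuse + specular
```

Crucially, both samples must be taken in the same orientation, which is exactly the invariant the internal lighting model was violating.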

Signed Distance Fields

With the exception of the lighting model, the other features of Spark AR’s IDE allowed for rapid prototyping and development. One great feature we were happy to take advantage of was its support for 2D Signed Distance Fields.

Signed Distance Fields are mathematical descriptions of geometric primitives: shapes like ellipses, rectangles and stars. They can be combined, blended or repeated in ways that are very easy to define and efficient to render. We were able to take advantage of them in a number of areas where typically we would have had to use costly textures or animated PNG sequences. They allowed us to limit the textures for the entire project to just the sneakers’ materials and some single text instances for the cascading text effects.

The entire Table Surface animation
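To see why SDFs are so cheap in file-size terms, consider that each primitive is just a function from a point to a distance, and combining shapes is a one-liner. A minimal 2D sketch, using well-known formulations:

```python
import math

def sd_circle(px: float, py: float, r: float) -> float:
    """Signed distance from point (px, py) to a circle of radius r at the origin."""
    return math.hypot(px, py) - r

def sd_box(px: float, py: float, bx: float, by: float) -> float:
    """Signed distance to an axis-aligned box with half-extents (bx, by)."""
    dx, dy = abs(px) - bx, abs(py) - by
    outside = math.hypot(max(dx, 0.0), max(dy, 0.0))
    inside = min(max(dx, dy), 0.0)
    return outside + inside

def smooth_union(d1: float, d2: float, k: float) -> float:
    """Blend two distance fields; k controls how soft the join is."""
    h = max(k - abs(d1 - d2), 0.0) / k
    return min(d1, d2) - h * h * k * 0.25

# In a fragment shader, step() or smoothstep() applied to the distance
# turns these fields into crisp, resolution-independent shapes.
```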

Text animation

The experience features a number of animated layers of text wrapping and snaking around its virtual walls, creating a striking display in the shoe hall. Spark AR includes support for both 3D and 2D text, but neither native implementation was especially suited to the kind of multi-line animation we wanted to create. We instead opted to use a single texture instance of the word and a custom UV animation, scaling and repetition patch to create the desired effect. This was then composited into the other elements on the wall surface to create a single combined material that could be applied to just one copy of the wall’s geometry, rather than requiring multiple layered copies of the model.

Cascading text using UV offsets and animation
The Patch Graph used to achieve the effect
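The underlying UV arithmetic is simple enough to sketch. Assuming a texture holding a single word, the idea is to split the wall’s V axis into rows and repeat the word along U with a per-row phase offset; the function names and stagger value below are illustrative, not read from our patch graph:

```python
import math

def fract(x: float) -> float:
    return x - math.floor(x)

def cascading_text_uv(u, v, rows, scroll, row_stagger=0.15):
    """Map a fragment's wall-surface UV to a coordinate inside the single
    word texture. The V axis is split into `rows` bands; each band repeats
    the word along U with a per-row phase offset, so animating `scroll`
    produces the cascading effect from one small texture."""
    row = math.floor(v * rows)                       # which line of text we are on
    local_v = fract(v * rows)                        # V within that line's band
    local_u = fract(u + scroll + row * row_stagger)  # staggered horizontal repeat
    return local_u, local_v
```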

One downside of using a texture for the text elements was Spark AR’s texture resolution limit of 1024 × 1024. We investigated SVGs as an alternative but, at the time of development, Spark AR’s SVG rendering did not respect depth reads or writes, so they couldn’t be integrated into the scene. A potential alternative (which came too late to be integrated into the final project) was Multichannel Signed Distance Fields, which allow crisp text to be displayed at any size from a small technical texture source. We’ll be using those next time.
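For the curious, the reason MSDFs stay crisp is that the per-pixel median of the three channels reconstructs a sharp signed distance to the glyph edge. The decode is tiny; sketched here with an assumed screen-space scale parameter:

```python
def msdf_alpha(r: float, g: float, b: float, screen_px_range: float) -> float:
    """Decode one MSDF texel (channels in [0, 1]) into a coverage value.
    The median of the three channels is the signed distance to the glyph
    edge; screen_px_range converts it to screen pixels for antialiasing."""
    median = max(min(r, g), min(max(r, g), b))
    signed_dist = screen_px_range * (median - 0.5)
    return max(0.0, min(1.0, signed_dist + 0.5))
```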

Environment Rendering

To make the most of our custom Image Based Lighting solution we modelled and rendered a basic 360° interior of Selfridges’ shoe hall. It omits a lot of detail but includes the broad colour areas of the walls and floor, as well as representative placement of the ceiling spots. Spark AR Studio supports the import of HDR images, but our custom implementation needed the Specular Radiance mipmaps and Diffuse Irradiance images as separate assets, so we used cmftStudio to generate them ready for integration with our lighting patch graph.

Specular Radiance MipMap cascade used to provide environment reflections

Target tracking

Our experience was relatively unusual for a Spark AR project in that it was a world-based effect that relied on an image marker to orient and initialise the scene. Creating image markers that provide a solid reference for an AR scene is something of a black art. Vuforia provides a handy online evaluation tool that ranks candidate images, though there is no guarantee that its evaluation system matches Spark AR’s or any other tracking technology’s.

It took a fair amount of iteration to arrive at a reliable image marker, and lots of common-sense principles don’t apply in the way you might assume. We had a large potential surface for our image marker: the display table on which the sneakers were placed measured 3360mm × 1040mm. Ample space for sufficient detail, we thought. In tests, however, the full-size tracker was too large to fit comfortably in the field of view of most smartphones at a usable distance, and we saw better results using an interior crop of the image as the actual marker data. We found that once the scene had been acquired, Spark AR took advantage of the SLAM capabilities of ARKit and ARCore to continue tracking the scene, even when considerable portions of the marker were no longer in view.

We modelled the Table Surface and tested a variety of camera aspect ratios at a variety of heights and distances to simulate the range of Tracker visibility

A further puzzling issue was that, although we had lots of randomly distributed, high-contrast feature points, which we knew were the key requirement for tracker images, we still weren’t seeing rock-solid tracking. It transpires that our minimalist black-and-white design for the table surface was in fact a worst-case scenario. AR tracking technology also likes to see an even distribution across the image’s luminance histogram, and our black-and-white image had peaks at either end of the histogram with almost nothing in between. By adding some gradient strips we were able to introduce this luminance variation, and instantly saw a marked improvement in marker acquisition and scene stability.

The Marker Image Luminance Histogram, without (left) and with (right) gradients included
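A quick way to sanity-check a candidate marker before printing is to inspect its histogram directly. A small sketch using Pillow and NumPy (the file name and the 1% bin threshold are our own illustrative choices):

```python
import numpy as np
from PIL import Image

def luminance_histogram_coverage(path: str, bins: int = 16) -> float:
    """Fraction of luminance bins holding a meaningful share of pixels.
    A pure black-and-white marker scores roughly 2/bins; adding gradient
    strips raises the score and, in our experience, tracking stability."""
    luma = np.asarray(Image.open(path).convert("L"))
    hist, _ = np.histogram(luma, bins=bins, range=(0, 255))
    share = hist / hist.sum()
    return float((share > 0.01).mean())

print(luminance_histogram_coverage("table_marker.png"))  # hypothetical file
```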

The final consideration for the image marker was the print material itself. Given our marker was going to be placed in a high-traffic environment like Selfridges, we opted for a matte vinyl surface that would stand up to some wear and tear. A polished reflective surface would have allowed too many reflections to obscure the tracker image, preventing scene initialisation. In tests in our office, the vinyl material held up really well in a variety of lighting conditions. When we installed it in Selfridges, however, the spots used to illuminate the shoes on display were so bright that the matte surface actually spread the light reflection over a wider area. We were able to make some adjustments to the light intensity, but this was one of the big lessons learned from the real world of retail environments. For future installations we would investigate polarising film filters and other material finishes to address these kinds of issues. Indeed, as AR grows to coexist beside its physical neighbours, there will be a greater need for low-reflectivity materials and lighting environments that suit both product and AR experience layers.

Try it yourself

You can try a small scale version of the experience by opening this link or scanning this QR code with your mobile camera:

If you have the latest version of Facebook you can then point your Facebook camera at this small version of the image marker:

Enjoy the teeny tiny sneakers but try to imagine it filling Selfridges shoe hall

The future of retail?

Using Spark AR to bring digital storytelling to the retail environment was a positive as well as instructive experience. There were undeniable challenges: file-size constraints, the physical retail environment and working with software at the cutting edge of development all make it difficult to create tightly integrated digital content. But these challenges are temporary: with the arrival of 5G connectivity, the greater ubiquity of AR content in experiential marketing and the progressive development of platforms like Spark AR, it will only become easier to create rich, layered experiences in retail environments.

Stink Studios look forward to continuing to lead the charge in this exciting new field. If you would like to create a Spark AR filter for your Product or Brand you can contact us at sparkar@stinkstudios.com.
