Stinkmoji was an internal project that started as a technical challenge: building animoji-like characters on the web, available on desktop and mobile to anyone without an iPhone X.
This project also celebrates some of pop culture’s most iconic characters. We chose our characters based on the team’s favorite shows and films from the past year — it was not an easy task!
The experience lets users select and then play as one of the characters. Each part of the face (lips, eyelids, eyebrows, and cheeks) is tracked so the 3D characters perfectly mirror the user’s movements.
Giving life to the characters
Starting from sketches, we worked with Les Filles Du 9 Novembre to model our 3D characters and their expressions. We did the texturing in Substance and previewed the final rendering on the site in real time.
We used blendshape-based facial rigging to create all of the facial expressions. The three characters share the same set of blendshapes, allowing us to quickly link character models to our facial tracker. We created a custom ThreeJS material supporting up to 12 blendshapes (6 for normals and 6 for vertex positions).
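At its core, blendshape morphing is a weighted sum: each tracked expression weight scales a set of per-vertex deltas added on top of the base mesh. The sketch below shows that idea in plain JavaScript; the function and data names are illustrative, not the project’s actual code, which does this on the GPU in the custom ThreeJS material.

```javascript
// Minimal blendshape sketch: the final vertex positions are the base mesh
// positions plus the weighted sum of each shape's deltas. (Illustrative —
// the real project passes the deltas as vertex attributes to a shader.)
function applyBlendshapes(basePositions, shapes, weights) {
  const out = basePositions.slice();
  shapes.forEach((deltas, s) => {
    const w = weights[s];
    if (w === 0) return;
    for (let i = 0; i < out.length; i++) {
      out[i] += w * deltas[i];
    }
  });
  return out;
}

// Example: one vertex (x, y, z) and two hypothetical shapes.
const base = [0, 0, 0];
const shapes = [
  [1, 0, 0], // "smile" pushes the vertex along x
  [0, 2, 0], // "blink" pushes it along y
];
console.log(applyBlendshapes(base, shapes, [0.5, 0.25])); // → [0.5, 0.5, 0]
```

The same summation applies to the normal deltas, which is why the material budgets 6 attribute slots for positions and 6 for normals.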
The characters blink automatically, giving them life even when the user is not moving. It’s also an answer to a technical limitation: blink detection was unstable and gave inconsistent results across users. We also added spring physics on the ears and hair, emphasizing the user’s movements and giving a cartoonish feel.
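A damped spring like the one below is a common way to get that secondary motion: the ear or hair value chases the head’s value, overshooting slightly before settling. This is a sketch under assumed constants, not the project’s tuned values.

```javascript
// Damped spring follower (semi-implicit Euler). The stiffness and damping
// constants here are illustrative assumptions; underdamping (damping ratio
// below 1) is what produces the cartoonish overshoot.
function springStep(state, target, dt, stiffness = 120, damping = 12) {
  const accel = stiffness * (target - state.value) - damping * state.velocity;
  state.velocity += accel * dt;
  state.value += state.velocity * dt;
  return state;
}

// Simulate: the head snaps to 1.0 and the ear catches up over ~5 seconds.
const ear = { value: 0, velocity: 0 };
for (let i = 0; i < 300; i++) springStep(ear, 1.0, 1 / 60);
console.log(ear.value.toFixed(2)); // settles near 1.00
```

Running one spring per axis on a few attachment points is cheap enough to leave the frame budget to the tracker.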
To reinforce the link between the pop-culture reference and the created model, each character has its own secret world and sound design inspired by the original show or movie. The hidden feature is unveiled when the user makes a specific facial expression (like 😱). This also let us add another level of interactivity and a playful feel to the facial recognition experience.
We created our own face controller on top of the Beyond Reality Face tracker, which performs impressively, even on mobile. To give the best results on all types of faces, tracked features are computed relative to the user’s neutral face, scanned at the beginning of the experience.
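Calibrating against a neutral face amounts to remapping each raw tracked value relative to the baseline captured at startup, so different face shapes drive the blendshape weights over the same 0..1 range. The function and range value below are illustrative assumptions, not the controller’s actual API.

```javascript
// Neutral-face calibration sketch: remap a raw tracked value (e.g. mouth
// openness) relative to the baseline captured while the user holds a
// neutral face, then clamp to [0, 1]. The `range` span is an assumption.
function calibrate(raw, neutral, range) {
  const t = (raw - neutral) / range;
  return Math.min(1, Math.max(0, t));
}

// A user whose neutral mouth reading is 0.25, with an assumed 0.5 range:
console.log(calibrate(0.5, 0.25, 0.5)); // → 0.5 (half open)
console.log(calibrate(0.1, 0.25, 0.5)); // → 0 (below neutral clamps to 0)
```

The clamped output can then feed the blendshape weights directly, regardless of the user’s resting features.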
The tracker relies on CPU-intensive algorithms that can drag down the frame rate. We optimized this with an adaptive tracking frequency, ranging from 10fps to 40fps, which automatically increases or decreases to get the best 3D scene frame rate on each device. We also added dynamic linear interpolation for the characters’ animations: movements are smoothed more when the tracking frequency is lower, avoiding the feeling of “lagging” characters.
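The scheme can be sketched as a small feedback loop: shrink the tracking frequency when the render loop struggles, grow it back when there is headroom, and tie the animation lerp factor to the tracking interval. The step size and render-fps target below are assumptions; only the 10–40fps band comes from the project.

```javascript
// Adaptive tracking frequency, clamped to the 10-40fps band described
// above. The +/-2 step and the 55fps render target are illustrative.
const MIN_TRACK_FPS = 10;
const MAX_TRACK_FPS = 40;

function adaptTrackingFps(trackFps, renderFps, targetRenderFps = 55) {
  const next = renderFps < targetRenderFps ? trackFps - 2 : trackFps + 2;
  return Math.min(MAX_TRACK_FPS, Math.max(MIN_TRACK_FPS, next));
}

// Per-render-frame lerp factor: lower tracking fps → smaller factor, so
// the character eases toward the latest sample over more frames instead
// of snapping, hiding the "lagging" feel.
function lerpFactor(trackFps, renderDt) {
  return Math.min(1, trackFps * renderDt);
}

console.log(adaptTrackingFps(40, 30)); // struggling renderer → 38
console.log(adaptTrackingFps(10, 30)); // clamped at the floor → 10
console.log(lerpFactor(10, 1 / 60));   // ≈ 0.167
```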
Running the tracker is an intensive task, so we lowered the cost of the 3D rendering by using matcaps instead of a complex lighting environment. Matcaps were also a great way to quickly iterate on the lighting setup and find one whose rendered lights and shadows enhance the characters’ expressivity. The character material has a special property to “switch” from one matcap to another, creating different atmospheres directly on the GPU and allowing the user to easily transition into each character’s secret world.
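One plausible shape for that switch (a sketch, not the project’s shader) is a single uniform driving a GPU-side mix between the two matcap samples, with the CPU just tweening that uniform each frame:

```javascript
// Matcap "switch" sketch. In the fragment shader the blend could read:
//   vec3 color = mix(texture2D(matcapA, uv).rgb,
//                    texture2D(matcapB, uv).rgb, uTransition);
// Here we only tween the uniform toward its target at a constant rate.
// The speed constant is an illustrative assumption.
function updateTransition(uniform, target, dt, speed = 4) {
  const step = speed * dt;
  uniform.value += Math.min(step, Math.max(-step, target - uniform.value));
  return uniform.value;
}

const uTransition = { value: 0 };
for (let i = 0; i < 60; i++) updateTransition(uTransition, 1, 1 / 60);
console.log(uTransition.value); // reaches 1 well within a second
```

Because only a uniform changes, no textures are rebound mid-transition, which keeps the atmosphere swap cheap.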
On desktop, the website offers two rendering qualities. The HD version adds high-definition diffuse textures, normal maps, and high-poly meshes generated by a client-side subdivision algorithm, giving the characters more detail. We also built a CLI tool to generate multi-format compressed textures for each device (dxt, pvrtc, etc., plus a fallback to png/jpg textures).
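At load time, picking which of those pre-generated formats to fetch typically means probing the WebGL extensions the device exposes. The extension names below are the standard WebGL ones; the priority order and the format labels are assumptions about how such a loader could work, not the tool’s actual output.

```javascript
// Compressed-texture format picker sketch: prefer a GPU format the device
// supports, fall back to uncompressed png/jpg otherwise. In a real app the
// list would come from gl.getSupportedExtensions().
function pickTextureFormat(supportedExtensions) {
  if (supportedExtensions.includes('WEBGL_compressed_texture_s3tc')) return 'dxt';
  if (supportedExtensions.includes('WEBGL_compressed_texture_pvrtc')) return 'pvrtc';
  if (supportedExtensions.includes('WEBGL_compressed_texture_etc1')) return 'etc1';
  return 'png'; // uncompressed fallback
}

// Typical desktop GPU vs. a device with no compression support:
console.log(pickTextureFormat(['WEBGL_compressed_texture_s3tc'])); // → dxt
console.log(pickTextureFormat([]));                                // → png
```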
From digital to the real world
After seeing the positive reception of the project online, we decided to 3D print one of the characters to live permanently in our Paris studio as a full-time Stinker!