Giphy Keyboard - 3D Animated Emojis
3D Animation & Rendering // In partnership with Gunner
Role: Cinema 4D Lead. Modeling, Rigging, Animation, & Rendering in Redshift.
The team at Gunner approached me to help out with a massive batch of animated emojis for use in the Giphy Keyboard App. Together, we came up with a workflow to create, animate & render more than 80 emojis in a little over a month. It was a ton of fun getting the chance to inject so much life into a subject most of us come in contact with on a daily basis: GIFs! The essence of each emoji rests partly with the user, so exploring how each of them should behave was both a challenge & a delight. Many laughs were had throughout the office as each ‘moji was completed.
* Direction & 3D Animation: Ian Sigmon; 3D Animation: Billy Chitkin, Todd Hersey, Nick Parente, Collin Leix, John Hughes; Production: Brandon Delis
Here are a number of my personal selects, the ones I can take responsibility for, hah!
Together, Todd and I settled on using Cinema 4D R20’s Volume Builder system to create the bulk of the facial emojis. Each one mostly starts with a sphere for the basic head shape, with animated splines creating the mouths, eyes, teeth, etc. This Volume Builder workflow got quite heavy once everything was cranked up, but it could be easily disabled to tweak the animation on the underlying splines, which allowed us to maintain a quick pace & iterate. The bulk of our animations, all of which were seamless loops, were done with the assistance of GSG Signal. The plugin gave us great flexibility to control multiple animated parameters at once, or to audition what a particular animation would look like at 24-, 48-, or 60-frame lengths.
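Signal handles the looping inside Cinema 4D, but the reason a loop stays seamless at any of those frame lengths is simple math: drive the parameter with whole cycles of a periodic curve over the loop length. Here's a rough standalone Python sketch of that idea (the function and its parameters are my own illustration, not Signal's API):

```python
import math

def loop_value(frame, length, amplitude=1.0, cycles=1):
    """Sample a sine wave that completes whole cycles over `length` frames,
    so the value at frame 0 matches the value at frame `length` exactly."""
    phase = (frame % length) / length  # normalized 0..1 position within the loop
    return amplitude * math.sin(2 * math.pi * cycles * phase)

# Audition the same motion at the three loop lengths we used:
for length in (24, 48, 60):
    assert loop_value(0, length) == loop_value(length, length)  # seamless wrap
```

Because only the normalized phase matters, the same curve can be previewed at 24, 48, or 60 frames without re-keyframing anything.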
Another large task I was responsible for was modeling & rigging the hands. With so many hand-based emojis on our list, we knew a dedicated asset was needed.
After receiving Simpsons-inspired reference, I settled on a four-finger & thumb hand with an optional wrist, as seen below.
With such a big list of assets, many of which passed through multiple artists and/or revisions, a good ol’ fashioned spreadsheet to track our progress was invaluable.
Using tokens to automagically route rendered output paths into the right folders made queuing up so many renders a breeze.
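The token idea is easy to picture in a few lines of plain Python. This is a toy illustration, not Cinema 4D's actual token engine, and the template and token names here are made up for the example:

```python
def resolve_path(template, tokens):
    """Replace each $token in the template with its per-scene value,
    so every render lands in its own folder without manual path edits."""
    path = template
    for name, value in tokens.items():
        path = path.replace("$" + name, value)
    return path

template = "renders/$prj/$take/$prj_$take"
print(resolve_path(template, {"prj": "smiley", "take": "loop_48"}))
# → renders/smiley/loop_48/smiley_loop_48
```

Since the values come from each scene at render time, a queue of 80+ emojis can share one output template and still sort itself.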
Here is the basic shading network I developed in Redshift. Using the Display Color attribute in the Color User Data node at the top of the node tree meant that even in our mixed Mac & PC environment, artists working on a machine without Redshift installed could prep a shot to be renderable merely by setting the Display Color of the object(s) in the scene.
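The fallback logic behind that setup can be mimicked in plain Python. This is a toy model of the behavior, not Redshift's node API; the object dictionaries and default color are invented for the example:

```python
# Neutral gray used when an object has no Display Color assigned.
DEFAULT_COLOR = (0.5, 0.5, 0.5)

def base_color(obj):
    """Return the object's viewport Display Color if one is set,
    else fall back to a neutral default, mirroring how the shader
    reads per-object color at render time."""
    return obj.get("display_color", DEFAULT_COLOR)

scene = [
    {"name": "smiley", "display_color": (1.0, 0.85, 0.2)},
    {"name": "placeholder"},  # prepped on a machine without Redshift
]
colors = [base_color(obj) for obj in scene]
```

The point is that color lives on the object, not in the material, so any artist on any machine could stage a renderable scene.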
Later, I expanded this shading network with gradient controls tied to the world position of a null, creating features like blushing cheeks, the coloration for Terror & Devil Horns, and the flames. This gradient setup originates from Merk Vilson and was instrumental, because an animated Volume Builder’s ever-changing topology meant UV mapping & traditional texturing methods were not possible.
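The core of the gradient-from-a-null trick can be sketched in plain Python (a minimal model of the idea, not the Redshift node graph; the function name and falloff radius are my own):

```python
import math

def gradient_weight(point, null_pos, radius):
    """1.0 at the null's world position, fading linearly to 0.0 at
    `radius` and beyond. Sampling this per shading point places a
    feature (blush, flame color) on a mesh with no stable UVs."""
    dist = math.dist(point, null_pos)
    return max(1.0 - dist / radius, 0.0)

# A point halfway out from the null gets half the blush strength:
print(gradient_weight((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 2.0))  # → 0.5
```

Because everything is computed from world position, the gradient survives the Volume Builder's per-frame topology changes untouched.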