Advanced Lip-synching

Accurate lip-synching is an important step towards making a convincing animated character. Toon Boom Studio offers an amazing tool that automatically synchronizes a sound track to a character's mouth element. But why stop there? In this article we will introduce you to some advanced lip-synching techniques you can use to produce results that will amaze your audience. We will show you how to synchronize multiple expressions to one single mouth element, and how to add and remove shapes for sounds that are not recognized correctly by the automated lip-sync tool.

First, a quick overview of the lip-synching function. To lip-synch an element you need two things: a sound element and a drawing element containing the corresponding mouth shapes. Once you have imported the sound into the application, enable lip-synching by opening the Edit Sound window, selecting the sound track and pressing the lip-sync button. Then simply right-click the header of the sound element to access the Lip Sync Mapping feature.
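
Before going further, it helps to keep the standard mouth chart in mind: the automated tool maps the sound to eight shapes named A through G, plus X for silence. As a rough illustration only (this is a Python sketch, not Toon Boom Studio's own data or API, and the phoneme groupings shown are approximate), the chart can be thought of like this:

```python
# Illustrative mouth chart: the eight standard shape names (A-G plus X for
# silence) mapped to rough phoneme groups. The exact grouping used by the
# automated lip-sync detection may differ; treat this as a sketch.
MOUTH_CHART = {
    "A": ["m", "b", "p"],            # closed lips
    "B": ["s", "t", "d", "k", "ee"], # teeth nearly together
    "C": ["eh", "ae", "h"],
    "D": ["ah", "ai"],               # wide open
    "E": ["oh"],
    "F": ["oo", "u", "w"],           # pursed lips
    "G": ["f", "v"],                 # teeth on lower lip
    "X": [],                         # silence / rest position
}

def shape_for_phoneme(phoneme: str) -> str:
    """Return the mouth shape name for a phoneme, defaulting to X (rest)."""
    for shape, phonemes in MOUTH_CHART.items():
        if phoneme in phonemes:
            return shape
    return "X"

print(shape_for_phoneme("m"))   # -> A
print(shape_for_phoneme("oo"))  # -> F
```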


You might wish to use a sound track that expresses several different feelings, and a single set of mouth shapes might not be enough to cover them. For example, the mouth shape B would look different if the character were sad. In that case you can add several versions of each mouth shape to the same element, one for each expression you need.
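
Conceptually, this just means the mouth element holds one drawing per shape per expression. As a quick sketch (the cell names below are invented for illustration and are not Toon Boom Studio's naming), the drawings in the element could be organized like this:

```python
# Hypothetical cell naming for a single mouth element that holds several
# expressions: each mouth shape (A-G plus X) exists once per expression.
MOUTH_CELLS = {
    "neutral": {shape: f"{shape}_neutral" for shape in "ABCDEFGX"},
    "sad":     {shape: f"{shape}_sad"     for shape in "ABCDEFGX"},
    "happy":   {shape: f"{shape}_happy"   for shape in "ABCDEFGX"},
}

print(MOUTH_CELLS["sad"]["B"])  # -> B_sad, the sad version of shape B
```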

Once those extra shapes are drawn, there is a trick to synchronize them to the sound without creating a whole new element. First, create a clone of your mouth element; this clone will only be used to temporarily hold the synchronization. Make sure the clone is the top layer of your animation to prevent any confusion later on. Next, open the Modify Lip Sync Mapping function on your sound and set the target to that top mouth element, the clone. Assign the mouth shapes of your first expression to it; once this is done, you will see that the clone has been synchronized to the sound.

Now turn on sound playback in the Play top menu, and then turn on sound scrubbing. These functions help you determine the frame range covered by each expression. Once a range is determined, copy that exposure range from the clone to the original mouth element. Then repeat the same process for the ranges that have not been mapped yet. When the whole sound has been mapped, you can delete the cloned element as it is no longer needed.
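
In essence, the trick boils down to copying frame ranges of exposed cells from the clone onto the original element and, for each range, substituting the drawings of the expression you want. Here is a minimal Python sketch of that bookkeeping, using plain lists as stand-ins for exposure sheet columns; the helper function and the suffix naming are assumptions made for this example, not features of Toon Boom Studio:

```python
# Model an exposure sheet column as a list of cell names, one entry per frame.
# clone_exposure is what the automated lip-sync produced on the cloned element;
# original_exposure is the element we actually keep in the scene.

def copy_range(src, dst, start, end, suffix=""):
    """Copy frames [start, end) from src to dst, retargeting each cell to an
    expression variant by appending a suffix to its name."""
    for frame in range(start, end):
        cell = src[frame]
        dst[frame] = cell + suffix if cell else cell

# Lip-sync result on the clone for 8 frames (X = rest).
clone_exposure    = ["X", "A", "D", "D", "B", "F", "F", "X"]
original_exposure = [""] * len(clone_exposure)

# Frames 0-3 were identified (by scrubbing the sound) as the sad part,
# frames 4-7 as the neutral part.
copy_range(clone_exposure, original_exposure, 0, 4, suffix="_sad")
copy_range(clone_exposure, original_exposure, 4, 8, suffix="_neutral")

print(original_exposure)
# -> ['X_sad', 'A_sad', 'D_sad', 'D_sad',
#     'B_neutral', 'F_neutral', 'F_neutral', 'X_neutral']
```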


It is now time to fine-tune the lip synchronization. First you need to see how accurately your animation is synchronized to the sound; exporting the full movie is the most reliable way to check. You might notice that the mouth seems to move too much on certain words or sentences. This happens because the lip-synching function reacts to every variation in the sound, and some of what it picks up should not be a new shape but merely a transition from one mouth to another. A great way to check whether the generated mouths are correct is to say the text yourself; if you pay close attention, you will notice that some of the shapes that were mapped do not need to be there. Once you have identified those mistakes, you can easily fix them with the cell swapper in the Cell tab of the Properties window. You might also find that some sounds need extra mouth shapes for the lip-synching to be really convincing. Once you know where those extra mouths are needed, simply type a new mouth name in the exposure sheet at those locations and draw the shapes directly there.
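
To picture the kind of cleanup the cell swapper is used for, consider shapes that are exposed for only a single frame between two longer ones; these are often the unwanted transition shapes described above. The sketch below merges such one-frame blips into the preceding shape. The one-frame heuristic is an assumption made for this illustration, not a rule built into the software, and real cleanup remains a judgment call made by eye and ear:

```python
def smooth_single_frame_blips(exposure):
    """Return a copy of the exposure list in which any cell exposed for
    exactly one frame, with different cells on either side, is replaced
    by the cell that precedes it (mimicking a manual cell swap)."""
    result = list(exposure)
    for i in range(1, len(result) - 1):
        if result[i] != result[i - 1] and result[i] != result[i + 1]:
            result[i] = result[i - 1]
    return result

noisy = ["A", "A", "D", "A", "A", "F", "F", "X"]
print(smooth_single_frame_blips(noisy))
# -> ['A', 'A', 'A', 'A', 'A', 'F', 'F', 'X']
```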

The last step is to export your animation and check that everything looks right. If you are exporting to Flash, make sure to turn on the Streamed checkbox in the middle right of the sound editor so that your sound keeps its synchronization with the drawings. Because the Flash format loads progressively as the animation plays, the images might take longer to load than the sound, and you will notice inconsistencies in the lip-sync from one playback to another if the Streamed option is not enabled.

Downloads: Mouth chart (PNG)