Face Robot - Troubleshooting

Table of contents

Workflow

How do I generate a capture (playblast)?

See: Capturing Animation in a Viewport (Flipbook) (http://softimage.wiki.avid.com/xsidocs/ani_play_CapturingAnimationinaViewport.htm)

How do I import my reference video into Face Robot?

It makes sense to dedicate a User view to your video reference since you are constantly referring to it. Just set a view to User and enable Rotoscope. Then pick New > New From File... and browse to your video. If you go into the Edit options, you can adjust the start frame as needed. To clean up the view, uncheck all of the options under the eyeball icon, and under Visibility Options uncheck everything in the Visual Cues tab and the Stats tab (although it's nice to leave the Current Time option on). This gives you a clean reference window that you can sync to your animation. Set a memo cam in the view to get back to it easily.

See: Tools for Viewing Image Clips (http://softimage.wiki.avid.com/xsidocs/tex_clip_source_ToolsforViewingImageClips.htm) or Rotoscopy (http://softimage.wiki.avid.com/xsidocs/rotoscopy.htm)

How do I import audio and synchronize it with my mocap?

You can bring audio into any mixer in your scene. The Scene Root mixer works well and is easy to get to, or you can put it under the face model for your character. Just add an audio track and browse to the sound file that you want. If the audio was recorded with your reference video, then syncing to the reference is easy since they are both offset by the same amount. To have the audio and video sync to your mocap, you can go to the Settings>Adjust panel and set a start frame for the mocap. If you toggle the face off you can just watch the controls as you change the start frame slider and find the frame that you want. If you don't have a system that syncs your video to your mocap take, then you can use a blink or some other landmark in the reference to sync to.
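The landmark method boils down to simple offset arithmetic: if the blink lands on frame R of the reference video and frame M of the raw mocap clip, the value you want on the Adjust panel's start frame slider is R - M (assuming both were recorded at the same frame rate). A minimal sketch of that arithmetic, with a purely illustrative function name (Face Robot itself does this interactively):

```python
def mocap_start_frame(ref_landmark_frame, mocap_landmark_frame):
    """Start frame for the mocap so that a landmark (e.g. a blink) seen
    at ref_landmark_frame in the reference video coincides with the same
    landmark at mocap_landmark_frame in the raw mocap clip."""
    return ref_landmark_frame - mocap_landmark_frame

# A blink appears at frame 120 of the reference video and at frame 35
# of the mocap clip, so the mocap should start at frame 85.
start = mocap_start_frame(120, 35)
```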

See: Overview of Synchronizing Audio with Animation (http://softimage.wiki.avid.com/xsidocs/audio_OverviewofSynchronizingAudiowithAnimation.htm)

Solving

Why does my character's mouth remain sealed when I try to open it?

This is usually caused by the solver not having a good enough idea of your model's facial construction.

If you go back to your Fit scene, you will notice that two ear clusters have been created on the mesh (LEarCluster and REarCluster). Each cluster should accurately contain all the points of its respective ear; this tells Face Robot how to build an accurate representation of your head for solving. If the correct points are not selected, simply add or remove points from the clusters until the proper area is covered. Do this for both ear clusters and re-solve your head. The mouth should now behave properly when opening.

Why is it taking so long to solve my head? It seems to hang on 'Forming Mouth'.

If your face mesh is very dense or if the mouth cavity has the tongue, gums or other details modeled into it, it can cause this part of the solve to take longer than usual. You can speed up the solve by selecting the vertices associated with these details and putting them into a cluster named 'IgnoreMouth'. If this cluster exists, the solve will not take these verts into account.

This is also a good way to troubleshoot imperfect mouth solves. If the geometry is more complex than usual, or if the mouth is modeled in an unusual way, creating an IgnoreMouth cluster can help Face Robot to focus on the parts of the mouth that are important for the solve.

Tuning

There is a lot of stuff in this stage. What do I do first?

The tuning stage in Face Robot is fairly nonlinear, and the order in which you do things depends largely on taste and personal workflow. However, the following workflow works well in most cases:

Generally, you will want to hit the big-ticket items first. Make sure the mouth is working as you expect and regenerate it if necessary. Check how the jaw is working. If your picking or fitting was way off here, you may need to re-solve the head, and you won't lose any work if you check this first. After that, make any adjustments to enveloping that are needed and rough in the wrinkle map and puffer map. Now you can work on the face region by region to tune each region's range and strength, and also adjust the region map through painting. If sculpting is necessary for a region, do that while you are there. Adjust the eye curves and maps, and then set up the mouth's collision and other parameters to your liking.

While working through these steps, you will probably find yourself in and out of wrinkle painting, envelope painting, and sculpting. This is the great thing about tuning in Face Robot: you can go back and forth as you like to continually refine your work. But having some sort of system definitely makes it easier to keep track of your progress.

How do I choose between painting, sculpting, and shape animating?

The general workflow when tuning the skin or puffers is to start with painting: this allows you to define how big or small each muscle region is, as well as the relative amplitude of the muscle. Skin sculpting is better used for fine detail shaping once the overall muscle area is behaving properly. You should be able to achieve most of the facial deformation necessary using these two techniques.

For specific tissue deformation effects, especially around the mouth (for example, enhancing an "ooh" shape), the traditional shape animation tools can then be used on top of the solver.

How do I set my paint tool to update as I'm painting instead of updating only after I've made my brush stroke?

In the Paint tools, toggle on "Interactive Refresh".

How can I paint the muscle regions on my face?

From the Tune menu, Shift+click on any muscle region to activate the muscle for painting. You can use the standard brush tools to add or subtract from this region.

The mouth corners bulge when opening the mouth. How can I prevent this?

This is the result of the mouth corner pick point being placed too far outside the mouth corner. The goal is to select the area where the inflection point changes from outside the mouth to inside the mouth. Usually the mouth has an edge loop that runs directly into the mouth corner, so it is best to use that edge as a guide when making your selection.

What are Create/Show Stress, Create/Show Region and Create/Show Tendon Map? Face Robot doesn't allow me to paint on them; why?

Primarily, the Region map gives visual feedback showing where the skin is being compressed as the controllers are moved. Secondly, it can be used to drive parameters of a shader applied to the head; for example, it could control the intensity of a bump or displacement map. Stress is identical to the Region map, but minus the Wrinkle and Puffer Mute maps. Tendon is similarly the Region map, less the Tendon map.

Animating

How do I preserve the keys on the adjust panel when re-applying the mocap?

You may have noticed that the adjust panel gets cleared and rebuilt every time you apply a mocap, or when you clear the mocap. This usually makes sense, because the adjustment keys are specific to a particular mocap sequence. However, there are times when you may have made some tweaks to a C3D file and want to refresh the mocap with a newer version of the same mocap. Obviously, in this case, you want to preserve any keys you have on the adjust panel. To enable this, we have two simple commands.

  1. First, you'll need to open the script editor (Alt+4).
  2. Then, take a snapshot of the current adjust panel keys and store them in memory by typing one of the following commands:
     • VBScript mode: SaveAdjustPanelKeys
     • JScript mode: SaveAdjustPanelKeys();
  3. Press "Run".

Now you can re-apply the mocap. As expected, the adjust panel is rebuilt and it no longer has the old keys.

  • To restore them, type the following command and press "Run": RestoreAdjustPanelKeys

Again, remember to add "();" at the end if you're using JScript.

Mocap

For general information, see Motion Capture in Face Robot.

What is a C3D file?

The C3D format is a public domain, binary file format that is used in Biomechanics, Animation and Gait Analysis laboratories to record synchronized 3D and analog data. It is supported by almost all 3D Motion Capture System manufacturers, as well as other companies in the Biomechanics, Motion Capture and Animation Industries. See: c3d.org (http://www.c3d.org/c3d_introduction.html)
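For readers curious what a C3D file actually looks like inside, the first 512-byte block is a fixed header: its first byte points at the parameter section and its second byte is the magic value 0x50. The sketch below builds and parses the first twelve 16-bit words of a synthetic header, assuming little-endian (Intel) byte order; the format also has DEC and SGI byte-order variants, and Face Robot reads all of this for you.

```python
import struct

def read_c3d_header(buf):
    """Parse the first 12 words of a C3D header (Intel byte order)."""
    (param_ptr_and_magic, n_points, n_analog, first_frame, last_frame,
     max_gap, scale, data_start, analog_per_frame, frame_rate) = \
        struct.unpack('<HHHHHHfHHf', buf[:24])
    if param_ptr_and_magic >> 8 != 0x50:
        raise ValueError('not a C3D file')
    return {
        'n_points': n_points,          # markers per frame
        'first_frame': first_frame,
        'last_frame': last_frame,
        'frame_rate': frame_rate,
        'float_data': scale < 0,       # negative scale => float samples
    }

# Synthetic header: 41 markers, frames 1-500 at 120 fps, float data,
# parameter section at block 2, point data starting at block 3.
header = struct.pack('<HHHHHHfHHf',
                     (0x50 << 8) | 2, 41, 0, 1, 500, 10, -1.0, 3, 1, 120.0)
info = read_c3d_header(header)
```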

What are fmaps and nmaps?

Face maps (fmaps) and name maps (nmaps). An fmap stores the calibration that maps the mocap markers onto your character's head; an nmap maps the marker names used in a C3D file to the names that Face Robot expects.

Do I need a separate fmap for every C3D data file?

It depends on the situation. If all the mocap was captured within the same session (marker placement remains unchanged), you need to generate only one fmap for the entire session. The only requirement is that all the motion capture files exist in the same folder as the generated fmap. If so, Face Robot will use the single fmap to calibrate all the mocap data in the folder.

In the case where you are using the same actor across different mocap sessions (so the markers are removed and reapplied in between), you will need to generate a separate fmap for each session. This accounts for any slight shifts in marker position between sessions.

The last case is different actors: each one requires their own fmap for calibration.
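The three cases above boil down to one rule: an fmap is shared by all takes that come from the same actor and the same marker-application session. A small sketch of that grouping (the tuple layout and function name are hypothetical, purely for illustration):

```python
def fmaps_required(takes):
    """Group C3D takes by (actor, session): each group shares one fmap.
    takes is a list of (actor, session, c3d_filename) tuples; remember
    that all files in a group must sit in the same folder as the fmap."""
    groups = {}
    for actor, session, fname in takes:
        groups.setdefault((actor, session), []).append(fname)
    return groups

takes = [
    ("anna", 1, "take01.c3d"),   # same actor, same session:
    ("anna", 1, "take02.c3d"),   #   one fmap covers both takes
    ("anna", 2, "take03.c3d"),   # markers reapplied: new fmap
    ("bob",  1, "take04.c3d"),   # different actor: own fmap
]
groups = fmaps_required(takes)   # three fmaps needed in total
```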

What if my mocap does not start with a neutral pose?

The neutral pose need not be at the start of the sequence. Simply find any frame that contains a neutral pose and generate your fmap while on that frame.

Why isn't the head moving?

By default, the head translation and rotation are disabled in the Adjust ppg. Simply enable them (Adjust > Head controls) and set the translation scale to 1 to get the full head translation. If your head still does not move, it could be that the mocap data you are using is stabilized, meaning the head translation and rotation have been filtered out of the mocap clip; use the unstabilized version if you want head motion. The last possibility is that the mocap take has very little head motion; check your original mocap clip to confirm this.

How do I edit the mocap motion curves?

In order to access the curves, you need to plot the mocap data first. This removes the mocap controls and plots the motion data directly onto the Face Robot controllers. You can then access the curves (called fcurves) using the animation editor.

Importing and Exporting

What's the best way to bring in a mesh from other packages (Max, Maya, etc.)?

There are several options for importing geometry into Face Robot. The important point is that the mesh's point order remains the same, so that the animation data can easily be applied back to the original mesh through Point Oven or another cache system. dotXSI, COLLADA, and OBJ all maintain point order when importing a mesh, so they are good options.
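As a sanity check on point order, you can round-trip the neutral mesh through your export format and confirm that the vertex list comes back unchanged before committing to a pipeline. A rough sketch for OBJ text (illustrative only; a real check might also need to tolerate format-specific precision differences):

```python
def obj_vertices(text):
    """Return the 'v x y z' positions of a Wavefront OBJ, in file order."""
    return [tuple(float(t) for t in line.split()[1:4])
            for line in text.splitlines() if line.startswith('v ')]

def same_point_order(obj_a, obj_b, tol=1e-6):
    """True if both meshes list the same vertices in the same order,
    which is what a point cache such as Point Oven relies on."""
    va, vb = obj_vertices(obj_a), obj_vertices(obj_b)
    return (len(va) == len(vb) and
            all(abs(x - y) <= tol
                for a, b in zip(va, vb) for x, y in zip(a, b)))

original  = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"
roundtrip = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"   # same order: safe
reordered = "v 1 0 0\nv 0 0 0\nv 0 1 0\nf 2 1 3\n"   # order changed: unsafe
```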

What is the Contour?

It's a markerless motion capture technology developed by Mova (http://www.mova.com/). See: Contour Reality Capture (http://www.mova.com/technology.php?t=capture)

How do I import a Contour file?

  1. File -> Import -> Import Contour.
  2. Pick the first .obj of an .obj sequence.
  3. You'll need to place the timeline cursor at the frame number of the .obj file to see the result.

What is the Point Oven?

Point Oven is a commercial suite of plug-ins designed to bake vertex and fcurve data to streamline pipelines, and to transfer data between different applications. Point Oven can be used to simplify your existing scenes by baking complex deformations, or to share data with other users who have Point Oven but may use a different 3D application. Point Oven currently supports Softimage, Maya, 3ds Max, Lightwave and Messiah. See: Point Oven (http://www.ef9.com/ef9/PO1.5/PointOven_15.html).

This page was last modified 20:29, 6 Aug 2009.

© Copyright 2009 Autodesk Inc. All Rights Reserved.