FAQ

Q: What is LiveAction?

A: LiveAction is a plugin for Unreal Engine 4 that streams animation from mocap systems onto a character in UE4.

 

Q: What mocap systems are supported?

A: Vicon Blade, OptiTrack Motive and XSENS MVN

 

Q: What other systems are supported?

A: Streaming can also be set up via VRPN. Contact IKinema Support for additional information.

 

Q: What can LiveAction utilise from the data stream?

A: Solved skeletons and rigid bodies

 

Q: Can LiveAction be used for props?

A: Yes, there is a simple setup to stream a rigid body into Unreal. However, the object must be a skinned mesh with at least one joint.

 

Q: What are the requirements for a proper avatar?

A: UE4 accepts all skeletal mesh imports, but a well-prepared character has: a single joint hierarchy with a root bone; no internal or non-uniform scales; no multiple skin bind poses; no groups, locators or meshes between joints; good smoothing groups with valid normals and binormals; and only polygonal, skinned meshes.

 

Q: My character doesn't have a root bone; why do I need one?

A: IKinema LiveAction doesn't require it. However, if you plan to use the UE4 Sequencer to record, you should have a root bone for proper animation recording.

 

Q: What types of mocap workflow are supported?

A: Retargeting or rigid-body solving.

 

Q: Can LiveAction solve from a marker cloud?

A: Not at this stage. This will be added in future releases.

 

Q: I have FBX animations for my character; can I use them with LiveAction?

A: No. LiveAction is a direct stream from your mocap system. You may want to look at IKinema Action for games, also a product for Unreal Engine 4 by IKinema.

 

Q: Can LiveAction be used along with game logic?

A: Contact us for additional information.

 

Q: Does LiveAction support facial solve?

A: Not at this moment. You can use third-party plugins that stream and utilise facial data along with LiveAction. LiveAction doesn't exclude usage of additional workflows.

 

Q: Does LiveAction support gloves?

A: No, but you can use the gloves by applying their data to the finger hierarchies with native Unreal nodes. LiveAction doesn't exclude usage of those workflows. Make sure, however, that you exclude the glove-driven joints from the IKinema rig.

 

Q: Is there a restriction to humanoid characters?

A: No. You can directly map source to target, fully or partially, with or without an FK joint source. Please see the advanced demos on our YouTube channel.

 

Q: Does LiveAction handle non-proportional source and target?

A: Yes. The IKinema rig is highly customizable. You can manipulate task offsets and reduce or eliminate the FK source for example.

 

Q: How many characters can be solved simultaneously with LiveAction?

A: There is no limit imposed by LiveAction; it depends on how many characters the mocap system can handle.

 

Q: Does LiveAction handle proper positioning in the scene and interaction with props the way they are in the mocap stage?

A: Yes, even if your avatars have an import scale very different from the scene scale. To make it work, make sure you:

-Set up a proper Import Scale in the Rig file;

-Place all avatars and props at the same reference point (for example at 0,0,0 in the scene; you can translate them whenever you want, just keep them at the same reference translation)

-Apply the inverse scale to match the scene size. This means you scale the avatar placed in the Unreal scene, using for its uniform scale the formula (1/Import Scale)*UnitConversion. Unit conversion means cm to meters and similar; for example, for OptiTrack it's *100.
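As a sketch of the arithmetic above (the Import Scale value of 0.01 here is a hypothetical example, not a recommended setting):

```python
# Worked example of the uniform-scale formula above.
# Hypothetical values: an Import Scale of 0.01 set in the Rig file,
# and an OptiTrack stream in metres converted to Unreal centimetres (*100).
import_scale = 0.01
unit_conversion = 100  # metres -> centimetres

uniform_scale = (1 / import_scale) * unit_conversion
print(uniform_scale)  # 10000.0
```

So an avatar imported at scale 0.01 from an OptiTrack stream would get a uniform scale of 10000 in the Unreal scene.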

 

Q: I can't see the "IKinema Create Rig" option?

A: Please verify that you have the LiveAction plugin loaded and that its license is still valid and has not expired.

 

Q: My trial project doesn't work today but worked yesterday?

A: Check the Unreal 4 Output Log to see if your license has expired.

 

Q: Is there a fast way to match the source pose to avatar pose?

A: With OptiTrack and XSENS you can use the Zero Source Rotations checkbox in the rig editor. It is currently not available for Vicon Blade; instead, stream a single frame with your desired pose defined.

 

Q: Saving and reusing matching templates doesn't work?

A: They might not have been stored if Unreal 4 doesn't have permission to write to your drive. Restart UE4 and run it as an Administrator.

 

Q: Suddenly Unreal 4 becomes slow when I open a project with LiveAction.

A: Please check your animation blueprint. The server configuration might be incorrect, in which case UE4 will constantly try to reconnect for about 30 seconds. To avoid this when migrating projects or changing servers, save the project with the Reconnect checkbox unchecked in the Mocap Stream node.

 

Q: I can't connect to the mocap server?

A: First, check that you have a valid license; an error message appears in the Output Log if you don't. Then check that you have entered the proper server settings in the MocapServer node and the proper system type in the Retargeting node in the blueprint. Finally, see our documentation page for the settings and ports to use for each mocap system. Also, make sure the Reconnect checkbox is checked and compile. In rare cases, try closing and restarting Unreal 4.

 

Q: I recorded animation with the UE4 Sequencer but my character is twisted and messed up?

A: This happens if your avatar doesn't have a root bone. Create one and reimport the skeletal mesh. The LiveAction rig should be fine; there is no need to redo the setup.

 

Q: I added hand tasks and the solver became unstable?

A: As a rule of thumb, when you create hand tasks, immediately reduce the arm joints' retargeting gain, for example from the default 1 to 0.5.

 

Q: Floor penetration doesn't seem to work?

A: Please verify that your floor has collision enabled and is a collidable object for your character. A quick test is to drop a plane into your scene and see whether the penetration works with it.

 

Q: Rigid Bodies are streaming into the IKinema Rig Editor at 0,0,0

A: If all of your streaming settings are correct (see the LiveAction documentation) and your Rigid Body data is still being imported at 0,0,0, please do the following:

1: In your Unreal project, open the console and enter "a.UseUniCast 1".

2: In your Motive project, change the "Transmission Type" from "MultiCast" to "Unicast".

Note: When doing a rigid-body setup, do not pause playback in Motive. Play only the range where the actor is in its desired pose; once the data is imported into the IKinema Rig, you can go back to playing the entire take.

 

Q: I've set up all the correct parameters, but when I import my source data into the IKinema Editor there is no skeleton and the "Import Scale" is at 0?

A: If all of your import and streaming settings are correct but the "Import Scale" is set to 0 when you try to import your mocap source, check that the first bone in the hierarchy is your character's hip bone. By default, LiveAction removes the root bone from the solve; however, if other bones sit higher in your skeleton than the hip bone, create a new IKinema rig for your character and delete those excess bones. You can do this in the IKinema Rig Editor by selecting the bone or bones in the Hierarchy panel and pressing Delete.

Your character hierarchy should look like the image below (notice the hip bone is first in the hierarchy).

 

Q: How do I find my streaming machine's address?

A: To find your machine’s address, follow the steps below:

- Open ‘Command Prompt’

- Enter ‘ipconfig’

- Look for your ‘IPv4 Address’