0 - Introduction
This tutorial is intended to accompany the HumanoidVR demo project found on our demo and assets page here.
Here is a video of the same set up placed in a different scene: Click.
The project itself is complete and contains all of the material covered in this tutorial. It is recommended to review it and see the logic contained in the blueprints in action!
The Humanoid project is very similar to the gnomeVR project (video tutorials found here: part 1, part 2) with a few key differences, mainly the new specially tuned, pre-oriented "Left and Right Motion Controller Tasks" within the IKinema Rig.
Upon completing this tutorial, we will have covered the following topics:
- Project Creation.
- Setting up the Character Blueprint.
- Adding Necessary Components To The Character Blueprint.
- Assigning Meshes To Components.
- Aligning The VR System.
- Setting Up The Blueprint Logic.
- Creating The IKinema VR Rig.
- Setting Up The Character's Animation Blueprint.
One of the mandatory steps in achieving full-body, procedurally generated animation with IKinema's RunTime is creating an IKinema Rig. This applies in VR too, with a specialized "VR Rig". In the IKinema Rig, the user can specify exactly which bones should be given constraints. A constraint can consist of a positional task, an orientation task, or both.
The end result of IKinema's IK solver is to pose the entire body of your character in order to allow the bones to meet their assigned position/orientation tasks. Within the rig, the user is given complete control over a number of parameters that alter the behavior of the IK Solver, to fine-tune the end results. For the sake of this tutorial, we will be using default values where possible since they have been extensively tested to give reliable behavior.
For VR, we are primarily interested in posing your in-game character based on the real-life pose of the player. With motion controllers, we are able to get the position and orientation of your player's hands and head, which we can then use as targets for the hands and head of your character in-game.
Gathering this data takes place in the character blueprint within Unreal Engine; it is then accessed from the animation blueprint and fed to the IKinema for VR node. The output of this node is procedurally generated animation that follows your player's upper-body movements.
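To make the "component space" idea concrete, here is a minimal sketch (plain C++, not the Unreal API) of converting a tracked world-space position into a component's local space. It assumes, for brevity, that the component's world transform is a translation plus a rotation about the vertical axis only; in the actual blueprint, Unreal's transform nodes handle the full case with arbitrary rotation and scale.

```cpp
#include <cmath>

// Minimal 3-D vector; stands in for Unreal's FVector in this sketch.
struct Vec3 { double x, y, z; };

// Convert a world-space point into a component's local space, assuming the
// component's world transform is a translation plus a yaw rotation about the
// vertical (Z) axis. Illustrative only; Unreal's inverse-transform nodes
// perform the general version of this.
Vec3 worldToComponentSpace(const Vec3& worldPoint,
                           const Vec3& componentOrigin,
                           double yawRadians) {
    // Translate so the component origin sits at (0,0,0)...
    double dx = worldPoint.x - componentOrigin.x;
    double dy = worldPoint.y - componentOrigin.y;
    double dz = worldPoint.z - componentOrigin.z;
    // ...then rotate by the inverse yaw.
    double c = std::cos(-yawRadians), s = std::sin(-yawRadians);
    return { c * dx - s * dy, s * dx + c * dy, dz };
}
```

With zero yaw this reduces to a simple offset from the component's origin, which is why aligning the VROrigin correctly (covered in section 2.3) matters so much.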
1 – Project Creation
The first step is to create a project. Upon launching Unreal Engine, we will be met with the project creation prompt. Make a blank project with starter content included.
Name your project and click on create.
2 – Setting Up Character Blueprint
With a new project created and open, the first step is to create a blueprint for your VR character.
In the content browser, navigate to a convenient folder to store your character blueprint in.
In said directory, right click in the content browser -> click on "Blueprint class".
A "Pick Parent Class" dialogue will pop up. Choose "Character" as the parent class as below. This will create a new blueprint which includes a number of components and preset options set by default.
Choose a suitable name for your blueprint and click OK.
With the blueprint added and visible in the content browser, double click on it to open up your newly created blueprint.
2.1 - Adding The Needed Components To The Blueprint
In order to receive data from the motion controllers and see a character in game we will need to add some components to our blueprint.
With your blueprint open, select the "Viewport" so we will be able to see our character and components as we add them.
In the Components tab, a number of components will already be present, such as the "Capsule" component and the skeletal "Mesh" component. These are included by default when you inherit from the "Character" parent class.
For a VR character, we will need to include a few other components to allow for VR functionality. These are primarily "Motion Controller Components", but we will also include some static meshes for the purpose of debugging. We also need to define a point of origin in our VR system in the form of a scene component which will be the parent of our controllers.
On creation, the blueprint will contain the capsule component, the arrow component and the skeletal mesh component by default. Add the remaining components and copy the hierarchy shown below:
The hierarchy is important as it defines the coordinate space of our VR system. We will need the Motion Controller Components and the Camera component to be children of the "VROrigin" scene component as above.
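The parenting rule above can be sketched in a few lines (plain C++, ignoring rotation and scale for brevity): a Motion Controller component reports its tracked pose relative to its parent, so its world position is its parent's position plus its local offset, and moving the VROrigin moves the whole tracked space with it.

```cpp
// Minimal 3-D vector standing in for Unreal's FVector.
struct Vec3 { double x, y, z; };

// World position of a child component, given its parent's world position and
// the child's local (tracked) offset. Parent rotation/scale omitted for
// brevity; Unreal composes the full transforms internally.
Vec3 childWorldPosition(const Vec3& parentWorld, const Vec3& childLocal) {
    return { parentWorld.x + childLocal.x,
             parentWorld.y + childLocal.y,
             parentWorld.z + childLocal.z };
}
```

This is why the controllers and camera must be children of VROrigin: repositioning that single scene component re-bases every tracked device at once.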
2.2 - Assigning Skeletal/Static Mesh And Motion Controller Type
In the character blueprint we will need to assign the skeletal mesh we wish to use for our character.
First, with the "Mesh" component (what will be the skeletal mesh of our character) selected, look for the "Skeletal Mesh" option in the Details panel in your blueprint window shown below.
Browse to your desired asset and select it. In our example, we will use "IKinemaManX_VR".
For the purpose of debugging, we will also assign a mesh to our static mesh components. This gives us the option to make them visible in game, allowing us to visually debug any problems.
Do the same for the static mesh components that are children of the "Left/Right MotionController" components (LeftControllerMesh and RightControllerMesh in this tutorial).
The assets we want to assign to these are included with the IKinema RunTime plugin; they are called "L_ViveMotionController" and "R_ViveMotionController" respectively.
Note: If you cannot find these assets - in your content browser, click on the "View Options" icon in the bottom right. With this open, tick "show plugin content". They will now be visible if you search for them.
For the LeftMotionController and the RightMotionController, set their "Hand" option to left and right respectively.
2.3 - Aligning the VROrigin And Motion Controllers.
In your character Blueprint, the VROrigin, Motion Controller components and Camera component must all be aligned with one another and positioned at the character's head to match what would be its viewpoint.
Set the locations of the Motion controllers and camera component to (0,0,0) in the "Transform" section in the details tab. This will align them to the VROrigin.
Then position the VROrigin so that the camera is positioned where the character would see from.
2.4 - Setting Up The Blueprint Logic
In your character blueprint's event graph, we will need to include logic that does the following:
- Correctly aligns the HMD with the in-game Camera.
- Prompts the user to enter a T-Pose and press a combination of buttons when they are ready. This is covered in detail here: (https://ikinema.com/index.php?mod=documentation&show=184&id=304).
- Calculates the positions of the motion controllers and HMD in the real world to use as target locations for your character's arms/hands in-game. This is shown below.
These transforms are calculated in component space, which is what our animation blueprint needs. We will set up our Animation Blueprint to access these variables, which will be used by the IKinema solver.
Since this logic falls outside of the realm of IKinema features, please refer to the character Blueprint in the sample VR project included at the top to see the full implementation of logic used.
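One piece of that logic, the T-pose step, is commonly used to derive a scale factor between the player's height and the character's, so that real-world hand offsets map sensibly onto a taller or shorter character. The sketch below is an illustration of that general idea, not the sample project's actual implementation; all names here are made up.

```cpp
// Sketch of T-pose calibration: capture the HMD height when the player
// confirms the T-pose, then derive a scale factor between the player and
// the character. Illustrative names only; not taken from the sample project.
struct Calibration {
    double playerEyeHeight = 0.0;   // HMD height captured at T-pose
    bool   calibrated      = false;
};

void calibrate(Calibration& cal, double hmdHeightNow) {
    cal.playerEyeHeight = hmdHeightNow;
    cal.calibrated = true;
}

// Ratio used to scale real-world offsets onto the character.
double playerToCharacterScale(const Calibration& cal, double characterEyeHeight) {
    if (!cal.calibrated || cal.playerEyeHeight <= 0.0)
        return 1.0;                  // no calibration yet: no scaling
    return characterEyeHeight / cal.playerEyeHeight;
}
```

Again, for the exact logic used, refer to the sample project's character Blueprint.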
3 – Creating the VR IKinema Rig
Navigate to your character's skeleton in the content browser, right click on it -> select "IKinema Rig Actions" -> "Create VR IKinemaRig".
This will give you a warning message about the number of bones in your IKinema Rig. Click OK.
Open your newly created IKinema Rig.
Here, select all non-essential bones and delete them. The essential bones that should remain are shown below.
Now assign the following bones specific tasks as follows:
Head bone – VR HMD Task
LeftHand – Left Motion Controller Task
RightHand – Right Motion Controller Task
LeftFoot – Foot Task
RightFoot – Foot Task
Hips – Hips Task
LeftLeg – Knee Task
RightLeg – Knee Task
All of the tasks have default values that have been tested to ensure responsiveness and accurate IK solving. You do not need to edit these in order to get good behavior.
When you assign the HMD and Motion Controller tasks, they will automatically align with the character's hands and head to ensure good correspondence between your character's hands and those of your player.
4 - The Animation Blueprint
The Animation Blueprint is where we combine all of the above to handle the animation of our character in game.
We will need to get the head and hand transforms we receive from our motion controller setup in our character Blueprint.
With these variables stored in our Animation Blueprint as below in the event graph, the next step is to place an instance of the "IKinema for VR" node.
In the AnimGraph, right click and search for "IKinema for VR". You should see a node that uses the rig we created earlier.
Click on this to place the node.
The node must be set up with supporting nodes as shown below. The task setup takes the transform data coming from the VR motion controllers and HMD (in your character's component space) and uses these as target positions for your character's head and hands.
You must set the alpha values for these tasks to 1.
IMPORTANT: Ensure that the 'Hips Transform' option is set to the first bone included in your IKinema Rig (otherwise you will not get any solving).
Connect the output of the "IKinema For VR" node to the "Final Animation Pose". You are now free to test out your HumanoidVR project.
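The alpha value behaves like a standard linear blend weight: at 1 the task target fully drives the result, at 0 the task is ignored. A one-line sketch of that behavior (illustrative, not the solver's internals):

```cpp
// Linear blend between the incoming pose value and the task target.
// alpha = 1 -> target fully drives the result; alpha = 0 -> task ignored.
// This mirrors how an alpha pin weights a task's influence (sketch only).
double blendTask(double inputPose, double taskTarget, double alpha) {
    return inputPose + alpha * (taskTarget - inputPose);
}
```

This is why the tasks driven by live tracking data need alpha set to 1: any lower value would only partially pull the character toward the tracked targets.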
Bonus - Crouching Behavior:
This is still experimental, but we have been seeing very interesting results, shown in the video at the top of this tutorial. You can drive the hips of your character by freeing up their positional task along the vertical axis. This will move the hips up and down with the motion controllers, causing your character to crouch with your player.
To do this, in the IKinema Rig, with the constraints group selected, select the hips task. Set the position weight in the Y direction (Y is usually the up axis in solver space; see below) to zero. This frees the hips to move along the vertical axis with your player.
NOTE: The positional weights are in 'solver space'. Solver space is the coordinate system centred on the parent of the hips in the skeletal hierarchy, which is usually the Root bone. In this space, 'Y' is usually the vertical axis.
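The effect of zeroing one axis of the position weight can be seen in a small sketch: the weighted error a positional task contributes simply vanishes along any axis whose weight is zero, so the solver stops pulling the hips along that axis. This is an illustration of the general idea, not IKinema's internal formulation.

```cpp
// Minimal 3-D vector standing in for Unreal's FVector.
struct Vec3 { double x, y, z; };

// Per-axis weighted position error for a task (sketch). With weight.y = 0,
// the vertical component of the error is always zero, so the task no longer
// constrains the bone along that axis.
Vec3 weightedPositionError(const Vec3& target, const Vec3& current,
                           const Vec3& weight) {
    return { weight.x * (target.x - current.x),
             weight.y * (target.y - current.y),
             weight.z * (target.z - current.z) };
}
```

With the vertical weight zeroed, the hips are held in place horizontally but left free to follow the player up and down, which is exactly the crouching behavior described above.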