Version 1.20 of RUIS for Unity is now available for download. The video below demonstrates some of the new features added since the last release; most of them relate to avatar body modification and support for different motion capture systems.
The RUIS toolkit is useful when developing VR studies or VR experiences that employ avatars: as in the previous RUIS version, any rigged humanoid 3D model that can be imported into Unity will work as a motion-captured full-body avatar. The avatars can also be easily embodied from a first-person perspective when using a VR headset.
The new version has a variety of avatar modification parameters (body segment thickness, translation and rotation offsets, etc.) that can be used to customize the avatars. The parameters can be scripted and adjusted in real time to create body morphing effects.
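As a rough sketch of what such run-time adjustment could look like, the component below oscillates a body thickness parameter every frame. Note that the field name `torsoThickness` is a hypothetical placeholder; check the RUISSkeletonController Inspector for the actual parameter names exposed by your RUIS version.

```csharp
using UnityEngine;

// Illustrative sketch only: morphs an avatar's body segment thickness
// at run-time. "torsoThickness" is an assumed parameter name, not a
// confirmed member of the RUIS API.
public class AvatarMorphExample : MonoBehaviour
{
    public RUISSkeletonController skeleton; // assign in the Inspector
    public float morphSpeed = 0.5f;

    void Update()
    {
        // Oscillate the thickness between 0.5x and 1.5x over time
        float thickness = 1f + 0.5f * Mathf.Sin(Time.time * morphSpeed);
        skeleton.torsoThickness = thickness; // hypothetical field
    }
}
```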
In contrast to the previous RUIS versions, which worked mainly with Kinect v1 and v2, the new version makes it easy to utilize any motion capture system, whether optical (e.g. OptiTrack, Vicon) or IMU-based (e.g. Perception Neuron, Xsens). These full-body motion capture systems can be paired with any VR headset supported by Unity, so that the headset pose is tracked with the headset's own tracking system. This is in contrast to existing solutions offered by e.g. OptiTrack and Vicon, which require you to use their motion capture system to track everything, including the VR headset; that results in added latency and an inability to utilize the time-warp and space-warp features of Oculus or HTC Vive.
As it is, this newest RUIS version is a bit rough around the edges and still contains a lot of legacy code, which manifests itself as roughly 100 deprecation-related warnings upon compilation. I hope to release a new version by summer that fixes or mitigates the remaining issues.
The documentation is also incomplete, and my plan is to update it in April. Meanwhile, below you can find screenshots and explanations of the new features that are otherwise undocumented.
The cropped Unity Editor screenshot below presents an example of using the OptiTrack motion capture system to animate avatars in real time with RUIS. First, import the Unity plugin of your mocap system into your project. Then create a GameObject for each tracked body joint; their Transforms will be updated with the world positions and rotations of the tracked joints. In this example the updating is achieved via the OptitrackRigidBody script, which comes with the MotiveToUnity plugin. In the case of OptiTrack, stream the skeleton as individual rigid body joints instead of streaming the skeleton data format. Please note that there is no “AvatarExample” scene within the RUIS release; you could use e.g. the MinimalScene example as a starting point for your avatar experiments in RUIS.
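If your mocap system's plugin does not provide a ready-made script like OptitrackRigidBody, the idea can be sketched as a generic stand-in component that copies a streamed world pose onto the joint GameObject's Transform each frame. `GetLatestPose()` here is a hypothetical placeholder for whatever streaming call your plugin actually exposes.

```csharp
using UnityEngine;

// Generic stand-in for a mocap plugin script such as OptitrackRigidBody:
// copies the latest streamed world pose onto this GameObject's Transform.
public class TrackedJointUpdater : MonoBehaviour
{
    void Update()
    {
        // The pose would come from your mocap plugin's streaming API
        var pose = GetLatestPose();
        transform.position = pose.position;
        transform.rotation = pose.rotation;
    }

    private (Vector3 position, Quaternion rotation) GetLatestPose()
    {
        // Placeholder: replace with a call to your mocap plugin's API
        return (Vector3.zero, Quaternion.identity);
    }
}
```

RUISSkeletonController then reads these Transforms, so it stays agnostic of which mocap system produced the data.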
Your avatar GameObject has to have the RUISSkeletonController script. To get started, use the ConstructorSkeleton prefab as a ready-made example. When using motion capture systems other than Kinect, make sure that the “Body Tracking Device” field is set to “Generic Motion Tracker”. Note the two settings pointed to by the magenta arrows, which should be enabled when using an IMU mocap suit (e.g. Perception Neuron, Xsens) together with a head-mounted display.
Scroll down in the RUISSkeletonController script Inspector to see the “Custom Mocap Source Transforms” section. Drag the aforementioned GameObjects (the ones that will be updated with the joint poses) into the corresponding joint fields. Notice the “Coordinate Frame [and Conversion]” field outlined by the magenta rectangle. That setting links your motion capture system with a specific coordinate frame (“Custom_1”) for this avatar, and allows you to apply any coordinate alignment and conversions that are required to make the avatar function properly in Unity and together with other input devices supported by RUIS.
To access the coordinate conversion settings, enable the associated “device” (Custom 1) in the RUISInputManager script. You only need to adjust these settings if the avatar ends up being animated incorrectly, for example if the joints point in different directions in Unity than in the motion capture software. The below example shows which “Input Conversion” settings are needed to make the avatar work properly with joint data streamed from OptiTrack’s old Motive 1.0 software from early 2013. Basically, the input conversion makes the streamed joint position and rotation format conform to Unity’s left-handed coordinate system.
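To illustrate the kind of transformation these settings perform, the sketch below maps a right-handed mocap pose into a left-handed frame by flipping the Z axis, which is one common convention. The exact axes to flip depend on your mocap software's coordinate conventions, so treat this as an example of the underlying math rather than the specific settings Motive requires.

```csharp
using UnityEngine;

// Example of a right-handed-to-left-handed coordinate conversion,
// assuming the only difference between the frames is a flipped Z axis.
public static class MocapConversion
{
    public static Vector3 ConvertPosition(Vector3 rightHanded)
    {
        // Negate Z to mirror the position into the left-handed frame
        return new Vector3(rightHanded.x, rightHanded.y, -rightHanded.z);
    }

    public static Quaternion ConvertRotation(Quaternion q)
    {
        // Mirroring across the XY plane negates the x and y components
        return new Quaternion(-q.x, -q.y, q.z, q.w);
    }
}
```

In RUIS itself you do not write this code; you express the equivalent axis flips through the “Input Conversion” settings of the RUISInputManager.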
If you are using an optical motion tracking system (e.g. OptiTrack, Vicon) together with a VR headset like Vive or Oculus Rift, then in most cases you want to align the coordinate frame of the motion tracking system with the coordinate frame of the VR headset’s tracking system. Such alignment happens via a calibration process, and it is not necessary when using an IMU mocap suit (e.g. Perception Neuron, Xsens) together with a VR headset. The calibration occurs in the calibration.scene that comes with RUIS. You need to edit the scene so that the “Custom 1 Pose” GameObject’s world position and rotation are updated with the pose of a joint streamed from your motion capture system. If necessary, also edit the “Input Conversion” settings of the RUISInputManager script, which is located at the RUIS -> InputManager GameObject of the scene.
You can align the coordinate frames of two input devices by running the calibration.scene in the Unity Editor. Alternatively, you can initiate the calibration process at run-time via the RUIS menu, which can be accessed by pressing the ESC key in any of the example scenes that come with RUIS. Click the green button under the “Device Calibration” label to open a drop-down menu of devices that can be aligned; the available options depend on the enabled devices and detected VR headsets.
Once you have selected the device pair from the drop-down menu, click the “Calibrate Device(s)” button to start the process of aligning their coordinate frames.