RUIS for Unity

Version 1.06

- Tuukka Takala

technical architecture, implementation

- Mikael Matveinen

implementation

- Heikki Heiskanen

implementation

For updates and other information, see http://ruisystem.net/

This is the free version of RUIS, in which reading and writing XML files (inputConfig and display) is not functional. The full version is coming to the Unity Asset Store later.

Introduction

RUIS (Reality-based User Interface System) gives hobbyists and seasoned developers easy access to state-of-the-art interaction devices, so they can bring their innovations into the field of virtual reality and motion-controlled applications. Currently RUIS for Unity includes a versatile display manager for handling several display devices, and supports the use of Kinect 1, Kinect 2, Oculus Rift DK2, and PlayStation Move together in the same coordinate system. This means that avatars controlled by Kinect can interact with PS Move controlled objects; for example, a player represented by a Kinect-controlled barbarian avatar can grab a PS Move controller that is rendered as a hammer within the application.

Quickstart

Try the example scenes in the \RUISunity\Assets\RUIS\Examples\ directory. You can develop and test your own motion-controlled applications even if you have only a mouse and keyboard, because in RUIS they can emulate 3D input devices.

Most RUIS scripts have comprehensive tooltip information, so hover the mouse cursor over any variable of a RUIS component in the Unity Editor's Inspector tab to learn more about RUIS.



Installation

RUIS for Unity requires Unity 4 (both Windows and OSX are supported). It has been tested with version 4.3.4.

Optional drivers and software

Installing OpenNI for Kinect 1 / ASUS Xtion / PrimeSense Sensor

You only need to follow this section if you plan to use Kinect 1 with RUIS on your computer; otherwise you can skip it. RUIS for Unity takes advantage of the “OpenNI Unity Toolkit”, which requires the 32-bit Windows version of OpenNI 1.5.4.0 (OpenNI 2.0 is not yet supported). If you have Windows 8 or “Kinect for Windows”, you should also read the Troubleshooting section at the end of this readme.

You need to install the OpenNI and NITE middleware before using Kinect 1 in RUIS. Check the validity of your installation by running the NiSimpleViewer example application in the \OpenNI\Samples\Bin\Release\ directory. If it shows a depth image from Kinect 1, you have successfully installed OpenNI.

Windows 7 / Vista / XP

Before installing Kinect 1, make sure that you have uninstalled all existing OpenNI, NITE, PrimeSense, and SensorKinect instances from the Control Panel’s “Uninstall a program” section, and reboot your computer. Download the following OpenNI installation file package:

https://drive.google.com/file/d/0B0dcx4DSNNn0WVFwVExDRnBBUkk/edit?usp=sharing

Unzip the downloaded package and install its files in the following order (if the download link is dead, you need to search the web for the files below):

  1. OpenNI-Win32-1.5.4.0-Dev1.zip
  2. NITE-Win32-1.5.2.21-Dev.zip
  3. SensorKinect093-Bin-Win32-v5.1.2.1.msi
  4. Sensor-Win32-5.1.2.1-Redist.zip

Windows 8

Using the same files as for Windows 7, follow this procedure:

  1. Uninstall any existing OpenNI, NITE, and Kinect 1 drivers.
  2. Press Windows key + R to open the Run prompt.
  3. Run: shutdown.exe /r /o /f /t 00
  4. Select Troubleshoot.
  5. Select Advanced options.
  6. Select Startup Settings and then Restart.
  7. Choose the Disable Driver Signature Enforcement option.
  8. Reinstall OpenNI (32-bit version), NITE, and the Kinect 1 driver.

OSX

Kinect on OSX is not supported at the moment. RUIS uses the “OpenNI Unity Toolkit”, which supports only the 32-bit Windows version of OpenNI.

Oculus Rift with Kinect, PS Move, and Razer Hydra

RUIS features the MecanimBlendedCharacter prefab, which is a feature-rich character controller for applications with a first- or third-person view. This prefab’s scripts automatically use Oculus Rift orientation tracking and combine it with positional tracking from Kinect, PS Move, or Razer Hydra. The tracking device is decided at runtime depending on which devices are enabled in RUIS’s InputManager.

The MecanimBlendedCharacter prefab contains a human 3D model that is animated with Kinect 1 or Kinect 2. You can substitute your own model for the default one. The Mecanim walking animation overrides pose input from Kinect whenever the player is moving the character with the keyboard, gamepad, PS Move Navigation controller, or Razer Hydra controller. You can use your own Mecanim animation graph and use RUIS features to write a script that blends Mecanim animation with Kinect pose data in real time. See KinectTwoPlayers or OculusRiftExample at \RUISunity\Assets\RUIS\Examples\ and modify the MecanimBlendedCharacter gameobjects to get started. When importing the MecanimBlendedCharacter prefab into a new Oculus Rift scene, you need to toggle the “Enable Oculus Rift” option of the RUISDisplay component, found in the following gameobject hierarchy: RUIS > DisplayManager > Main Display (the name of your display).

You can make a third-person application with MecanimBlendedCharacter if you remove its RUISHeadTrackerAssigner component and the HeadTrackers gameobject parented under it, and create a new RUISCamera that follows the MecanimBlendedCharacter. A simpler version of MecanimBlendedCharacter is the ControllableCharacter prefab, which is animated by Kinect and has the same features but does not include the components for blending Mecanim animation.

By default the MecanimBlendedCharacter is affected by gravity, so you should place it on a static Collider. It has PSMoveWand and RazerOffset child gameobjects that each contain a 3D wand that can grab and manipulate objects, when the corresponding devices are enabled. You can delete or modify those objects. The JumpGestureRecognizer child gameobject contains scripts that make the character jump if Kinect is enabled and detects the player jumping. The recognition is far from perfect however, and you can disable the JumpGestureRecognizer if you like.

The MecanimBlendedCharacter prefab uses the Constructor character 3D model that you can replace with your own character model. In that case you need to relink the body part gameobjects in the RUISSkeletonController component (see the right side of the above image), found in the Constructor gameobject. If you don’t use Kinect to animate your character, then you might also need to adjust the Y-coordinate of the StabilizingCollider gameobject’s Transform.

Only Oculus Rift

Use the mouse and keyboard to control the MecanimBlendedCharacter. See the Avatar Controls section below.

Oculus Rift + Kinect

Place the Kinect so that it can see you and the floor. The avatar’s pose and limb lengths are tracked by Kinect, and you need to stand in front of the Kinect during gameplay while wearing the Oculus Rift. You will need a long display cable and a USB extension cord. Some people have gone wireless with the Rift using an Asus WAVI and a custom battery pack. We recommend that you use a wireless gamepad to control the MecanimBlendedCharacter’s locomotion if only Kinect is enabled.

Oculus Rift + Razer Hydra

Place the Razer Hydra base station on your desk so that its cable ports are facing away from you, as you would in any Razer Hydra game. One controller (RIGHT) is wielded normally in your hand and used for avatar locomotion, grabbing objects, etc., while the other (LEFT) needs to be attached to the left side of your head for head tracking:

You can use e.g. a rubber band to tie the Razer controller onto the Rift’s strap. When the scene starts, it asks you to point the LEFT controller (on the head) towards the Razer Hydra base station and to press the trigger button. The same is repeated for the hand-held RIGHT controller.

Oculus Rift + Razer Hydra + Kinect

HIGHLY EXPERIMENTAL: In this head-tracking scheme Kinect provides a rough estimate of the Razer Hydra base station location, which is combined with the Razer controller’s local position. Apply the procedures from the above two sections (Oculus Rift + Kinect and Oculus Rift + Razer Hydra) and make sure of the following: 1) The Razer Hydra controller must be in the same pose in relation to your head as in the above photo (i.e. facing downwards at a 90-degree angle). 2) Attach the Razer Hydra base station to the front of your belly at a 90-degree angle, so that its cable ports are pointing towards the floor:

Duct tape is the easiest and messiest option. The base station should be tightly attached against your stomach to minimize any wobble when you’re moving (such movement will be reflected in the head tracking). Depending on your height, belly, and how accurately you placed the base station on your belt, the viewpoint can initially appear outside of the avatar. You can adjust that offset with panning and base pitch angle buttons of the LEFT Razer Hydra controller. See Head tracking controls section for details.
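The fusion described above boils down to simple vector math: the head position is roughly the Kinect-estimated base station (belly) position plus the Razer controller’s base-relative offset rotated into world space. A minimal sketch of that idea follows; it is illustrative only, the function and variable names are hypothetical, and RUIS’s actual implementation may differ.

```python
# Illustrative sketch only: all names are hypothetical, not RUIS code.

def mat3_mul_vec3(m, v):
    """Multiply a 3x3 matrix (list of rows) with a 3D vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def fuse_head_position(kinect_belly_pos, base_rotation, razer_local_pos):
    """Combine the rough base station position estimated by Kinect (the base
    is strapped to the player's belly) with the Razer controller's position
    relative to that base, rotated into world space."""
    offset = mat3_mul_vec3(base_rotation, razer_local_pos)
    return [kinect_belly_pos[i] + offset[i] for i in range(3)]
```

Any wobble of the base station against the stomach shifts `kinect_belly_pos`, which is why the text recommends attaching it tightly.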

Oculus Rift + PlayStation Move

If PS Move is enabled, it will be used for head tracking regardless of other enabled devices. Attach the PS Move controller designated as GEM[0] in the Move.me software to the topmost strap of the Oculus Rift (two rubber bands work well):

The strap and the rubber bands should be kept tight to minimize controller wobble when you move your head. The Move button should point up towards the ceiling when you stand straight. PS Move GEM[1] acts as a 3D wand that can grab and manipulate objects. If you want to use a PS Navigation controller to make the MecanimBlendedCharacter walk, run, and jump, make sure that the “Controller ID” in the RUISCharacterLocomotion component of MecanimBlendedCharacter corresponds to the ID shown in the PlayStation’s Controller Settings. You can access those settings by pressing and holding the PS Navigation controller’s PlayStation button.

If both PS Move and Kinect are enabled, you need to calibrate their coordinate systems by displaying the menu (ESC), selecting “Kinect - PS Move” from the Device Calibration drop-down menu, and clicking the “Calibrate Device(s)” button (see the Kinect and PS Move calibration section for details).

If PS Move is enabled and Kinect is disabled, you may need to edit the y-value of the translate element in the file ‘calibration.xml’ for the head position to appear at the correct altitude.

Other Features

DisplayManager

You can have your Unity application render 3D graphics on any number of mono and stereo displays when you use RUIS and run your application in windowed mode. Your displays need to be arranged sideways in your operating system’s display settings, because RUIS automatically creates a game window where all the viewports are side by side.

For 3D displays, side-by-side and top-and-bottom modes are supported. The RUIS display configuration can be edited through the DisplayManager gameobject that is parented under the RUIS gameobject. When adding new displays in RUIS, keep in mind that each display (parented under the DisplayManager gameobject) needs to have a RUISCamera gameobject attached to it. If you toggle the “Enable Oculus Rift” option of your RUISDisplay, then the “Attached Camera” will act as an orientation-tracked Oculus Rift camera. To learn about RUIS’s display manager capabilities, see DisplayManagerExample at \RUISunity\Assets\RUIS\Examples\
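As a rough illustration of the side-by-side window layout described above, the sketch below computes normalized viewport rectangles for one wide game window from a list of display widths. This is not RUIS code; the names are hypothetical.

```python
# Illustrative sketch only, not actual RUIS code.

def side_by_side_viewports(display_widths):
    """Given the pixel widths of displays arranged sideways, return
    normalized (x, y, width, height) viewport rectangles that tile one
    wide game window from left to right."""
    total = float(sum(display_widths))
    rects, x = [], 0.0
    for width in display_widths:
        fraction = width / total
        rects.append((x, 0.0, fraction, 1.0))
        x += fraction
    return rects
```

For example, two 1920-pixel-wide displays yield two viewports of normalized width 0.5 each, side by side.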

3D user interface prefabs for selection and manipulation

RUIS for Unity can be used to easily create custom 3D user interfaces with custom selection and manipulation schemes through so-called Wand prefabs. The currently supported wands (input devices) are MouseWand, PSMoveWand, RazerHydraWand, and SkeletonWand (Kinect). These prefabs are found at \RUISunity\Assets\RUIS\Resources\RUIS\Prefabs\Main RUIS\Input Methods\. To see how to use these prefabs, check out BowlingAlley (its PSMoveHand gameobject) and MinimalScene (its MouseWand) at \RUISunity\Assets\RUIS\Examples\

If you do not have Kinect, Razer Hydra, or PS Move, use the MouseWand prefab, which emulates their behavior with a mouse for object manipulation purposes. When a scene is playing, the above-mentioned Wands are used to manipulate gameobjects that have RUISSelectable, Mesh Renderer, Rigidbody, and Collider components. See the crate gameobjects in any of the examples. The selection ray of a Wand is checked against the Collider components of a gameobject (you can have several of them in a hierarchy under one object) to see whether it can be selected (triggered with a button, or a gesture in the case of Kinect) and manipulated by the Wand.

In RUIS for Unity the 3D coordinate system unit is meters, which is reflected in the position values of PSMoveWand, SkeletonWand, and Kinect-controlled avatars. You can translate, rotate, and scale the coordinate systems of SkeletonWand and PSMoveWand by parenting them under an empty gameobject and applying the transformations to it.
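The effect of such a parent transformation on reported positions can be sketched as follows. This is an illustrative approximation of what Unity’s transform hierarchy does to a child’s local position, assuming a uniform scale and a rotation about the Y axis only; it is not RUIS code.

```python
import math

# Illustrative sketch only: approximates Unity's parent transform for the
# special case of uniform scale and a Y-axis (yaw) rotation. Not RUIS code.

def parent_transform(local_pos, scale, yaw_degrees, translation):
    """Apply scale, then a left-handed Y-axis rotation (as in Unity),
    then translation, to a local-space point (x, y, z)."""
    x, y, z = (c * scale for c in local_pos)
    a = math.radians(yaw_degrees)
    rx = x * math.cos(a) + z * math.sin(a)
    rz = -x * math.sin(a) + z * math.cos(a)
    return (rx + translation[0], y + translation[1], rz + translation[2])
```

Scaling the parent by 2 doubles all wand position values in meters, which is why the transformations should be applied on the empty parent rather than on the wand itself.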

Please note that in many 3D user interfaces it makes sense to disable gravity and other physical effects of the manipulated objects; for example, in a CAD interface you don’t want geometric shapes to fall down after moving them.

In the below figure’s RUISPSMoveWand component, “Controller Id 2” corresponds to the controller referred to as GEM[2] on the Move.me screen.

PlayStation Move controllers

If you have the Move.me software for PlayStation 3 and want to use PS Move controllers in your RUIS for Unity scenes, tick the “PS Move Enabled” option in the InputManager gameobject (parented under the RUIS gameobject). Otherwise keep that option unchecked, because the scene will freeze for a long time when entering play mode if RUIS tries to connect to a Move.me server that is not available.

Be sure to set the IP address and port parameters of InputManager to correspond to the ones that the Move.me software is displaying. When building your application with PS Move controller support, remember to allow the executable file through your firewall (both UDP and TCP) so that it can connect with the Move.me server. Please note that pressing the SELECT button turns off the PS Move controller’s light, after which the controller is no longer tracked. This is a feature of the Move.me software.

Kinect controlled full-body avatars

If you have successfully installed the drivers for Kinect 1 or 2 and want to use it in your RUIS for Unity scene, make sure to tick the “Kinect Enabled” / “Kinect 2 Enabled” option in the InputManager gameobject (parented under the RUIS gameobject). You also need an avatar gameobject with the “RUIS Skeleton Controller” script. You can use the ConstructorSkeleton and Mannequin prefabs that are located in \RUISunity\Assets\RUIS\Resources\RUIS\Prefabs\Main RUIS\Common Objects\. Please note that you can use rigged models with either a hierarchical bone setup (e.g. ConstructorSkeleton) or a flat, one-level bone setup (e.g. Mannequin). For the latter, you need to uncheck the “Hierarchical Model” option in the “RUIS Skeleton Controller” script.

You can translate, rotate, and scale your Kinect-controlled avatars by parenting them under an empty gameobject and applying the transformations to it.

After calibrating Kinect in RUIS (see “Kinect and PS Move calibration” section of this document), you can take advantage of the following useful features:

  1. The “set Kinect origin to floor” feature can be turned on from the InputManager gameobject. With this option enabled, the Kinect-controlled avatars will always have their feet on the XZ-plane (provided that the avatar gameobjects have correct, model-dependent Y-offsets), no matter what height your Kinect is placed at.

  2. You can tilt your Kinect downwards, and RUIS will use a corrected coordinate system where the XZ-plane is aligned along the floor, preventing OpenNI skeletons from being skewed in Unity. To make use of these features, you need to calibrate the Kinect and PS Move.
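The tilt correction described above amounts to rotating Kinect-space points about the X axis by the sensor’s pitch angle, so that the floor lands on the XZ-plane. A minimal sketch of that rotation follows; it is illustrative only, the sign convention is an assumption, and the estimation of the angle from the detected floor plane (which RUIS/OpenNI perform) is omitted.

```python
import math

# Illustrative sketch only: the angle estimation from the detected floor
# plane is omitted, and the sign convention is an assumption.

def correct_tilt(point, pitch_degrees):
    """Rotate a Kinect-space point (x, y, z) about the X axis to undo a
    downward sensor pitch, so that the floor aligns with the XZ-plane."""
    a = math.radians(pitch_degrees)
    x, y, z = point
    return (x,
            y * math.cos(a) - z * math.sin(a),
            y * math.sin(a) + z * math.cos(a))
```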

Calibrating two different motion trackers to use the same coordinate system

Calibration is needed for using two different motion trackers (e.g. Kinect and PS Move controllers) together in the same coordinate system, and also for aligning the Kinect 1 or Kinect 2 coordinate system with the floor of your room. Calibration needs to be performed only once, but you have to do it again if you move either one of the calibrated devices. The results of the calibration (a 3x3 transformation matrix and a translation vector) are printed in Unity’s output log and also saved in an XML file at \RUISunity\calibration.xml.
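Applying the saved calibration to a tracked point is just a matrix-vector multiplication plus a translation, as sketched below. The names are hypothetical and this is not RUIS code; RUIS reads the actual matrix and vector from calibration.xml.

```python
# Illustrative sketch only: RUIS reads the actual matrix and vector from
# calibration.xml, and the names here are hypothetical.

def apply_calibration(matrix3x3, translation, point):
    """Map a 3D point from one device's coordinate system to the other's:
    result = matrix3x3 * point + translation."""
    return [sum(matrix3x3[r][c] * point[c] for c in range(3)) + translation[r]
            for r in range(3)]
```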

You can calibrate devices by running any of our example scenes in the Unity Editor, pressing the ESC key to display the RUIS menu, enabling those devices whose drivers you have successfully installed, selecting the device(s) that you want to calibrate (e.g. “Kinect - PS Move”) from the Device Calibration drop-down menu, and clicking the “Calibrate Device(s)” button. This will start the interactive calibration process, which we will next describe in more detail for Kinect and PS Move.

Kinect and PS Move calibration

While calibrating, it is important that Kinect sees the floor properly (see the image below for an example). Calibration is done via a special calibration scene (RUISunity\Assets\RUIS\Scenes\calibration.unity), which can be run by pressing the ESC key and choosing “Calibrate Device(s)” inside your scene (it must have the RUIS prefab). You can return to the previous scene by pressing the ESC key in the calibration scene and choosing “Abort Calibration”.

If you are holding a PS Move controller in your hand, the Kinect-controlled avatar’s hand and the PS Move controller’s virtual representation do not always appear in exactly the same position, because no calibration gives perfect results and Kinect is less accurate than PS Move. Snapping of hand locations to handheld PS Move locations might be added in a future release of RUIS for Unity.

If you only have Kinect 1 or Kinect 2, the calibration process can be used to align the Kinect coordinate system’s XZ-plane with the floor plane. For an example of how to use Kinect and PS Move together, please see BowlingAlley at \RUISunity\Assets\RUIS\Examples\


NOTE:

Before calibrating the Kinect and PS Move coordinate systems to match each other, you should keep thrusting your PS Move controller towards and away from your PS Eye camera until the “PS Eye pitch angle” in the RUIS calibration scene’s viewport converges to within 0.1 degree. This is because the PS Eye needs to see the PS Move controller moving for a while before the Move.me software can reliably estimate the pitch orientation of the PS Eye. Calibrating after the pitch angle has converged ensures that the saved coordinate system calibration between Kinect and PS Move will be as accurate as it can be, even after restarting your computer and PlayStation. Please note that Move.me running on PlayStation does not save the pitch angle; after a restart, the pitch angle again converges slowly while you are using the PS Move controller in front of the PS Eye camera.

Example scenes

Examples of using RUIS for Unity can be found at \RUISunity\Assets\RUIS\Examples\

The following two points apply to BowlingAlley and other scenes where you want to use PS Move and Kinect in the same coordinate system:

  1. Calibrate Kinect in RUIS (press ESC and choose “Calibrate” when the scene is running) so that the Kinect-controlled animated characters will have their feet properly on the ground.

  2. If you have the Move.me software for PlayStation 3 and want to use PS Move controllers, tick the “PS Move Enabled” option in the InputManager gameobject (parented under the RUIS gameobject). Otherwise keep that option unchecked, because the scene will freeze for a long time when entering play mode if RUIS tries to connect to a Move.me server that is not available.

OculusRiftExample

This example presents the MecanimBlendedCharacter gameobject, which is a versatile beast. Go to the InputManager gameobject (parented under the RUIS gameobject) and enable any of the following devices that you have connected to your PC: Kinect, Kinect 2, Razer Hydra, and PS Move. These can be used for positional head tracking with the Oculus Rift (DK1 & DK2). When the scene is running, you can control the constructor character with the keyboard, gamepad, PS Move Navigation controller, or Razer Hydra controller. Notice that by default the avatar in this example is set to use Kinect 2, but you can change this to Kinect 1 by modifying the RUISSkeletonController script, located in the Constructor gameobject, which is a child of the MecanimBlendedCharacter gameobject.

KinectTwoPlayers

This example demonstrates how you can create a multiuser Kinect application in RUIS. The Kinect avatars are equipped with Collider components so that they can push objects around. Also notice how the Kinect-controlled SkeletonWands can be used for object manipulation. Currently the only selection and release gesture for SkeletonWands is “hold”: the selectable object highlighted by a SkeletonWand is selected and released by holding the Wand (your hand) still for 2 seconds while pointing at the object.
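The hold gesture described above can be sketched as a dwell check: the hand must stay within a small radius of where the hold started until enough time has passed. The sketch below is illustrative only; the drift threshold and all names are assumptions, not RUIS’s actual implementation.

```python
# Illustrative sketch only: the drift threshold and names are assumptions,
# not RUIS's actual implementation.

class HoldGesture:
    def __init__(self, hold_time=2.0, max_drift=0.05):  # seconds, meters
        self.hold_time = hold_time
        self.max_drift = max_drift
        self.anchor = None
        self.elapsed = 0.0

    def update(self, hand_pos, dt):
        """Feed the hand position once per frame; returns True when the
        hand has stayed near its anchor for hold_time seconds."""
        if self.anchor is None:
            self.anchor, self.elapsed = hand_pos, 0.0
            return False
        drift = sum((a - b) ** 2 for a, b in zip(hand_pos, self.anchor)) ** 0.5
        if drift > self.max_drift:
            # The hand moved too far: restart the hold from here.
            self.anchor, self.elapsed = hand_pos, 0.0
            return False
        self.elapsed += dt
        return self.elapsed >= self.hold_time
```

Using the same recognizer for both selection and release is what lets a single “hold still” gesture toggle an object between grabbed and free.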

BowlingAlley

Bowling with PS Move controller #0: use the trigger button to grab the bowling ball and release it on your throw. The Move button resets the bowling ball position, and the triangle button places the bowling pins. Kinect is used to control a simple mannequin avatar (the Mannequin gameobject). Notice how the Mannequin’s body parts are all parented below it in a flat fashion and that the “Hierarchical Model” option is unchecked in its script, as opposed to the Constructor gameobject parented under the MecanimBlendedCharacter gameobjects in the KinectTwoPlayers and OculusRiftExample scenes.

DisplayManagerExample

Run the scene to see how the settings of the DisplayManager gameobject affect the rendered multi-display output. Additionally you can use the mouse, space key, and WASD keys for simple first-person movement. A MouseWand prefab is present, so you can use the left mouse button to grab the cube objects.

MinimalScene

This scene is a good starting point for a new RUIS scene. You can delete the Floor, Crate, Directional light, and MouseWand gameobjects.

Avatar controls

ControllableCharacter and MecanimBlendedCharacter

  Action                    Keyboard   Gamepad
  Move forward / backward   W / S      Left analog stick
  Strafe left / right       A / D      Left analog stick
  Turn left / right         Q / E      Right analog stick
  Jump                      Space      Joystick button 1, 5
  Run                       Shift      Joystick button 0, 4, 7

When Kinect and the Jump Gesture are enabled, you can jump in real life to make your avatar jump; you need to stand at least 2 meters away from the Kinect, and both of your feet need to clearly lift off the ground.
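The jump detection described above can be sketched as a simple test on tracked foot heights. This is illustrative only; the clearance threshold and the names are assumptions, not RUIS’s actual JumpGestureRecognizer implementation (which, as noted elsewhere in this readme, is far from perfect).

```python
# Illustrative sketch only: the 10 cm clearance and the names are
# assumptions, not RUIS's actual implementation.

def is_jumping(left_foot_y, right_foot_y, floor_y=0.0, clearance=0.10):
    """True when both feet are clearly off the floor at the same time."""
    return (left_foot_y - floor_y > clearance and
            right_foot_y - floor_y > clearance)
```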

  Action                    Razer Hydra (RIGHT, hand-held)   PS Navigation controller (ID 5, hand-held)
  Move forward / backward   Analog joystick                  Analog joystick
  Strafe left / right       Analog joystick                  Analog joystick
  Turn left / right         Buttons 3 / 4                    X / O
  Jump                      Bumper button                    L1
  Run                       Joystick button                  L2

  Action        PS Move controller (GEM[1], hand-held)   PS Move controller (GEM[2], hand-held)
  Grab object   Trigger button                            Trigger button

Head tracking controls

  Action                                      Razer Hydra (LEFT,   PS Move controller          Keyboard
                                              worn on left ear)    (GEM[0], worn above head)
  Reset Oculus Rift yaw                       Bumper + Start       Move button                 Return key
  Start / stop Rift’s automatic calibration   -                    -                           F5
  Start / stop Rift’s manual calibration      -                    -                           F6
  Show / hide Rift’s compass                  -                    -                           F8

The options below are available only in the Kinect + Razer Hydra mode:

  Pan left / right      Bumper + Analog joystick
  Pan up / down         Bumper + Analog joystick
  Increase base pitch   Bumper + Button 3
  Decrease base pitch   Bumper + Button 1

You can start the Oculus Rift’s magnetic drift correction process by pressing either F5 or F6 at any time during a scene. Alternatively, you can enable automatic or manual drift correction from the menu (ESC), in which case the said process will start automatically each time you start the scene.


Troubleshooting

Kinect 1 issues

Kinect for Windows

Microsoft released Kinect 1 for Windows and the Kinect 1 SDK, but they are not compatible with OpenNI. The kinect-mssdk-openni-bridge is an experimental module that connects the Kinect SDK to OpenNI and allows people with Kinect for Windows to use OpenNI applications. This bridge _might_ get RUIS to work with Kinect for Windows, but there are no guarantees:

https://code.google.com/p/kinect-mssdk-openni-bridge/

Razer Hydra

When starting a scene with Razer Hydra, its buttons sometimes get “stuck”, and for example the MecanimBlendedCharacter moves automatically even without anyone touching the buttons. If this happens, restarting the scene or unplugging and reconnecting the Razer Hydra USB cord can help. The Razer Hydra can also sometimes get confused about directions or lose one controller altogether, in which case you need to restart the demo.

PS Move

Check that your computer and PlayStation are connected to the same network, and that the PlayStation is able to obtain an IP address. Make sure that the Move.me server address and port in the InputManager gameobject are the same as displayed in the Move.me software on the PlayStation. Also make sure that “Load from file in editor” is disabled in the InputManager. If you successfully connect RUIS to the Move.me server, the PlayStation 3 screen should display something like this:

If RUIS for Unity is not able to connect to the PlayStation via TCP (the Move.me software displays “Connections: 0”), please check your firewall settings. If your application is connected to the Move.me server but the PS Move state does not update, this may also be a firewall issue (Move.me sends the PS Move state over UDP to RUIS).

The Unity editor and individual standalone executables have to be allowed through the firewall. In a standalone build you have to set the IP address and port inside a file named inputConfig.txt, which needs to be located in the same folder as the standalone executable. For an example of the file format, please check the file provided in \RUISunity\.

In the current RUIS for Unity version, the boolean values that convey PS Navigation controller button states work poorly, because the underlying PS Move wrapper that we use is buggy. This will be fixed soon.

Safety Warning

Wearing head-mounted displays while standing up, moving, walking, or jumping is dangerous to your health and potentially deadly. The author of this software recommends that you avoid the aforementioned actions; if you choose to perform them anyway, you do so at your own risk. The author of this software cannot be held responsible in any way for any consequences.

Software License Limitation of Liabilities

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.