Over the past months we’ve been adding new features to RUIS for Unity, including Oculus Rift and Razer Hydra support. Our just-released TurboTuscany demo showcases the new capabilities of RUIS:
(Video part 2)
TurboTuscany features a first-person view with a Kinect-controlled full-body avatar and four methods for six-degrees-of-freedom (6DOF) head tracking (a rough fusion sketch follows the list):
- Oculus Rift + Kinect
- Oculus Rift + Razer Hydra
- Oculus Rift + Razer Hydra + Kinect
- Oculus Rift + PlayStation Move (+ Kinect)
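Conceptually, each combination fuses the Rift’s fast rotational tracking with absolute position from an external tracker. Here is a minimal sketch of that idea, not RUIS’s actual code; both getter functions are hypothetical placeholders for whatever SDK calls you use:

```csharp
using UnityEngine;

// Minimal 6DOF head-tracking fusion sketch (illustration only):
// orientation comes from the Rift's IMU, position from an external tracker
// (Kinect, Razer Hydra, or PS Move). GetRiftRotation/GetTrackerHeadPosition
// are hypothetical placeholders.
public class FusedHeadTracker : MonoBehaviour
{
    public Transform head; // the camera rig's head transform

    void LateUpdate()
    {
        head.rotation = GetRiftRotation();        // fast, low-latency orientation
        head.position = GetTrackerHeadPosition(); // absolute position, no drift
    }

    Quaternion GetRiftRotation() { /* read from the Oculus SDK here */ return Quaternion.identity; }
    Vector3 GetTrackerHeadPosition() { /* read from Kinect/Hydra/Move here */ return Vector3.zero; }
}
```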
It makes a difference to see your own body in the virtual world; it strengthens the sense of presence. Those of you with Kinect: try pushing and kicking stuff, or even climbing the ladder with your hands. You can take steps freely inside Kinect’s range, and when you need to go further, just use a wireless controller to walk or run like you would in any normal game. We blend your Kinect-captured pose with Mecanim animation, so while you’re running and your feet follow a run animation clip, you can still flail your arms and upper body around as you like.
(Kinect users will need to install the 32-bit version of OpenNI 1.5.4.0. See the readme that comes with the demo for other details.)
Positional head tracking with Kinect alone is quite rough, so try the Razer Hydra + Kinect or PS Move option if you can.
Minimum requirements
- Oculus Rift
- Windows operating system: Vista, Windows 7, Windows 8 (see comment section for details)
Supported input devices
- Razer Hydra
- ASUS Xtion Pro, Kinect for Xbox, Kinect for Windows (possibly; see the Kinect for Windows note below)
- PlayStation Move and PS Navigation controllers (Move.me software and PS3 required)
- Gamepad (any Unity compatible gamepad or joystick)
- Mouse and keyboard
This demo should be pretty fun to try out even with just mouse and keyboard. There are several physics-based activities, and we’ve hidden a bunch of Easter eggs in the scene.
Download links:
https://drive.google.com/file/d/0B0dcx4DSNNn0UEF0U2RMblNjelU/edit?usp=sharing
https://dl.dropboxusercontent.com/u/8247026/TurboTuscany101.zip
http://www.mediafire.com/?77ardrkg4okwhua
1080p version:
https://drive.google.com/file/d/0B0dcx4DSNNn0T2NaRFdNekkwUXM/edit?usp=sharing
Within a month or two we will release a new version of RUIS for Unity that will have all the features we used to create the TurboTuscany demo.
Are you going to support OpenNI 2?
RUIS for Unity will possibly support OpenNI 2 via ZigFu in the future. Even better would be if PrimeSense made their own OpenNI 2 Unity plugin with a less restrictive license. Right now the focus is on Kinect 2 (hopefully we’ll get the beta version this autumn).
Our development is a bit slow, since it’s just two guys working on this in our little spare time (we just used up the last of our funding).
Windows 8:
Windows 8 doesn’t like unsigned drivers. People with Windows 8 have successfully installed OpenNI and gotten TurboTuscany to work with the following procedure:
1) Uninstall OpenNI, NITE, and the Kinect driver.
2) Press Windows key + R to open the Run prompt and enter: shutdown.exe /r /o /f /t 00
3) Select Troubleshoot.
4) Select Advanced options.
5) Select Windows Startup Settings, then Restart.
6) Choose the Disable Driver Signature Enforcement option.
7) Reinstall OpenNI (32-bit version), NITE, and the Kinect driver.
“I followed those instructions and it worked for me. I suspect that only the driver needs to be re-installed. I also had to go into Device Manager and make sure it was pointed at the right drivers, because I had installed the OpenNI SDK drivers and they don’t work.”
Kinect for Windows:
Microsoft released Kinect for Windows and the Kinect SDK, but they are not compatible with OpenNI. The kinect-mssdk-openni-bridge is an experimental module that connects the Kinect SDK to OpenNI and allows Kinect for Windows users to run OpenNI applications. This bridge _might_ get the TurboTuscany demo to work with Kinect for Windows:
https://code.google.com/p/kinect-mssdk-openni-bridge/
If you have problems getting Kinect to work, run TurboTuscany in Administrator mode.
Installing Kinect drivers & software:
Before installing, make sure that you have uninstalled all OpenNI, NITE, PrimeSense, and SensorKinect instances from the Control Panel’s “Uninstall a Program” section. Reboot, then install the following files in this order:
http://www.openni.org/wp-content/uploads/2012/12/OpenNI-Win32-1.5.4.0-Dev1.zip
http://www.openni.org/wp-content/uploads/2012/12/NITE-Win32-1.5.2.21-Dev.zip
https://github.com/avin2/SensorKinect/raw/unstable/Bin/SensorKinect093-Bin-Win32-v5.1.2.1.msi
http://www.openni.org/wp-content/uploads/2012/12/Sensor-Win32-5.1.2.1-Redist.zip
If the links have gone dead, google for those file names.
I’ve got two questions, if you don’t mind answering them:
1) Does the grandma skin model use the exact same base skeletal structure that Kinect tracks? Or do you translate the Kinect motion tracking to a more simplified skeleton?
2) You use either a Hydra or a PS Move attached to the head. Is this to say that Kinect motion tracking is prone to heavy error when tracking the head’s linear (non-rotational) motion?
1) No, the grandma model uses a more complex skeletal structure that came with it when we downloaded it from Mixamo. In our RUIS for Unity toolkit the developer can use pretty much any human-like model and map it to Kinect. The current RUIS toolkit package in the download section is buggy and lacks documentation, but we intend to upload a more polished version by the 10th of February.
2) Yes, exactly.
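For illustration, here is a minimal sketch (not RUIS’s actual code) of how an arbitrary humanoid model can be driven from Kinect joint data through Unity’s Mecanim humanoid mapping; the KinectJoint enum and GetKinectJointRotation are hypothetical placeholders for whatever your Kinect wrapper provides:

```csharp
using UnityEngine;

// Sketch of retargeting Kinect joints onto any Humanoid-rigged model.
// Animator.GetBoneTransform resolves the model's own bone names, which is
// what lets "pretty much any human-like model" be driven the same way.
public class KinectToHumanoid : MonoBehaviour
{
    public Animator animator; // the model's Animator, set up as a Humanoid avatar

    void Update()
    {
        Apply(HumanBodyBones.LeftUpperArm,  KinectJoint.LeftShoulder);
        Apply(HumanBodyBones.LeftLowerArm,  KinectJoint.LeftElbow);
        Apply(HumanBodyBones.RightUpperArm, KinectJoint.RightShoulder);
        Apply(HumanBodyBones.RightLowerArm, KinectJoint.RightElbow);
        // ...and so on for the hips, legs, spine, and head.
    }

    void Apply(HumanBodyBones bone, KinectJoint joint)
    {
        Transform t = animator.GetBoneTransform(bone);
        if (t != null)
            t.rotation = GetKinectJointRotation(joint); // hypothetical wrapper call
    }

    enum KinectJoint { LeftShoulder, LeftElbow, RightShoulder, RightElbow }

    Quaternion GetKinectJointRotation(KinectJoint joint)
    {
        // Placeholder: read the joint orientation from your Kinect/OpenNI wrapper.
        return Quaternion.identity;
    }
}
```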
Shouldn’t it be possible to attach two calibrated flat camera boards with fisheye lenses onto the front of the Rift? That could be useful for turning the Rift into a heads-up display that also shows the real scene in front of the user. And you could attach a PrimeSense board between them to get additional 3D data.
Looking into your real room and turning it into a Command and Conquer battlefield could be a next step. You command your units around with hand gestures and voice commands, placing buildings on the couch, connecting it to the coffee table with a virtual bridge, etc. Endless possibilities.
I was an adviser for a master’s thesis project where two fisheye cameras were used with the Oculus Rift; it was pretty sweet.
Researchers have combined head-mounted displays with real-time Kinect 3D reconstruction. For now I could find only this video, where only the hands are reconstructed with Kinect:
https://www.youtube.com/watch?v=R0-dsbeasgA&t=6m11s
But yeah, this is all cool stuff 🙂
I’m wondering something about the wonderful grandma model and how it moves. I see there’s a walking animation attached to it, for when you’re moving about. I see that you can make her walk using either gestures with your arms or pushing buttons on a controller. Yet, when you kick aside that barrel with your legs, your Kinect-tracked legs become the main controller of the model. Is this a completely trivial thing or did you have to introduce conditional statements to check when to apply either animation or map skeleton tracking to the model?
The “long-distance walking” in the demo is indeed triggered with controller buttons (PS Navigation controller / Razer Hydra / gamepad). Most of the time the character is 100% Kinect-controlled, but when one of those walk buttons is pressed, our scripts start blending a walking animation loop into the legs of the character. Personally I see this as a viable alternative to cumbersome VR treadmill controllers.
This was not trivial to implement even with Unity’s Mecanim (using only animation layers was not enough), because at run time our script dynamically creates two copies of the skeleton rig (one Kinect-animated, one walk-cycle-animated) and blends them into a third rig on every frame update.
Now that we have created this functionality, anyone can just download RUIS for Unity and use the MecanimBlendedCharacter prefab (see the OculusRiftExample or KinectTwoPlayers scenes). You can even replace the default Constructor guy model with your own human model, although some joint position tuning is most likely required if you want the Kinect control to feel just right.
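As a rough illustration of that per-frame blending, here is a minimal sketch under assumed bone arrays and blend weights, not the actual RUIS implementation:

```csharp
using UnityEngine;

// Sketch of blending two source rigs into a third, visible rig each frame.
// kinectBones/walkBones/targetBones are assumed to be matching arrays over
// the same skeleton hierarchy; legWeight rises toward 1 while a walk button
// is held, so the legs follow the walk cycle while the upper body stays
// Kinect-driven.
public class RigBlender : MonoBehaviour
{
    public Transform[] kinectBones;  // copy of the rig driven by Kinect
    public Transform[] walkBones;    // copy of the rig driven by the walk clip
    public Transform[] targetBones;  // the rig that is actually rendered
    public bool[] isLegBone;         // which indices belong to the legs
    public float legWeight;          // 0 = full Kinect, 1 = full walk animation

    void LateUpdate()
    {
        for (int i = 0; i < targetBones.Length; i++)
        {
            float w = isLegBone[i] ? legWeight : 0f; // upper body stays on Kinect
            targetBones[i].localRotation = Quaternion.Slerp(
                kinectBones[i].localRotation, walkBones[i].localRotation, w);
            targetBones[i].localPosition = Vector3.Lerp(
                kinectBones[i].localPosition, walkBones[i].localPosition, w);
        }
    }
}
```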
How can you detect the walking gesture in Unity3D with this package?
Currently there is no walking gesture in RUIS. I think you’re referring to this part of the video:
https://www.youtube.com/watch?v=-YYiTkf3sDs&t=0m50s
There I’m using an Xbox 360 controller to make the character walk (in other parts I use PS Navigation controller and Razer Hydra). I was just moving my arms for show 🙂
You could try creating your own walking gesture algorithm that tracks the Kinect hand positions.
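For example, a very simple detector (entirely hypothetical, not part of RUIS) could watch for alternating vertical hand swings:

```csharp
using UnityEngine;

// Hypothetical arm-swing "walking gesture" detector, sketched for
// illustration. Feed it the Kinect-tracked hand positions each frame; if the
// hands keep moving in opposite vertical directions fast enough, we treat
// that as walking.
public class ArmSwingWalkDetector : MonoBehaviour
{
    public float swingSpeedThreshold = 0.5f; // m/s, tune to taste

    float prevLeftY, prevRightY;
    public bool IsWalking { get; private set; }

    // Call once per frame with the Kinect hand positions in world space.
    public void UpdateHands(Vector3 leftHand, Vector3 rightHand)
    {
        float leftVel = (leftHand.y - prevLeftY) / Time.deltaTime;
        float rightVel = (rightHand.y - prevRightY) / Time.deltaTime;

        // Opposite-direction swings above the threshold count as walking.
        IsWalking = leftVel * rightVel < 0f
                    && Mathf.Abs(leftVel) > swingSpeedThreshold
                    && Mathf.Abs(rightVel) > swingSpeedThreshold;

        prevLeftY = leftHand.y;
        prevRightY = rightHand.y;
    }
}
```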