HTC Vive setup experiences

In November I got to borrow an HTC Vive dev kit for a few days. I was responsible for setting it up at an AEC hackathon in Helsinki. Before I share my experiences in detail, here are my two suggestions to Valve:

  • Allow developers to opt out of automatic SteamVR software updates. If I have a working demo configuration, I don’t want an automatic update to break it with a new plugin version or the like.
  • Do not require Steam to run in the background when SteamVR is running. This is just basic tidiness; it was a bit annoying that every time I started SteamVR it also launched Steam.

Overall the experience, particularly running the demos, was great. I had already tried the Aperture Robot Repair demo in August, but the display quality and tracking accuracy still made me very happy. I got very positive reactions from my colleagues at Aalto University, to whom I showed the Vive for the first time.

Aalto University researchers trying out HTC Vive for the first time

Getting the dev kit to work took a while. After connecting and placing the physical hardware, SteamVR wasn’t able to properly access the hardware no matter what I tried. And there were plenty of things to try, as can be seen in the SteamVR forum. It wasn’t until I updated the firmware for the HMD and the controllers that everything started working. I had to use rather unwieldy command line tools, whereas Oculus Rift DK2 had offered a simple firmware update process through the Oculus Configuration Utility.

When all the systems showed green status in SteamVR, creating and running a test VR scene in the Unity Editor was a breeze. As a side note, I’m happy to see that Unity is integrating VR functionality directly into their engine, which eases development for different VR platforms. I’d imagine that Epic is doing the same with Unreal Engine.
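To give a flavour of that built-in support, here is a minimal sketch based on the UnityEngine.VR API available around Unity 5.1–5.3; the exact class and property names vary between Unity versions, so treat it as illustrative only:

```csharp
using UnityEngine;
using UnityEngine.VR; // Unity 5.1-5.3 namespace; later Unity versions renamed this to UnityEngine.XR

public class EnableVRExample : MonoBehaviour
{
    void Start()
    {
        // Enable stereo rendering to the HMD only if one is actually connected.
        // "Virtual Reality Supported" must also be ticked in Player Settings.
        VRSettings.enabled = VRDevice.isPresent;
    }
}
```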

Occasionally the Vive stopped working after restarting my computer, and I needed to uninstall Vive’s USB drivers and reboot to solve the problem. According to Valve this issue is related to Vive’s HMD control box. From what I understand, the vast majority of problems reported in the SteamVR forum can be attributed to that. I believe that the situation will be much better with the second Vive dev kit.

I didn’t have any problems with the tracking quality and everything ran smoothly. My only grievance is that I couldn’t install any of the cool Vive demos from Steam. Currently Valve has to separately set those privileges for each Steam account, and we couldn’t get them to do that in the short time frame that we had. Instead I resorted to googling for unofficial, third-party Vive demos, which understandably had lower production values. For some reason I needed to run each demo as administrator to get it to work.

To summarize: this first HTC Vive dev kit, its hardware and parts of its software, feels like it was hacked together by a group of scientists in a lab. What actually happened is perhaps very close to that. In contrast, Oculus DK1 and DK2 were slightly more polished, because as a pioneer Oculus had more to prove. This is not a complaint, and I’m quite happy that HTC and Valve decided to grant developers such early access. HTC Vive, particularly its Lighthouse tracking, is just so good that it’s easy to overlook the lack of refinement in this early dev kit. I hope to soon get permanent access to an HTC Vive so I can integrate it into my RUIS toolkit, enabling developers to combine room-scale Lighthouse tracking with full-body Kinect tracking.


RUIS receives praising review

An article by researchers from the University of Louisiana at Lafayette reviews RUIS along with two other virtual reality toolkits for Unity. RUIS did very well in the review, and the original version of the article that I read in August stated that

with RUIS being free and highly versatile, it is the clear winner for low budget applications.

The author changed the wording in the final article version to “promising for low budget applications”, because their adviser suggested using wording that sounds less biased. Oh well :-)

In the article RUIS reached almost the same score as MiddleVR (a professional $3,000 toolkit), which came out on top when price was not considered, as seen in the table below, adapted from the article:

                            getReal3D   MiddleVR   RUIS
Performance & reliability       2           5        4
CAVE display flexibility        2           4        3
Interaction flexibility         2           4        5
Ease of use                     4           3        3
VR applications                 2           5        4
Total                          12          21       19

In the above table each category was given 1-5 points.

In terms of “Documentation and support”, getReal3D scored 10, MiddleVR 20, and RUIS 14. Improving RUIS documentation and providing tutorials is on our to-do list.

The article mistakenly states that top and bottom CAVE displays are not supported by RUIS, but that is not the case: the display wall’s center position, normal, and up vectors just need to be configured in the RUISDisplay component, as sketched below. Please note that RUIS is mostly intended for CAVEs with a small number of displays, because each view is rendered sequentially. For faster CAVE rendering in Unity you should probably try MiddleVR or getReal3D, which offer clustered rendering.
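As an illustration, for a floor (bottom) display the vectors could be chosen roughly as follows; this is only a sketch with made-up dimensions, and the actual RUISDisplay field names may differ:

```csharp
using UnityEngine;

// Illustrative values only; the exact RUISDisplay fields may be named differently.
public class FloorDisplayExample : MonoBehaviour
{
    void Start()
    {
        // Center of a hypothetical 3 m x 3 m floor display, in the tracking coordinate system (meters)
        Vector3 displayCenter = new Vector3(0f, 0f, 1.5f);

        // The normal points from the display surface towards the viewer:
        // straight up for a floor display, straight down for a ceiling display
        Vector3 displayNormal = Vector3.up;

        // The up vector lies in the display plane; for a floor display a natural
        // choice is the direction towards the front wall of the CAVE
        Vector3 displayUp = Vector3.forward;

        Debug.LogFormat("Set these in the RUISDisplay component: center {0}, normal {1}, up {2}",
                        displayCenter, displayNormal, displayUp);
    }
}
```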

The article was published in International Journal for Innovation Education and Research (IJIR).

P.S. I participated in Burning Man 2015, where I demoed our Vertigo application at VR Camp. They had a dozen computers with Oculus Rift DK2s and an HTC Vive. Here is a photo of me trying out Tilt Brush on the HTC Vive.



Experiences from FIVR meeting

Yesterday I took part in a FIVR (Finland VR) meeting and got the chance to try out the HTC Vive and other VR technology that I hadn’t tried before. The FIVR group started meeting in autumn 2014, and within one year the number of participants grew from four people to 40. This is mostly thanks to the Finnish game company Mindfield Games, who have been very active in organizing the events.

The meeting and our demo were featured in TV news and an online article by the Finnish public broadcasting company YLE.

Valve’s Portal VR demo for HTC Vive is the best VR demo that I’ve seen (disclaimer: I haven’t tried Oculus Rift CV1 yet). While interaction-wise it’s nothing special, Valve’s high production values have resulted in an audio-visually beautiful piece set in the familiar Aperture Science environment that is loved by many. A big part of the experience is the HTC Vive hardware, which performs exceptionally well; the 110-degree, 2160×1200@90 Hz HMD takes immersion one step further, and I have yet to see such precise and low-latency 6DOF tracking, all working at room scale.

Activity Core’s “Sensored Swissball” was a surprisingly fun control device! It’s perfect for all kinds of, uh, riding interaction.

Virtual Boy by Nintendo! 20-year-old VR technology. Wario Land is actually a pretty good game.


RUIS for Unity 5 released!

An unofficial RUIS for Unity 5 patch has been out for a few months, and we’re finally putting out the official release. It includes a bunch of bug fixes and improvements:

  • Thickness of individual limbs can be scaled, creating the appearance that your avatar is gaining or losing weight.
  • Scaling Kinect-controlled avatars now works properly, also when combined with Oculus Rift DK2 head tracking. The video above shows an example of this.
  • Kinect 2 wrist rotation tracking can be disabled (the tracking is rather unstable and can be distracting).
  • Etc.

Get the RUIS for Unity toolkit at http://blog.ruisystem.net/download/

We don’t have an HTC Vive yet, but hope to get one in the future. We are still working on a Kinect 2 + Oculus Rift demo, and after that we’re planning to modify the RUIS code architecture so that it will be easier to add support for upcoming VR input devices (e.g. Oculus Touch, Sixense STEM).


Using RUIS in Unity 5


An upcoming Kinect 2 + Oculus Rift DK2 demo created with RUIS and Unity 5 (no release date yet).

Unity 5 allows the use of Kinect 2 and Razer Hydra without a Pro license, so it makes sense to update your RUIS project to Unity 5. In June we will put out an official RUIS for Unity 5 release. Meanwhile you can use the guide below to upgrade RUIS to work in Unity 5.

How to update RUIS 1.07 to Unity 5
Download the RUISunity1071_Unity5.unitypackage from here:
https://drive.google.com/file/d/0B0dcx4DSNNn0bTR2Tm5RSkZhRmM/view?usp=sharing

Update instructions
1. Create backup of your project.
2. If you have modified any RUIS scripts, prefabs, or scenes, you need to create duplicates of them because RUIS files with the original names will be overwritten by the RUIS update.
3. Install Unity 5. If you are a Windows user and intend to use Kinect 1, you should install the 32-bit Editor (Additional Downloads, For Windows link):
http://unity3d.com/get-unity/download?ref=personal
4. Open your project with Unity 5, and upgrade the project when Unity asks to do so.
5. When “API Update Required” appears, choose the option “I Made a Backup, Go Ahead!”.
6. The project should now be open. Ignore any errors in the Console; open the RUISunity1071_Unity5.unitypackage file in Explorer/Finder and import everything.
7. After importing, delete the following files and bundles in \Assets\Plugins: KinectForUnity.dll, libOculusPlugin.so, OculusPlugin.dll, sixense.dll, sixense.bundle. Do NOT touch any files in the Android, Metro, OculusPlugin.bundle, x86, and x86_64 subfolders.
8. Everything should work now, unless some of your own scripts or assets are broken.

P.S. If anyone knows a math library for C# that supports (or can be easily modified to support) in-place matrix operations for multiplication, addition, inversion, and transposition of 4-by-4 matrices, let me know! The current matrix library that we are using (and Unity’s Matrix4x4) allocates new memory on every matrix operation, which causes very frequent garbage collection and results in a noticeable performance loss when Kinect rotation smoothing is enabled in RUIS.
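To make the requirement concrete, below is a hypothetical sketch (not from any existing library) of the kind of allocation-free operation I’m after: a 4-by-4 multiplication that writes into a caller-provided buffer instead of returning a newly allocated matrix. Addition, transposition, and inversion would follow the same pattern.

```csharp
// Hypothetical helper illustrating an allocation-free 4x4 matrix multiply.
// Matrices are stored as float[16] in row-major order; the caller owns and
// reuses the result buffer, so no garbage is generated per call.
// Note: result must not alias a or b.
public static class MatrixOps
{
    public static void MultiplyInPlace(float[] a, float[] b, float[] result)
    {
        for (int row = 0; row < 4; row++)
        {
            for (int col = 0; col < 4; col++)
            {
                float sum = 0f;
                for (int k = 0; k < 4; k++)
                    sum += a[row * 4 + k] * b[k * 4 + col];
                result[row * 4 + col] = sum;
            }
        }
    }
}
```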


RUIS 1.07 released! Fist gesture and more

Aalto University’s virtual reality course started earlier this month, and to get things rolling we have just released the latest version of RUIS for Unity. RUIS 1.07 adds new features and fixes many issues from our previous release, which we admittedly rushed out to be in time for the Spatial User Interaction 2014 conference in October. Among other things, the bug fixes of RUIS 1.07 restore positional head tracking functionality for Oculus Rift DK1 using Kinect 1/2, PS Move, and Razer Hydra.

For Kinect 2 we have added avatar joint filtering and a fist gesture that can be used to grab and manipulate objects. Developers with Oculus Rift but without Kinect get to choose whether the Rift’s orientation rotates only the avatar head, the whole body, or the walk forward direction.

Kinect 2 tracked player grabs the hammer with a fist gesture.
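For the curious, the idea behind the fist grab can be sketched roughly as follows; this is not the actual RUIS implementation, just an illustration built on the hand state that the Kinect for Windows SDK 2.0 reports for each tracked body:

```csharp
using UnityEngine;
using Kinect = Windows.Kinect; // Kinect for Windows SDK 2.0 Unity plugin

// Illustrative grab logic, not the actual RUIS implementation: when the tracked
// hand closes into a fist near an object, parent the object to the hand.
public class FistGrabExample : MonoBehaviour
{
    public Transform rightHand;        // avatar's hand, driven by Kinect 2 joint data
    public float grabRadius = 0.15f;   // meters

    private Transform grabbedObject;

    public void UpdateGrab(Kinect.Body body)
    {
        bool fistClosed = body.HandRightState == Kinect.HandState.Closed;

        if (fistClosed && grabbedObject == null)
        {
            // Grab the first collider within reach of the hand
            Collider[] nearby = Physics.OverlapSphere(rightHand.position, grabRadius);
            if (nearby.Length > 0)
            {
                grabbedObject = nearby[0].transform;
                grabbedObject.SetParent(rightHand, true);
            }
        }
        else if (!fistClosed && grabbedObject != null)
        {
            // Release when the fist opens
            grabbedObject.SetParent(null, true);
            grabbedObject = null;
        }
    }
}
```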

Unity Free users can now use Oculus Rift, but they need to import OculusUnityIntegration.unitypackage and overwrite the existing files. Due to the immature state of the Oculus Unity integration, there are also other considerations that you can check in the “Known issues” section of our readme. These issues, including “judder” in the Unity Editor, should be alleviated as new versions of the Oculus SDK are released.

We’ve performed appropriate testing this time, and RUIS for Unity 1.07 holds together well and is shaping up nicely. There is still jaggedness in the motion of Kinect-controlled avatars and motion controllers, which seems to be related to Unity’s frame update and the irregular device refresh rate. We hope to fix that in the next release.


Photos from 2014 Virtual Reality Course

Below you see VR applications created by my students with our RUIS for Unity toolkit. Most of the applications featured Oculus Rift DK1, Kinect, and PlayStation Move controllers.

This year and in 2015, the virtual reality course is organized at Aalto University under the name Experimental User Interfaces. A new course starts in January 2015.

Wheelchair Hero

An empowering first-person game using Oculus Rift, where the player controls a wheelchair by spinning wheels that have PS Move controllers attached to them.

Flying Game

A two-player co-operative game, where the player with Oculus Rift and a rifle sneaks around a city, using a laser to light up targets that can be destroyed by the second player, who pilots an attack helicopter.

Virtual Curling

A two-player co-operative curling simulator where one player is a curler who “throws” stones, and the other player acts as a sweeper, affecting the trajectory of the stones as they slide on the ice.

Lazerzilla

The player’s avatar is a giant cyber-lizard, who uses his claws and “laser breath” to destroy skyscrapers while fighting human soldiers, tanks, and helicopters.

COVRSCPG

A co-operative two-player game in the spirit of Super Monkey Ball and Marble Madness; one player controls a size-varying ball from a first-person Oculus Rift view, while the other player uses a god-view to help him advance through obstacle courses.

Runner

Two players compete to see who can travel farther along a snowy path filled with dangers.

Everything you see above was created by students who had little or no prior experience in creating VR applications. Five out of six applications featured two different display systems: Oculus Rift and two stereo 3D screens (for the audience and/or the second player).

In the course the students were free to create any kind of application, and for some reason everyone chose to develop games :-)


Oculus Rift DK2 and Kinect 2 support added!

Head over to the download section to get the latest RUIS for Unity version!


We have also added a process to calibrate the transformation matrix between several different sensor pairs (see the above image). This makes it possible to use Kinect 1, Kinect 2 (Windows 8 only), Oculus DK2, and PS Move in the same coordinate system even if the individual sensors are some distance apart or oriented in different directions (the sensors’ view frustums need to partially overlap, though). In other words, if you have calibrated Oculus Rift DK2 and Kinect 2, the Kinect 2 avatar’s head and body are correctly aligned with the head-tracked position of Oculus Rift DK2 when you are using RUIS prefabs, and you will see your whole body in virtual reality!
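In practice this means that tracking data from each device is multiplied by its calibrated 4×4 transformation before use. As a rough sketch (not the actual RUIS code), mapping a Kinect 2 joint position into the master coordinate system could look like this:

```csharp
using UnityEngine;

// Illustration only: RUIS stores a calibrated transformation per device pair;
// here a Kinect 2 joint position is mapped into the master coordinate system.
public class CalibrationExample
{
    // 4x4 transformation obtained from the calibration process (assumed given)
    public Matrix4x4 kinect2ToMaster = Matrix4x4.identity;

    public Vector3 ToMasterCoordinates(Vector3 kinectJointPosition)
    {
        // MultiplyPoint3x4 applies the rotation and translation of the calibration matrix
        return kinect2ToMaster.MultiplyPoint3x4(kinectJointPosition);
    }
}
```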

There are still some issues that will be fixed for the next RUIS release. For example, Kinect 2 joint data is not smoothed yet, and the joints have a noticeable amount of jitter. The “Known issues” section in RUIS’ readme file lists a few more rough edges.
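The kind of smoothing we have in mind is simple low-pass filtering of the joint positions; as a minimal sketch (not necessarily the filter that will ship in the next release), an exponential moving average already takes the worst edge off the jitter:

```csharp
using UnityEngine;

// Minimal exponential smoothing of a joint position; just an illustration of the idea.
public class JointSmoothingExample
{
    private Vector3 smoothed;
    private bool initialized;

    // smoothingFactor in (0, 1]: smaller values smooth more but add latency
    public Vector3 Smooth(Vector3 rawJointPosition, float smoothingFactor = 0.3f)
    {
        if (!initialized)
        {
            smoothed = rawJointPosition;
            initialized = true;
        }
        else
        {
            smoothed = Vector3.Lerp(smoothed, rawJointPosition, smoothingFactor);
        }
        return smoothed;
    }
}
```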



Oculus Rift Developer Kit 2 has arrived!

We just received our Oculus Rift Development Kit 2 (DK2) head-mounted display and are thrilled to report our experiences with it.

Rift DK2 in front left, its position tracking camera in front center, and two Kinect 2 sensors behind them.

The DK2 comes bundled with an infrared webcam that tracks the Rift’s position (and most likely helps to correct yaw drift in orientation as well). My first question upon unboxing the DK2 was “Where the infrared LEDs at??”

So I pointed Kinect 2’s infrared camera at it and took the picture below:


The LEDs appear as overexposed white blobs in the infrared image.

It seems that the LEDs sit beneath the Rift’s exterior, which is made of (special?) plastic that lets through the IR spectrum but absorbs visible light, hiding the nasty insides.

DK2 demo experiences

After solving the LED mystery, we tried the following demos:

The Oculus demo scene is best for checking out the tracking and image quality, as the scene is peaceful and its 3D objects are simple and elegant. Cyber Space is a virtual amusement park ride for those of us who want to explore our cyber-sickness limits, Horse World Online is only for the most hard-core horsie fans, and Chilling Space has a calm atmosphere (though we didn’t notice how positional tracking was employed).

DK2 has been out for a relatively short time, and I’m not aware of any killer apps for it yet. Personally I’m looking forward to the DK2 version of the Senza Peso opera.

In many ways Oculus Rift DK2 is superior to DK1: head position tracking is responsive and accurate, which is integral to immersion and to minimizing nausea. While the screen door (pixel border) effect is still noticeable, it’s a minor nuisance as there are major improvements in other areas. DK2’s resolution is higher, its OLED display produces a better color range, and its image is crisp because the motion blur caused by slow pixel response times has been reduced (except for blacks). The tracking latency is low, as it should be, and the low persistence technique really seems to do the trick, considerably reducing cyber-sickness.


Meant for each other?

That’s it for now, we’ll get back to combining DK2 with Kinect 2! It’s wonderful stuff, keep your eyes on us!


VR2014 conference highlights: Tactical Haptics and more

I participated in the IEEE Virtual Reality 2014 conference, which was held March 29th – April 2nd in Minneapolis. Eager beavers can jump straight to the link below to see a list of the best papers and demos at the conference:
http://ieeevr.org/2014/awards.html

VR works better with drugs (pain relief). Timothy Leary approves.

The biggest VR2014 highlight for me was trying out Tactical Haptics‘ Reactive Grip prototype:

The sense of touch is one of the major senses and perhaps the most challenging one to provide with convincing virtual sensations. Currently, haptic feedback is missing from most virtual reality applications. Reactive Grip could change that for many of them: it is cheap, simple haptic technology that could be integrated into any number of modern game controller variations. The Reactive Grip handle utilizes four sliding contactor plates whose movement conveys a sense of inertia from the virtual object. Examples include gun recoil (kickback), the struggle of a fish caught on a fishing rod, or the hit of a sword against another virtual object. I tried a bunch of demos that included those examples. Tactical Haptics has close ties with Sixense, and in the prototype motion tracking is handled via Razer Hydra controllers.

Reactive Grip has its limitations: because it is a game controller, Tactical Haptics’ device can only approximate sensations from rigid, hand-held objects such as virtual gun grips, fishing rods, steering wheels, and other tools. For most games and applications this should be enough, though. Reactive Grip is a mechanical device, and I wonder how robust it can be; traditionally, haptic devices break easily.

The funny thing is that if you close your eyes when using the controllers, the haptic feedback alone doesn’t convey what you are doing in the virtual world, due to the vagueness and low fidelity of the haptic effect. But when combined with audiovisual cues, the different perceptions merge together gracefully, providing more immersion than any of the cues alone. Most importantly, the haptic feedback doesn’t contradict the audiovisual cues but rather supports them.

Tactical Haptics ran a Kickstarter campaign last autumn that unfortunately didn’t reach its goal. People really need to try out this controller to see its potential. Tactical Haptics’ invention is something that could, for the first time, bring haptic feedback to the masses, especially if one of the major console manufacturers were to adopt it.

The acquisition of Oculus VR by Facebook was a big news topic throughout the conference. As such, it was a pity that we didn’t get to see the Crystal Cove or DK2 prototypes of the Rift. Vicon was in talks with Oculus VR to bring DK2 to the conference, but at the time Oculus canceled public demonstrations of DK2 due to the Facebook buyout. That’s what I heard, anyway. Palmer Luckey was also supposed to participate in the conference, but apparently the Facebook acquisition and the related death threats to Oculus staff got in the way.

Several times I witnessed Oculus’ HMD referred to as Facebook Rift and FaceRift. Perhaps there was slight bitterness in the air regarding the 2 billion dollar buyout? This is understandable, as traditionally VR hasn’t been a very lucrative business, and suddenly seasoned VR researchers and practitioners saw a VR company go from zero to hero in less than two years.

I talked to a person who had tried Sony’s Morpheus, DK2, and Valve’s prototype. In his opinion DK2 and Morpheus were very close to each other performance-wise. He liked Valve’s prototype best, though, because of its wide positional tracking, which was implemented with camera-based inside-out tracking of fiducial markers. With Michael Abrash joining Oculus, hopefully the good features of Valve’s prototype will find their way into future Oculus HMDs.

The University of Minnesota presented a bunch of their VR-related projects to the conference audience. The most interesting one was a high-resolution, wide-FOV HMD built from an iPad mini and a 3D-printed frame. In their demo up to 6-8 people wore the HMDs, inhabiting the same virtual place simultaneously, while being tracked over a large area using a commercial optical tracker.

The HMD utilized high-quality glass optics (~$40 apiece) to spread the iPad mini’s 2048-by-1536 resolution over a FOV similar to the Oculus Rift’s. Needless to say, the image was much crisper than with the Rift, whereas the iPad’s orientation tracking was slightly less responsive than the Rift’s. Overall, I was very impressed with this HMD!

No virtual prostate examinations this year, but at least we got to probe some chest cavities.

After the conference it was time to get back to basics.

P.S. I also visited the Kinect 2 Developer Preview Program Breakfast that was co-organized with Microsoft’s Build conference in San Francisco. Microsoft hopes to start selling Kinect 2 for Windows in the summer, and we developers with the preview version should get a Kinect 2 Unity plugin even before that.
