Thoughts on Microsoft HoloLens

In May I had the opportunity to try out Microsoft HoloLens. Its inside-out positional tracking is phenomenal and felt very robust. As widely reported online, its field of view is very limited, and that is the single biggest obstacle to usability and immersion. As a self-contained wearable display device, HoloLens is a great “development kit” for augmented reality developers to start experimenting with the technology. I believe that it will be useful in a number of real-world cases, despite the narrow field of view.

I was surprised that the interactive cursor was locked to the center of the display and could be moved only by rotating my head. I was expecting to be able to relocate the cursor by moving my hand in front of the device, since hand gestures are also used for “clicking” and bringing up the menu. HoloLens also comes with a wireless clicker peripheral that can be used instead of the gestures. That would be my preferred way to interact, as the clicker is more robust and ergonomic. Perhaps the “locked” cursor is a good idea after all, for those same reasons.
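For developers curious about how such a head-locked cursor works: HoloLens apps are typically built in Unity, and a gaze cursor can be approximated by raycasting from the head camera every frame. The script below is only a minimal sketch of that idea (the cursor object and field names are my own, not anything from the HoloLens SDK):

```csharp
using UnityEngine;

// Minimal gaze-cursor sketch: the cursor sits wherever the ray from the
// center of the user's view hits scene geometry, so it can only be moved
// by rotating the head -- like the HoloLens cursor described above.
public class GazeCursor : MonoBehaviour
{
    public Transform head;          // head/camera transform (e.g. Camera.main.transform)
    public Transform cursor;        // a small quad or ring used as the cursor visual
    public float maxDistance = 10f; // fall back to this distance when nothing is hit

    void LateUpdate()
    {
        RaycastHit hit;
        if (Physics.Raycast(head.position, head.forward, out hit, maxDistance))
        {
            // Snap the cursor onto the surface and align it with the surface normal.
            cursor.position = hit.point;
            cursor.rotation = Quaternion.LookRotation(hit.normal);
        }
        else
        {
            // Nothing hit: float the cursor at a fixed distance in front of the head.
            cursor.position = head.position + head.forward * maxDistance;
            cursor.rotation = Quaternion.LookRotation(head.forward);
        }
    }
}
```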

Trying out HoloLens

Upcoming RUIS for Unity Update

A few words about the future update of RUIS for Unity: the currently distributed version 1.082 still requires Oculus Runtime 1.06, which is obsolete and does not support Oculus Rift CV1. I have created a beta version of RUIS that supports HTC Vive, which I used to add Vive support to the Vertigo experience. I have not made that version public, because it is still very much a hack. I’m waiting for Unity to release a stable version of Unity 5.4, which will ease my job by adding native Vive support and unifying the head-mounted display interface.
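As a rough illustration of what that unified interface means in practice, here is a sketch of reading the head pose through Unity's built-in UnityEngine.VR namespace, which should behave the same whether a Rift or a Vive is loaded. This assumes Unity 5.4's API; exact property names may differ between versions:

```csharp
using UnityEngine;
using UnityEngine.VR;

// Sketch of device-agnostic head tracking through Unity's native VR interface.
public class NativeHmdPose : MonoBehaviour
{
    void Start()
    {
        // loadedDeviceName reports which VR SDK is active (e.g. Oculus or OpenVR).
        Debug.Log("VR enabled: " + VRSettings.enabled +
                  ", device: " + VRSettings.loadedDeviceName);
    }

    void Update()
    {
        // The same calls work regardless of which HMD is connected.
        Vector3 headPosition = InputTracking.GetLocalPosition(VRNode.Head);
        Quaternion headRotation = InputTracking.GetLocalRotation(VRNode.Head);
        transform.localPosition = headPosition;
        transform.localRotation = headRotation;
    }
}
```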

I have submitted my PhD thesis (about virtual reality) for review, but I still have a bunch of other projects I’m working on. Therefore the new RUIS version will probably come out in July or August. It’s worth the wait 🙂

Immersive Journalism

I was going through my old photos and found something that I should have blogged about years ago.

You see, I met the “Godmother of Virtual Reality” Nonny de la Peña at the IEEE VR 2013 conference, and we talked about her work. At that point I had already heard about her Guantanamo Bay detainee VR experience, where the user has to endure a stress position while hearing “interrogation noises” from the next room.

What a way to put yourself in another person’s shoes! And that is the idea of immersive journalism, a concept coined by de la Peña. You could watch the news from a small box in your room, or you could experience the news in first person with the aid of virtual reality.

I liked Nonny’s ideas and asked to see more of her work. She was very hospitable, and in March 2013 she gave me and my friend a tour of the USC Institute for Creative Technologies (ICT) in Los Angeles.

Hunger in LA

First Nonny showed us her production Hunger in LA. It’s an immersive journalism piece whose driving force is real audio recorded at a Los Angeles food bank. At the time of the recording there were delays in food distribution, it started to get crowded, one person had a seizure, and an ambulance had to be called.

Novice experiencing Hunger in LA

Nonny described how some viewers of Hunger in LA had been so touched by the experience that they cried. That didn’t happen to me or to my friend, who was a VR novice. Perhaps we were too jaded for that. But the use of real, non-acted audio was very moving, and I can see how people could have strong reactions to such authentic content.

Trying out Hunger in LA

There were all sorts of set pieces in the ICT laboratory, as seen in the background of the photo above. Apparently they have been working closely with the US Army, exploring military training applications of virtual reality.

ICT lab

We got to see different parts of the laboratory. Above is a lab desk full of prototypes at USC ICT, where Palmer Luckey worked as a lab technician on the FOV2GO head-mounted display.

DK1 in March 2013

Oculus had given DK1s to ICT before the official shipping date of March 29, 2013.

Mobile VR prototype

This was one of the mobile head-mounted display prototypes that we tried.

Light Stage at ICT

We also got a chance to see ICT’s Light Stage, which has been used to capture 3D scans of actors for several movies. Many thanks to Nonny de la Peña for giving us the tour and sharing her work!

HTC Vive setup experiences

In November I got to borrow an HTC Vive dev kit for a few days. I was responsible for setting it up at an AEC hackathon in Helsinki. Before I share my experiences in detail, here are my two suggestions to Valve:

  • Allow developers to opt out of auto-updating the SteamVR software. If I have a working demo configuration, I don’t want an automatic update to break it with a new plugin version, for example.
  • Do not require Steam to run in the background when SteamVR is running. This is just common tidiness; it was a bit annoying that every time I started SteamVR, it also launched Steam.

Overall the experience, particularly running the demos, was great. I had already tried the Aperture Robot Repair demo in August, but the display quality and tracking accuracy still made me very happy. I got very positive reactions from my colleagues at Aalto University, to whom I showed the Vive for the first time.

Aalto University researchers trying out HTC Vive for the first time

Getting the dev kit to work took a while. After connecting and placing the physical hardware, SteamVR wasn’t able to access it properly no matter what I tried. And there were plenty of things to try, as can be seen in the SteamVR forum. It wasn’t until I updated the firmware for the HMD and the controllers that everything started working. I had to use rather unwieldy command-line tools, whereas Oculus Rift DK2 had offered a simple firmware update process through the Oculus Configuration Utility.

When all the systems showed green status in SteamVR, creating and running a test VR scene in the Unity Editor was a breeze. As a side note, I’m happy to see that Unity is integrating VR functionality directly into the engine, which eases development for different VR platforms. I’d imagine that Epic is doing the same with Unreal Engine.
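To give an idea of how simple the Unity side was, my test scene boiled down to dropping in the SteamVR plugin's [CameraRig] prefab for the HMD and reading controller input with something like the snippet below. This is written from memory of that era's SteamVR plugin, so treat the exact class names as assumptions rather than a definitive reference:

```csharp
using UnityEngine;

// Attach to one of the controller objects under the SteamVR [CameraRig] prefab.
// Logs a message whenever the controller's trigger is pressed.
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class TriggerTest : MonoBehaviour
{
    private SteamVR_TrackedObject trackedObject;

    void Awake()
    {
        trackedObject = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        // Look up the input device that corresponds to this tracked controller.
        var device = SteamVR_Controller.Input((int)trackedObject.index);
        if (device.GetPressDown(SteamVR_Controller.ButtonMask.Trigger))
        {
            Debug.Log("Trigger pressed at " + transform.position);
        }
    }
}
```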

Occasionally the Vive stopped working after restarting my computer, and I needed to uninstall the Vive’s USB drivers and reboot to solve the problem. According to Valve this issue is related to the Vive’s HMD control box. From what I understand, the vast majority of problems reported on the SteamVR forum can be attributed to it. I believe that the situation will be much better with the second Vive dev kit.

I didn’t have any problems with the tracking quality, and everything ran smoothly. My only grievance is that I couldn’t install any of the cool Vive demos from Steam. Currently Valve has to set those privileges separately for each Steam account, and we couldn’t get them to do that in the short time frame that we had. Instead I resorted to googling for unofficial, third-party Vive demos, which understandably had lower production values. For some reason I needed to run each demo as administrator to get it to work.

To summarize: this first HTC Vive dev kit, its hardware and parts of its software, feels like it was hacked together by a group of scientists in a lab. What actually happened is perhaps very close to that. In contrast, the Oculus DK1 and DK2 were slightly more polished, because as a pioneer Oculus had more to prove. This is not a complaint, and I’m quite happy that HTC and Valve decided to grant developers such early access. The HTC Vive, particularly its Lighthouse tracking, is just so good that it’s easy to overlook the lack of refinement in this early dev kit. I hope to get permanent access to an HTC Vive soon, so I can integrate it into my RUIS toolkit, enabling developers to combine room-scale Lighthouse tracking with Kinect’s full-body tracking.

RUIS receives a glowing review

An article by researchers from the University of Louisiana at Lafayette reviews RUIS along with two other virtual reality toolkits for Unity. RUIS did very well in the review, and the original version of the article that I read in August stated that

with RUIS being free and highly versatile, it is the clear winner for low budget applications.

The author changed the wording in the final version of the article to “promising for low budget applications”, because their adviser suggested wording that sounds less biased. Oh well 🙂

In the article RUIS reached almost the same score as MiddleVR (a professional $3000 toolkit), which came out on top when price was not considered, as seen in the table below, adapted from the article:

                            getReal3D   MiddleVR   RUIS
Performance & reliability       2           5        4
CAVE display flexibility        2           4        3
Interaction flexibility         2           4        5
Ease of use                     4           3        3
VR applications                 2           5        4
Total                          12          21       19

In the above table each category was scored on a scale of 1–5 points.

In terms of “Documentation and support”, getReal3D scored 10, MiddleVR 20, and RUIS 14. Improving the RUIS documentation and providing tutorials is on our to-do list.

The article is slightly mistaken when it states that top and bottom CAVE displays are not supported by RUIS; this is not the case. The display wall’s center position, normal, and up vectors just need to be configured in the RUISDisplay component. Please note that RUIS is mostly intended for CAVEs with a small number of displays, because each view is rendered sequentially. For faster CAVE rendering in Unity you should probably try MiddleVR or getReal3D, which offer clustered rendering.
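For those wondering what such a wall configuration amounts to under the hood, the sketch below computes a head-tracked off-axis projection from a wall's center, normal, up vector, and size. This is not the RUIS implementation of RUISDisplay, just the standard generalized perspective projection that any CAVE wall rendering boils down to; the class and field names are my own:

```csharp
using UnityEngine;

// Minimal sketch of head-tracked off-axis projection for a single CAVE wall.
// wallCenter, wallNormal (pointing into the room, toward the viewer), wallUp,
// wallWidth, wallHeight and the eye position are all given in the same tracking space.
[RequireComponent(typeof(Camera))]
public class OffAxisWallCamera : MonoBehaviour
{
    public Vector3 wallCenter = new Vector3(0f, 1.5f, 2f);
    public Vector3 wallNormal = Vector3.back;   // a front wall faces back toward the viewer
    public Vector3 wallUp = Vector3.up;
    public float wallWidth = 3f;
    public float wallHeight = 2f;
    public Transform eye;                       // tracked head/eye position

    void LateUpdate()
    {
        Camera cam = GetComponent<Camera>();
        Vector3 n = wallNormal.normalized;
        Vector3 u = wallUp.normalized;
        Vector3 r = Vector3.Cross(n, u).normalized; // wall's right axis

        // Place the camera at the eye and face it straight at the wall.
        transform.position = eye.position;
        transform.rotation = Quaternion.LookRotation(-n, u);

        // Eye-to-wall distance along the wall normal.
        float d = Vector3.Dot(eye.position - wallCenter, n);

        // Wall edges relative to the eye, measured along the wall's right and up axes,
        // scaled down to the near clip plane.
        Vector3 toCenter = wallCenter - eye.position;
        float near = cam.nearClipPlane;
        float left   = (Vector3.Dot(toCenter, r) - wallWidth  * 0.5f) * near / d;
        float right  = (Vector3.Dot(toCenter, r) + wallWidth  * 0.5f) * near / d;
        float bottom = (Vector3.Dot(toCenter, u) - wallHeight * 0.5f) * near / d;
        float top    = (Vector3.Dot(toCenter, u) + wallHeight * 0.5f) * near / d;

        cam.projectionMatrix = PerspectiveOffCenter(left, right, bottom, top, near, cam.farClipPlane);
    }

    // Standard OpenGL-style off-center frustum matrix (as in Unity's documentation example).
    static Matrix4x4 PerspectiveOffCenter(float l, float r, float b, float t, float n, float f)
    {
        Matrix4x4 m = Matrix4x4.zero;
        m[0, 0] = 2f * n / (r - l);   m[0, 2] = (r + l) / (r - l);
        m[1, 1] = 2f * n / (t - b);   m[1, 2] = (t + b) / (t - b);
        m[2, 2] = -(f + n) / (f - n); m[2, 3] = -(2f * f * n) / (f - n);
        m[3, 2] = -1f;
        return m;
    }
}
```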

The article was published in the International Journal for Innovation Education and Research (IJIR).

P.S. I participated in Burning Man 2015, where I demoed our Vertigo application at VR Camp. They had a dozen computers with Oculus Rift DK2s and an HTC Vive. Here is a photo of me trying out Tilt Brush on the HTC Vive.


Experiences from FIVR meeting

Yesterday I took part in a FIVR (Finland VR) meeting and got the chance to try out the HTC Vive and other VR technology that I hadn’t tried before. The FIVR group started meeting in autumn 2014, and within one year the number of participants grew from four people to 40. This is mostly thanks to the Finnish game company Mindfield Games, who have been very active in organizing the events.

The meeting and our demo were featured in TV news and in an online article by the Finnish public broadcasting company YLE.

Valve’s Portal VR demo for HTC Vive is the best VR demo that I’ve seen (disclaimer: I haven’t tried Oculus Rift CV1 yet). While interaction-wise it’s nothing special, Valve’s high production values have resulted in an audio-visually beautiful piece set in the familiar Aperture Science environment that is loved by many. A big part of the experience is the HTC Vive hardware, which performs exceptionally well; the 110-degree 2160×1200@90Hz HMD takes immersion one step further, and I have yet to see such precise and low-latency 6DOF tracking elsewhere, all working at room scale.
HTC Vive

Activity Core‘s “Sensored Swissball” was a surprisingly fun control device! It’s perfect for all kinds of, uh, riding interaction.
Activity Core

Virtual Boy by Nintendo! 20-year-old VR technology. Wario Land is actually a pretty good game.
Virtual Boy

RUIS for Unity 5 released!

An unofficial RUIS for Unity 5 patch has been out for a few months, and we’re finally putting out the official release. It includes a bunch of bug fixes and improvements:

  • The thickness of individual limbs can be scaled, creating the appearance that your avatar is gaining or losing weight (see the sketch after this list).
  • Scaling Kinect-controlled avatars now works properly, also when combined with Oculus Rift DK2 head tracking. The video above shows an example of this.
  • Kinect 2 wrist rotation tracking can be disabled (the tracking is rather unstable and can be distracting).
  • Etc.
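The limb thickness scaling mentioned in the first bullet boils down to scaling a bone only on the axes perpendicular to its length. The snippet below is a simplified sketch of the idea, not the actual RUIS implementation; bone axis conventions vary between rigs:

```csharp
using UnityEngine;

// Simplified limb "thickness" sketch: scale the bone on the two axes
// perpendicular to its length, so the limb gets thicker or thinner without
// getting longer. Note that scaling a bone also scales its child bones,
// which a full implementation has to compensate for.
public class LimbThickness : MonoBehaviour
{
    public Transform limbBone;                    // e.g. an upper arm bone of the avatar
    [Range(0.5f, 2f)] public float thickness = 1f;

    void LateUpdate()
    {
        // Here the bone's local Y axis is assumed to point along the limb.
        limbBone.localScale = new Vector3(thickness, 1f, thickness);
    }
}
```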

Get the RUIS for Unity toolkit at http://blog.ruisystem.net/download/

We don’t have an HTC Vive yet, but hope to get one in the future. We are still working on a Kinect 2 + Oculus Rift demo, and after that we’re planning to modify the RUIS code architecture so that it will be easier to add support for upcoming VR input devices (e.g. Oculus Touch, Sixense STEM).

Using RUIS in Unity 5

MuscleDemo

An upcoming Kinect 2 + Oculus Rift DK2 demo created with RUIS and Unity 5 (no release date yet).

Unity 5 allows the use of Kinect 2 and Razer Hydra without a Pro license, so it makes sense to update your RUIS project to Unity 5. In June we will put out an official RUIS for Unity 5 release. Meanwhile, you can use the guide below to upgrade RUIS to work in Unity 5.

How to update RUIS 1.07 to Unity 5
Download the RUISunity1071_Unity5.unitypackage from here:
https://drive.google.com/file/d/0B0dcx4DSNNn0bTR2Tm5RSkZhRmM/view?usp=sharing

Update instructions
1. Create a backup of your project.
2. If you have modified any RUIS scripts, prefabs, or scenes, you need to create duplicates of them because RUIS files with the original names will be overwritten by the RUIS update.
3. Install Unity 5. If you are a Windows user and intend to use Kinect 1, you should install the 32-bit Editor (Additional Downloads, For Windows link):
http://unity3d.com/get-unity/download?ref=personal
4. Open your project with Unity 5, and Upgrade the project when Unity asks to do so.
5. When “API Update Required” appears, choose the option “I Made a Backup, Go Ahead!”.
6. The project should now be open. Ignore any errors in the Console, then open the RUISunity1071_Unity5.unitypackage file in Explorer/Finder and import everything.
7. After importing, delete the following files and bundles in \Assets\Plugins: KinectForUnity.dll, libOculusPlugin.so, OculusPlugin.dll, sixense.dll, sixense.bundle. Do NOT touch any files in the Android, Metro, OculusPlugin.bundle, x86, and x86_64 subfolders.
8. Everything should work now, unless some of your own scripts or assets are broken.

P.S. If anyone knows a math library for C# that supports (or can be easily modified to support) in-place matrix operations for multiplication, addition, inversion, and transposition of 4-by-4 matrices, let me know! The matrix library that we are currently using (and Unity’s Matrix4x4) allocates new memory with every matrix operation, which causes very frequent garbage collection and results in a noticeable performance loss when Kinect rotation smoothing is enabled in RUIS.
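In the absence of such a library, the kind of allocation-free routine we are after looks roughly like the sketch below: 4-by-4 matrices stored in plain preallocated float arrays, with the result written into a caller-provided buffer so that no garbage is generated per operation.

```csharp
// Allocation-free 4x4 matrix multiply: result = a * b, with all matrices stored
// as row-major float[16] arrays. The caller preallocates 'result' once and reuses
// it, so no garbage is created no matter how often this runs per frame.
// 'result' must not alias 'a' or 'b'.
public static class MatrixOps
{
    public static void Multiply(float[] a, float[] b, float[] result)
    {
        for (int row = 0; row < 4; ++row)
        {
            for (int col = 0; col < 4; ++col)
            {
                float sum = 0f;
                for (int k = 0; k < 4; ++k)
                    sum += a[row * 4 + k] * b[k * 4 + col];
                result[row * 4 + col] = sum;
            }
        }
    }
}
```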

RUIS 1.07 released! Fist gesture and more

Aalto University’s virtual reality course started earlier this month, and to get things rolling we have just released the latest version of RUIS for Unity. RUIS 1.07 adds new features and fixes many issues of our previous release, which we admittedly rushed out to be in time for the Spatial User Interaction 2014 conference in October. Among other things, the bug fixes in RUIS 1.07 restore positional head tracking for Oculus Rift DK1 using Kinect 1/2, PS Move, and Razer Hydra.

For Kinect 2 we have added avatar joint filtering and a fist gesture that can be used to grab and manipulate objects. Developers with an Oculus Rift but without a Kinect get to choose whether the Rift’s orientation rotates only the avatar’s head, the whole body, or the forward walking direction.
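For the curious, the core of a fist-based grab is not much more than checking the hand state reported by the Kinect 2 SDK. The sketch below illustrates the idea in a simplified form; it is not the actual RUIS code, and it assumes the Kinect 2 Unity plugin's Windows.Kinect namespace and a grab target found by simple proximity:

```csharp
using UnityEngine;
using Windows.Kinect;

// Simplified fist-grab sketch: while the tracked hand is closed, a nearby
// rigidbody follows the hand; when the fist opens, the object is released.
public class FistGrab : MonoBehaviour
{
    public Transform handTransform;  // avatar hand driven by Kinect joint data
    public float grabRadius = 0.15f; // how close an object must be to be grabbed
    private Rigidbody grabbed;

    // Call this once per frame with the tracked body from the Kinect body source.
    public void UpdateGrab(Body body)
    {
        bool fistClosed = body != null && body.HandRightState == HandState.Closed;

        if (fistClosed && grabbed == null)
        {
            // Grab the first rigidbody found within reach of the hand.
            foreach (Collider c in Physics.OverlapSphere(handTransform.position, grabRadius))
            {
                if (c.attachedRigidbody != null)
                {
                    grabbed = c.attachedRigidbody;
                    grabbed.isKinematic = true; // let the hand drive it directly
                    break;
                }
            }
        }
        else if (!fistClosed && grabbed != null)
        {
            grabbed.isKinematic = false;        // fist opened: drop the object
            grabbed = null;
        }

        if (grabbed != null)
            grabbed.MovePosition(handTransform.position);
    }
}
```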

Kinect 2 tracked player grabs the hammer with a fist gesture.

Unity Free users can now use the Oculus Rift, but they need to import OculusUnityIntegration.unitypackage and overwrite the existing files. Due to the immature state of the Oculus Unity integration, there are also other considerations, which you can check in the “Known issues” section of our readme. These issues, including “judder” in the Unity Editor, should be alleviated as new versions of the Oculus SDK are released.

We’ve performed proper testing this time, and RUIS for Unity 1.07 holds together well and is shaping up nicely. There is still some jaggedness in the motion of Kinect-controlled avatars and motion controllers, which seems to be related to Unity’s frame updates and the devices’ irregular refresh rates. We hope to fix that in the next release.

Photos from 2014 Virtual Reality Course

Below you see VR applications created by my students with our RUIS for Unity toolkit. Most of the applications featured Oculus Rift DK1, Kinect, and PlayStation Move controllers.

This year and in 2015 the virtual reality course is organized at Aalto University under the name Experimental User Interfaces. A new course starts in January 2015.

Wheelchair Hero

An empowering first-person game using Oculus Rift, where the player controls a wheelchair by spinning wheels that have PS Move controllers attached to them.

Flying Game

A two-player co-operative game, where the player with an Oculus Rift and a rifle sneaks around a city, using a laser to light up targets that can be destroyed by the second player, who pilots an attack helicopter.

Virtual Curling

A two-player co-operative curling simulator where one player is a curler who “throws” stones, and another player acts as a sweeper, affecting the trajectory of each stone while it slides on the ice.

Lazerzilla

The player’s avatar is a giant cyber-lizard, who uses his claws and “laser breath” to destroy skyscrapers while fighting human soldiers, tanks, and helicopters.

COVRSCPG

A co-operative two-player game in the spirit of Super Monkey Ball and Marble Madness; one player controls a size-varying ball from a first-person Oculus Rift view, while the other player uses a god-view to help them advance through obstacle courses.

Runner

Two players compete to see who can travel farther on a snowy path filled with dangers.

Everything you see above was created by students with little or no prior experience in creating VR applications. Five out of six applications featured two different display systems: Oculus Rift and two stereo 3D screens (for the audience and/or the second player).

In the course the students were free to create any kind of application, and for some reason everyone chose to develop games 🙂

Oculus Rift DK2 and Kinect 2 support added!

Head over to the download section to get the latest RUIS for Unity version!

Kinect 2 and Oculus Rift DK2 calibration

We have also added a process for calibrating the transformation matrix between several different sensor pairs (see the above image). This makes it possible to use Kinect 1, Kinect 2 (Windows 8 only), Oculus DK2, and PS Move in the same coordinate system, even if the individual sensors are some distance apart or oriented in different directions (the sensors’ view frustums need to partially overlap, though). In other words, if you have calibrated Oculus Rift DK2 and Kinect 2 and you are using RUIS prefabs, the Kinect 2 avatar’s head and body are correctly aligned with the head-tracked position of the Oculus Rift DK2, and you will see your whole body in virtual reality!
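To illustrate what the calibration result is used for: once the pairwise calibration has solved a 4-by-4 transformation from, say, Kinect 2 coordinates to the master coordinate system, every tracked point and rotation from that sensor is simply pushed through that matrix before it is used. Below is a hedged sketch with my own variable names, not RUIS internals, and it assumes the calibration matrix is a rigid transform (no scale or shear):

```csharp
using UnityEngine;

// Sketch of applying a calibration result: points and rotations reported in one
// sensor's coordinate system are mapped into the shared "master" coordinate system.
public class SensorCalibration
{
    // 4x4 transform from device space (e.g. Kinect 2) to the master space,
    // produced by the calibration procedure from paired samples.
    public Matrix4x4 deviceToMaster = Matrix4x4.identity;

    public Vector3 TransformPoint(Vector3 devicePoint)
    {
        // MultiplyPoint3x4 applies rotation and translation (no projection).
        return deviceToMaster.MultiplyPoint3x4(devicePoint);
    }

    public Quaternion TransformRotation(Quaternion deviceRotation)
    {
        // Extract the rotation part of the (rigid) calibration matrix from its
        // forward and up columns, then compose it with the device-space rotation.
        Quaternion calibrationRotation = Quaternion.LookRotation(
            deviceToMaster.GetColumn(2), deviceToMaster.GetColumn(1));
        return calibrationRotation * deviceRotation;
    }
}
```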

There are still some issues that will be fixed for the next RUIS release. For example, Kinect 2 joint data is not smoothed yet, and the joints have a noticeable amount of jitter. The “Known issues” section in RUIS’ readme file lists a few more rough edges.
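The joint smoothing we have in mind is nothing fancy; something along the lines of the exponential smoothing sketch below (illustrative only, not the filter that will eventually ship with RUIS):

```csharp
using UnityEngine;

// Simple exponential smoothing for a jittery tracked joint position.
// A larger smoothingRate follows the raw data more tightly; smaller values
// suppress more jitter at the cost of added latency.
public class JointSmoother
{
    private Vector3 smoothed;
    private bool initialized;

    public Vector3 Smooth(Vector3 rawJointPosition, float smoothingRate, float deltaTime)
    {
        if (!initialized)
        {
            smoothed = rawJointPosition;
            initialized = true;
        }
        else
        {
            // Frame-rate independent exponential moving average.
            float t = 1f - Mathf.Exp(-smoothingRate * deltaTime);
            smoothed = Vector3.Lerp(smoothed, rawJointPosition, t);
        }
        return smoothed;
    }
}
```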

