
Thursday, October 1, 2015

Unity 5.2.1: Three quick tips for improving the VR user experience

For demos that I work on, there are three simple things that I always do to create a better user experience:
  • make sure the Oculus health and safety warning (HSW) has been dismissed before any other information is displayed
  • give the user the option to re-center their avatar after they’ve put the headset on and settled in a comfortable position
  • encourage the user to create a profile

If you are a regular reader of this blog, you know this isn't the first time I've covered these topics. However, now that Unity includes native VR support, I wanted to revisit these tips and show how to do them with Unity 5.2.1, the 0.7 runtime, and the Oculus Utilities for Unity 0.1-beta package.

Knowing when the HSW has been dismissed

The HSW is a big rectangle centered in front of the user. While it is semi-transparent, it significantly blocks the user’s view. For that reason, I like to make sure that it has been dismissed before displaying anything the user needs to interact with.

The Oculus Utilities for Unity 0.1-beta package provides a way to check to see if the HSW is still displayed:

OVRManager.isHSWDisplayed

This returns true while the HSW is displayed and false once it has been dismissed. Note that if you are running the application in the Editor, you won't see the HSW in the Editor window; it will, however, appear in the Rift view.
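
One straightforward way to use this is to wait until the HSW is gone before showing your opening UI. Here is a minimal sketch of that approach (ShowMenuAfterHSW and startMenu are hypothetical names; only OVRManager.isHSWDisplayed comes from the Oculus Utilities package):

using System.Collections;
using UnityEngine;

public class ShowMenuAfterHSW : MonoBehaviour
{
    public GameObject startMenu; // hypothetical: whatever UI the user sees first

    IEnumerator Start()
    {
        // Keep the opening UI hidden while the health and safety warning is up.
        startMenu.SetActive(false);
        while (OVRManager.isHSWDisplayed)
        {
            yield return null;
        }
        startMenu.SetActive(true);
    }
}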

Re-centering the avatar

It is helpful to give users the option to re-center their virtual selves, and this is one of the functions available as part of Unity's native VR implementation.

To re-center, add the VR namespace and use:
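
InputTracking.Recenter()

A minimal sketch of wiring that up to a key press (the choice of the R key here is just an example; the Recenter() call itself is part of the UnityEngine.VR namespace in Unity 5.2):

using UnityEngine;
using UnityEngine.VR;

public class RecenterTracking : MonoBehaviour
{
    void Update()
    {
        // Reset the tracking origin to the user's current head position and orientation.
        if (Input.GetKeyDown(KeyCode.R))
        {
            InputTracking.Recenter();
        }
    }
}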


Encouraging the user to create a profile

If the user has not created a profile, the user may experience discomfort because the default IPD or height is different from their own. In previous versions, you could get the name of the profile in use, and if the name returned was "default," you knew that the user had not created a profile. Unfortunately, I’m not seeing a way to do that with the Oculus Utilities for Unity 0.1-beta package. I tried using OVRManager.profile.username, but it returns "Oculus User" and not the name from the user profile. Looking at OVRProfile, it appears that only ipd, eyeHeight, and eyeDepth are retrieved from the current user profile. (Note: you can get the user's eye height and IPD using OVRManager.profile.eyeHeight and OVRManager.profile.ipd.)
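
For reference, here is a minimal sketch of reading those values at runtime (it assumes an OVRManager instance is present in the scene):

using UnityEngine;

public class LogProfileValues : MonoBehaviour
{
    void Start()
    {
        // Values the Oculus Utilities package pulls from the current user profile.
        Debug.Log("IPD: " + OVRManager.profile.ipd);
        Debug.Log("Eye height: " + OVRManager.profile.eyeHeight);
        // In 0.1-beta this returns "Oculus User" rather than the profile name.
        Debug.Log("Username: " + OVRManager.profile.username);
    }
}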

For now, I’ll just stress the importance of creating a profile in the documentation. If anyone knows how to get the current user name, please comment!

Friday, September 11, 2015

Quick Look: Using Unity 5.2 native VR support to create a scene for the Rift

Unity 5.2 came out before I had time to write a 5.1 native VR post. With the book out,  I can now get back to writing blog posts. Here's what happened when I took a look at using Unity 5.2 native VR support to create scenes for the Rift.

I downloaded Unity 5.2 and gave it a quick trial run with the DK2, the Oculus Runtime 0.7.0-beta, and the Oculus Utilities for Unity 5 0.1.0-beta package. As you might have guessed from the runtime version, this test was Windows only. To test it all out, I built a quick sample scene using assets found in the Unity standard asset packages, enabled native VR to get the scene on the Rift, added a player character, and then did a test build and run.

Enabling VR and getting the scene on the Rift

I used the same sample scene I’ve used in all my previous tests, a simple beach scene with a few palm trees created using assets from the Unity standard asset Environment package along with a single camera located at 0, 1, 0.


To run the scene on the Rift, I selected Edit > Project Settings > Player > Other Settings and made sure that Virtual Reality Supported was checked.


That done, with the Rift plugged in, I pressed Play in the Editor and was able to see the scene both in the Editor window and on the Rift (with the 0.7 runtime and Unity 5.2, direct mode works in the Editor). As I moved the headset around, I could see that head tracking was enabled, and with the Rift on, I could look anywhere I wanted in the scene. Two differences to note between the Editor view and the Rift view: first, the Editor preview is monoscopic and does not include lens correction; second, the Oculus Health and Safety Warning is visible in the Rift view but not in the Editor window.

With only a camera in the scene and no scripts for navigation, I couldn’t move anywhere. So, my next step was to add a player character.

Adding a player character to navigate the scene

The first option I looked at for quickly adding a player character was the first-person character prefab from the Unity standard assets Characters package. After downloading this package, I simply dragged the FPSController prefab onto the scene and pressed Play. This worked, but even though I've been using VR for a while and have my "VR Legs," I still found the default speeds for this controller fairly uncomfortable.

After a break, I took a look at a second option: using the first-person player character prefab (OVRPlayerController) found in the  Oculus Utilities for Unity package. This package contains a variety of scripts and utilities that can be used to improve the VR experience (which I’ll be looking at in future posts) along with the first-person player character prefab.

The OVRPlayerController prefab from the Oculus Utilities for Unity package is a basic character controller that uses A, S, D, W controls.  The default speeds were slower than the first option I tried and, for me, were much more comfortable to use. In addition, like versions of this prefab from the older integration package, it includes a menu of diagnostic information you can view by pressing the space bar. 

Using this menu, I saw that I was easily getting 75 FPS for my sample scene. A good start. One issue I noticed, though, was that while running the scene with the player prefab from the Oculus Utilities for Unity package, the console would sometimes be spammed with the following error:

Quaternion To Matrix conversion failed because input Quaternion is invalid {0.000000, 0.000000, 0.000000, 0.000000} l=0.000000
UnityEngine.Matrix4x4:TRS(Vector3, Quaternion, Vector3)
OVRTrackerBounds:Update() (at Assets/OVR/Scripts/Util/OVRTrackerBounds.cs:151)

From what I've seen on the forums, this does not appear to be a serious issue, and word is it should be fixed in an upcoming release.

Now that I had a VR scene that I could navigate, the next step was to do a test build and run.

Building and running the test application

In previous versions, building a VR application resulted in two executables: <application>_DirectToRift.exe and <application>.exe.  Now there is just one: <application>.exe.

I first tried selecting File > Build Settings > Build and Run, but that didn't work: the application failed to run. I then built the application and ran the executable directly, and that worked fine.

Wednesday, August 19, 2015

Code Liberation Foundation: Unity-3D UI with Oculus Rift (Lecture + Workshop) - Sept 5

I am excited to announce that I'll be teaching a class on Unity-3D UI and the Oculus Rift on September 5th for the Code Liberation Foundation.

The Code Liberation Foundation offers free/low cost development workshops in order to facilitate the creation of video game titles by women.  Code Liberation events are trans-inclusive and women-only. Women of all skill levels and walks of life are invited to attend.

If you are interested in VR, identify as a woman, and are in the NY area, I'd love to see you there. Space is limited, so sign up today!

Wednesday, July 29, 2015

Working with the Rift is changing how I dream

The first video-game-influenced dream I remember having was back in the late eighties. I was obsessed with Tetris and had very vivid dreams of Tetris blocks falling on me. So, it isn’t surprising to me that playing video games can change the way you dream. That said, the specific effect that working with the Rift has had on my dreams did surprise me.

When using the Rift, you are sitting in place and the world moves around or past you, unlike real life, where you move through the world. I find now, no matter what I am dreaming about, that there are two kinds of movement in my dreams: movement where I dream I am moving through a world, and movement where I am still and the world moves around or past me. Perhaps this kind of dreaming is an attempt by my brain to make Rift movement feel more natural to me? Anyone else dreaming like this?

Tuesday, June 23, 2015

Unity + Leap: Explicit instruction and hand gestures

I have been experimenting a bit with the LEAP and looking at getting objects into the user’s hand. In one of my experiments*, the user holds their hand out flat and a fairy appears. This experiment used explicit instruction written on menus to tell the user what to do.




Explicit instruction worked in that my test users did what I wanted them to - nod and hold their hand out flat. The downside, of course, is that it required them to read instructions which isn’t very immersive or fun. In future experiments, I want to look at implicit instruction, such as having non-player characters perform actions first.

* This demo is now available from the  Leap Motion Gallery.

Notes on getting an object to appear on a user’s hand

Some quick notes on getting the fairy to appear on the user’s hand:

You can find all of the hand models in a scene using:

HandModel[] userHands = handcontroller.GetAllPhysicsHands();

To know if the hand is palm up, you can get the normal vector projecting from the hand relative to the controller, using:

userHands[0].GetPalmNormal()

To know if the hand is open or closed, you can look at the hand’s grab strength. The strength is zero for an open hand, and blends to 1.0 when a grabbing hand pose is recognized.

userHands[0].GetLeapHand().GrabStrength

To know where to place the object, you can get the palm position (relative to the controller), using:

userHands[0].GetPalmPosition()
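
Putting these pieces together, here is a minimal sketch of how the fairy placement might be wired up (FairyOnPalm, fairyPrefab, and the 0.8 and 0.1 thresholds are hypothetical names and values of my own; the Leap calls are the ones listed above):

using UnityEngine;

public class FairyOnPalm : MonoBehaviour
{
    public HandController handController; // the scene's Leap HandController
    public GameObject fairyPrefab;        // hypothetical: the object to place on the palm

    private GameObject fairy;

    void Update()
    {
        HandModel[] userHands = handController.GetAllPhysicsHands();
        if (userHands.Length == 0)
            return;

        HandModel hand = userHands[0];

        // Palm up: the palm normal points roughly along world up.
        // (Assumes the HandController sits in its default, upright orientation.)
        bool palmUp = Vector3.Dot(hand.GetPalmNormal(), Vector3.up) > 0.8f;

        // Open hand: grab strength is 0 when open and blends to 1 for a closed fist.
        bool handOpen = hand.GetLeapHand().GrabStrength < 0.1f;

        if (palmUp && handOpen)
        {
            if (fairy == null)
            {
                fairy = Instantiate(fairyPrefab);
            }
            fairy.SetActive(true);
            // Place the fairy at the palm position reported by the hand model.
            fairy.transform.position = hand.GetPalmPosition();
        }
        else if (fairy != null)
        {
            fairy.SetActive(false);
        }
    }
}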