Category Archives: vr

Installation Instructions for Leap Avatar Hands for Leap Motion Orion

Installation

  1. Open your Unity project.
  2. If you have previously installed the old Leap Motion V2 skeletal tracking asset, ensure it is COMPLETELY REMOVED. This includes the various Leap Motion DLLs, etc.
  3. Ensure that the Orion version of the Leap Motion tracking software is installed on your computer: https://developer.leapmotion.com/orion
  4. Ensure the official Leap Motion Orion Unity assets package is added to your Unity project: https://developer.leapmotion.com/unity
  5. NOTE: The current version of this asset (1.21) is not compatible with the latest Leap Motion Orion Core Assets (4.1.0); please use the older 4.0.2 version of the core assets for the moment.
  6. The pending version on the asset store (1.3) REQUIRES Leap Motion Orion Core Assets 4.1.0.
  7. Download and import the Avatar Hand Controller for Leap Motion from the asset store.
  8. That’s it! The package will be installed into a LeapAvatarHands folder. You can check out a demo scene in the Scenes sub-folder.

Configuring your own avatar

Adding this solution onto a player avatar in your Unity scene is quite easy, although it does require a fair bit of configuration for the avatar.

  1. Set the update mode on the Animator component of your character to “Animate Physics”.
  2. For desk mode: Add a child gameobject to your player gameobject (GameObject->Create Empty Child). The player gameobject must be a properly rigged character with 5 fingered and 3+ jointed hands. I did all my testing with characters generated by Mixamo Fuse, and then rigged with the Mixamo auto-rigger.
  3. For VR mode: Add a child gameobject to your camera gameobject (GameObject->Create Empty Child).
  4. For desk mode: Position this child gameobject where you want the virtual Leap controller to sit; generally speaking, you will want this to be about 20-40cm out from the avatar’s belly button.
  5. For VR mode: Position this child gameobject 0.1 z-units forwards of the Camera gameobject, so it sits a little ways in front of the avatar’s face. Then ROTATE the object -90 on the x axis and 180 on the y axis, so that it is facing forwards in the same way that the Leap Motion itself would face forwards when mounted on a VR headset.
  6. Add the IKOrionLeapHandController script and the LeapProvider script to this empty gameobject.
  7. Go to the avatar’s left hand and add the RiggedHand script from the Leap Unity package to it.
  8. Set the handedness appropriately.
  9. Add the AutoRigHand component to the hand and use it to automatically set up the hand. You will still need to set finger pointing and palm facing parameters appropriate for your particular avatar, although Mixamo rigged avatars should work with the default settings.
  10. You will also need to set the finger pointing and palm facing vectors for each hand and finger (see the sketch after this list). For reference, with my Mixamo characters I had the finger pointing as 0,1,0 and the palm facing as 0,0,0. Each finger was 0,1,0 and 0,0,0, except the thumb, which was 0,1,0 and either -1,0,0 for the left hand or 1,0,0 for the right hand. However, these settings will depend on how the character is set up.
  11. Do steps 7-10 for the right hand also.
  12. Go back to the IKOrionLeapHandController gameobject and set the Avatar Left Hand and Avatar Right Hand fields to use the hands you just configured.
  13. Go to the root node of the avatar and add the IKActioner script, and tell it which gameobject you put the IKLeapController onto.

The IKActioner script forwards the inverse kinematics pass from the Animator component on the avatar up to the actual IKOrionLeapHandController.
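If you are curious how that relay works, it boils down to something like the following sketch. This is not the asset's actual source, and the message name is made up for illustration; Unity's OnAnimatorIK callback only fires on the object that owns the Animator, hence the forwarding.

```csharp
using UnityEngine;

// A rough sketch of the relay pattern, not the asset's actual source.
public class IKRelay : MonoBehaviour
{
    // Assign the gameobject holding the IK hand controller in the inspector.
    public GameObject ikControllerObject;

    // Unity calls this on the gameobject that owns the Animator,
    // once per IK-enabled layer, during the IK pass.
    void OnAnimatorIK(int layerIndex)
    {
        // "OnAvatarIK" is a hypothetical message name for illustration;
        // the real asset defines its own entry point.
        ikControllerObject.SendMessage("OnAvatarIK", layerIndex);
    }
}
```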

The IKOrionLeapHandController is an extension of the provided LeapHandController, set up to work with avatar hands and inverse kinematics.

Using rigid hands

So that your hands can interact with physical objects in game, I have included a RigidIKRoundHand_L and RigidIKRoundHand_R prefab which will work with my IK solution.
For desk mode: simply add these as children of the IKOrionLeapHandController gameobject and update the physics left hand and physics right hand references in the IKOrionLeapHandController script.
For VR mode: place these two gameobjects at the root of the scene hierarchy (straight into the scene, not as a child of another transform), then update the left physics hand and right physics hand references in the IKOrionLeapHandController script to point to them.
NOTE: The physics hands are still a little inaccurate, but they can be improved with fine adjustments to the offset and wrist offset parameters in the RigidHands and RiggedHands components. The avatar hands are limited by the range of movement of the avatar’s hands, whereas the physics hands are not, so it is difficult to get the physics hands and the real hands to sync up perfectly. The offset and palm width parameters of the RigidIKHands can be tweaked to improve accuracy, but it’s still not perfect, particularly when your arms are at full stretch.

Making your own Animator

It is important to note the two different layers I used in my example Animator: IK Pass is enabled on the base layer, and blending is set to override with a weight of 1.
Also, don’t forget to set the Animator to Animate Physics! This tweaks the order of execution, allowing the Leap data to override the animation data.
If you make your own Animator and it’s not working right, refer to the example to see how it’s meant to be done.
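As a belt-and-braces check, the update mode can also be enforced from code at startup; a minimal sketch (the component name is mine):

```csharp
using UnityEngine;

// A minimal sketch: enforce the required Animator update mode at startup.
[RequireComponent(typeof(Animator))]
public class AnimatorSetupCheck : MonoBehaviour
{
    void Awake()
    {
        // Animate Physics moves animation evaluation into the physics step,
        // so the Leap/IK data applied afterwards wins out over the animation.
        GetComponent<Animator>().updateMode = AnimatorUpdateMode.AnimatePhysics;

        // Note: the IK Pass checkbox is a per-layer setting in the Animator
        // window and cannot be toggled from script, so set that by hand.
    }
}
```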

Using Hand Position Data at Runtime

Unity has a very particular order of execution. If you attempt to reference the hand transform in the Update method, you will find that the data is incorrect: it will be based on where the currently playing character animation thinks the hand should be, rather than on the Leap Motion hand data. The Animator IK pass happens after this.
As such, if you are trying to use the hand transform data at runtime, you should probably be doing that in LateUpdate(), after the Leap data and the IK pass have been applied.
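In other words, something like this sketch (the component and field names are mine):

```csharp
using UnityEngine;

// A minimal sketch: read the hand transform only after the Leap data and
// the IK pass have been applied for the frame.
public class HandReader : MonoBehaviour
{
    public Transform avatarHand; // the hand bone driven by the IK pass

    void Update()
    {
        // Too early: the transform still reflects the playing animation,
        // not the Leap Motion hand data.
    }

    void LateUpdate()
    {
        // Safe: animation, Leap data, and the IK pass have all run.
        Debug.Log("Palm position this frame: " + avatarHand.position);
    }
}
```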

Leap Motion Avatar Hand Installation Instructions for V2 Skeletal Tracking System

Installation

  1. Open your Unity project.
  2. First check that you have installed the official Leap Motion V2 skeletal tracking asset, available free on the asset store. Optionally, you may also want the Leap Motion OVR assets, available from the Leap Motion developer website.
  3. Download and import the Avatar Hand Controller for Leap Motion from the asset store.
  4. That’s it! The package will be installed into a LeapAvatarHands folder. You can check out a demo scene in the Scenes sub-folder.
What you should see in your project assets folder if everything has been installed.

Configuring your own avatar

Adding this solution onto a player avatar in your Unity scene is quite easy, although it does require a little bit of configuration for the avatar.

  1. Set the update mode on the Animator component of your character to “Animate Physics”.
  2. Add a child gameobject to your player gameobject. The player gameobject must be a properly rigged character with 5 fingered and 3+ jointed hands. I did all my testing with characters generated by Mixamo Fuse, and then rigged with the Mixamo auto-rigger.
  3. Position this child gameobject where you want the virtual Leap controller to sit; generally speaking, you will want this to be about 10-30cm out from the avatar’s belly button.
  4. Add the IKLeapHandController script to this empty gameobject.
  5. Go to the avatar’s left hand and add the RiggedHand script from the Leap Unity package (v2.0+) to it.
  6. Go to each of the avatar’s fingers and add the RiggedFinger script to them.
  7. Set the finger type on each of the fingers.
  8. Set the bones (transforms) on each of the fingers. You should have 3: the first is the transform that represents the base of the finger, the second is the knuckle, and the third is the joint between the knuckle and fingernail.
  9. Go back to the avatar’s hand, and set each finger in the Fingers array: Thumb, Index, Middle, Ring, Pinky.
  10. You will also need to set the finger pointing and palm facing vectors for each hand and finger. For reference, with my Mixamo characters I had the finger pointing as 0,1,0 and the palm facing as 0,0,0. Each finger was 0,1,0 and 0,0,0, except the thumb, which was 0,1,0 and either -1,0,0 for the left hand or 1,0,0 for the right hand. However, these settings will depend on how the character is set up.
  11. Do steps 5-10 for the right hand also.
  12. Go back to the IKLeapHandController gameobject and set the Avatar Left Hand and Avatar Right Hand fields to use the hands you just configured.
  13. Go to the root node of the avatar and add the IKActioner script, and tell it which gameobject you put the IKLeapHandController onto.
Leap Hands – Connected to your avatar.

The IKActioner script forwards the inverse kinematics pass from the Animator component on the avatar up to the actual IKLeapHandController.

The IKLeapHandController is an extension of the provided LeapHandController, set up to work with avatar hands and inverse kinematics.

Using rigid hands

So that your hands can interact with physical objects in game, I have included a RigidIKHands prefab which will work with my IK solution. Simply add these into the physics hands slots on the IKLeapHandController component.

How to use rigid hands with the IK Leap Hand Controller.

Using Leap Avatar Hands with Oculus Rift (OLD)

Note: This set of instructions relates to the old Oculus OVR package, and does not directly work with the new Unity Native VR feature.

How it should look when you’ve got it right.

Using my Leap Avatar Hands controller with the Oculus Rift DK2 is quite easy. Leap have provided examples of how to use the Leap Motion with the Oculus Rift as part of their LeapMotion+OVR package. Using my controller works the same way as theirs.

Essentially you need to do only a few things:

  1. Add an Oculus OVR Camera Rig from the Oculus provided Unity 4 Integration package onto your avatar, just in front of their eyes.
  2. Drag the LeapHandController GameObject I provided onto the Oculus Camera Rig’s CenterEyeAnchor transform.
  3. Set its position to 0,0,0.
  4. Set its rotation to 270,180,0 (this mimics the physical orientation that the Leap Motion device will have when it’s attached to the front of your Oculus Rift headset).
  5. Set the “Is Head Mounted” flag to true in the IKLeap Hand Controller component.
  6. Move the LeapHandController object along the z-axis so that it is a little ways in front of the camera rig object. Tweak this location to suit.
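For reference, the same placement expressed as a quick script sketch (in practice you would just drag things around in the hierarchy; the field names are mine):

```csharp
using UnityEngine;

// A quick sketch of steps 2-6 done in code.
public class MountLeapOnRift : MonoBehaviour
{
    public Transform centerEyeAnchor;     // from the OVR camera rig
    public Transform leapHandController;  // the provided controller object

    void Start()
    {
        leapHandController.parent = centerEyeAnchor;
        // This rotation mimics the Leap device's physical orientation
        // when mounted on the front of the headset.
        leapHandController.localRotation = Quaternion.Euler(270f, 180f, 0f);
        // Start at the anchor, then nudge forwards along z to suit.
        leapHandController.localPosition = new Vector3(0f, 0f, 0.1f);
    }
}
```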

When you’re done it’ll look like the screenshot provided above.

Leap Avatar Hands in action with the Oculus Rift.


Failed Experimental First Person Control Scheme for VR

It’s a well-established problem at this point. First person control schemes for VR, specifically the Oculus Rift, just aren’t very good. There’s something very unsettling about being turned around without having the associated sensation in your real body. Stick yaw (rotation on the Y-axis) is particularly awful… that is, every time you move your mouse or hit the camera stick on your gamepad. Nausea town.

However, I noticed that it’s relatively comfortable to do roller coaster experiences in the Rift. It’s not too horrible even when you turn corners. The seated “on-rails” experience is relatively manageable, compared to a simulated walking experience.

This gave me a thought: can we just make walking in VR an on-rails experience too?

So I spent an afternoon coding up a system in Unity where you select a waypoint from a list, and your character automatically walks to that waypoint for you. This was extremely uncomfortable at first, as you were never quite sure what route the NavMeshAgent would pick to get to its destination.
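The core of it is just a NavMeshAgent being handed a destination; a minimal sketch (hypothetical names, using the modern UnityEngine.AI namespace):

```csharp
using UnityEngine;
using UnityEngine.AI;

// A minimal sketch: pick a destination from a list of waypoints and let
// the NavMeshAgent carry the player character there.
[RequireComponent(typeof(NavMeshAgent))]
public class WaypointWalker : MonoBehaviour
{
    public Transform[] waypoints;
    private NavMeshAgent agent;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    public void WalkTo(int index)
    {
        // The agent plans its own route, which is exactly what made the
        // first version uncomfortable: the player can't predict the path.
        agent.SetDestination(waypoints[index].position);
    }
}
```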

An automagical navigation system for first person character movement.

I then added a sparkly particle trail so you can see where the character is going to try to walk, making it more equivalent to an on-rails experience.

A sparkly breadcrumb trail of pure first person character VR joy?

This worked just great in every way except the goal… it made you sick as all hell when your character turned corners.

I experimented with the cornering speed. Tried making the character turn corners really slowly… this actually made it way worse. Fast track to chunder-town.

To make matters even worse, when you turned corners really slowly it made it easy for the character to miss the destination waypoint and start orbiting it, trying to make their way to the destination but constantly missing it. This orbiting behaviour was an incredible puke-fest.

Turning the cornering speed way up helped quite a bit. It was still very uncomfortable though – basically it just got it over and done with faster, so it was less sickening. But it was still sickening.

The lack of active input means that even though you can see the path ahead of you, your body just isn’t really 100% prepared for what happens. It would likely be better if you put the character into a vehicle for this navigation event. The human brain is relatively accepting of being moved by a vehicle. But then it’s not really a first person character control scheme, is it?

Disheartened by this complete and utter fail, I just implemented a teleportation system. Screen fades out, character is teleported to the waypoint, screen fades in. Easy peasy. Quick job.
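The whole pattern is just a short coroutine; a minimal sketch (the fade itself is a placeholder for whatever full-screen overlay you use):

```csharp
using System.Collections;
using UnityEngine;

// A minimal sketch of fade out, teleport, fade in.
public class FadeTeleporter : MonoBehaviour
{
    public Transform player;
    public float fadeTime = 0.25f;

    public void TeleportTo(Transform waypoint)
    {
        StartCoroutine(TeleportRoutine(waypoint.position));
    }

    IEnumerator TeleportRoutine(Vector3 destination)
    {
        yield return StartCoroutine(Fade(true));   // fade out
        player.position = destination;             // the jump is instantaneous
        yield return StartCoroutine(Fade(false));  // fade in
    }

    IEnumerator Fade(bool toBlack)
    {
        // Placeholder: animate your screen overlay's alpha over fadeTime.
        yield return new WaitForSeconds(fadeTime);
    }
}
```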

Teleportation is best.

Turns out teleportation is completely comfortable.

So often the easiest solution is the best solution.

If you want to try either of these solutions out, they’re both included in the latest release of the Music Visualisation Experience.

Oculus Best Practices Guide

Having had further opportunity to reflect on this experience, I thought I’d post an update. Everything “discovered” here has already been discovered by those who have gone before me. When I first got the DK2 I quickly glanced at the Oculus Best Practices Guide, but didn’t really assimilate it fully. I was in a hurry!

With a few months of developing for the Rift under my belt, I took a moment to give it a more thorough reading. They summarise the problem fairly well: acceleration is to be avoided. Your vestibular system doesn’t detect any acceleration, but your eyes are telling you there is some; the disconnect between those two systems is what causes the discomfort. When you turn, you experience acceleration, thus the problems with turning.

The guide encourages movement at a constant velocity where possible, avoiding acceleration events. It also points out that head-bob is a series of small accelerations in various directions, and recommends against it. Teleportation is also suggested as a viable option.
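In code terms, that advice amounts to something as plain as this sketch (names are mine):

```csharp
using UnityEngine;

// A minimal sketch of the guide's advice: move at a constant velocity in
// a straight line, with no acceleration ramp and no head-bob.
public class ConstantVelocityMover : MonoBehaviour
{
    public Vector3 direction = Vector3.forward;
    public float speed = 1.4f; // metres per second, roughly walking pace

    void Update()
    {
        // No easing in or out: velocity is constant from the first frame.
        transform.position += direction.normalized * speed * Time.deltaTime;
    }
}
```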

I guess I should make a head-bob free version of my first person controller that walks at a constant velocity in straight lines everywhere!

Economic hurdles for Virtual Reality gaming

I think virtual reality, in particular via the Oculus Rift, is exciting. I think it allows us to experience existing things in a completely new way, and even better, it allows us to experience things that have never before been possible. It’s the market for these new, unique to VR, experiences that I’m really focusing on in this post.

However, we’ve got a problem that needs solving. There’s currently no real way to make any money by independently developing specifically for VR. Sure, you can tack on VR support to your conventional game… but without that mainstream audience, your market is just too small to be viable.

Consequently, most of the development work that is happening specifically for Oculus Rift is being done by small scale independent developers, hobbyists, and enthusiasts. What’s more, because they can’t make any money out of it, they’re doing it in their spare time. This really limits the scope, scale, and quality of offerings.

Don’t get me wrong, there’s a bunch of little guys doing great things and for the right reasons. This is great. Power to the people. Democratize game development. It’s all good.

But if we really want people to make compelling, fully featured, high quality Oculus Rift games, then there needs to be a way to make genuine amounts of money doing it.

Here’s the problem as I see it. We have an existing gaming community that already vastly underestimates how much time/energy/skill/effort/money it takes to create games.

People see games getting made with $30,000 kickstarter targets every day. They wrongly infer that $30,000 is enough money to make a game – when in reality the kickstarter campaign alone cost way more than $30,000 to put together, and doesn’t even start to cover the costs of development. Kickstarter is primarily a marketing and pre-ordering tool, not a funding tool. Even huuuuge kickstarter successes like Star Citizen and Elite: Dangerous only got a small fraction of their total cost of development covered by their kickstarter millions.

People see indie games selling on Steam for $2.50. Even major titles routinely drop under $10. They are used to paying these kinds of sums for games. $100 price tags for games are long gone.

The problem with this existing perception of the value of games is that it doesn’t hold true for games that are made for VR. Normal computer games and phone games have a target audience of potentially hundreds of millions. The VR market is a tiny, tiny, tiny fraction of this.

Currently, including DK1 and DK2 there’s somewhere between 100,000 and 200,000 VR units floating around the place. Now, obviously, these are development kits not targeted at a consumer audience, and we can safely assume that once the consumer version hits there will be many times more than that available… but, perhaps not so many as you might think.

The CPU and GPU requirements for the DK2 are, at present, already prohibitive. You need a top end computer to experience anything more than an upset stomach.

The consumer version is expected to be significantly higher resolution, and run at a higher refresh rate. This means that the CPU and GPU requirements will be even higher. If the efficiency of Oculus Rift display drivers doesn’t significantly improve (there is some potential that this might happen, hopefully sooner rather than later), then only a tiny percentage of the overall PC gaming market has a computer that will be able to deliver a good Oculus Rift experience.

When the market is so small you cannot count on shifting a lot of units to make your money. You need to price your product high if you’re going to recover the costs of development. But the community is too used to bargain bin prices. Games are cheaper than coffee now. Does anyone honestly expect people to be paying $100-200 for an Oculus Rift game?

It’s a pity.

If we assume the answer is no, then we’re left with the status quo. Hobbyists and enthusiasts putting together extremely small and functionally limited demos in their spare time. Personally, I think these experiences are unlikely to be good enough to really drive the revolution we’re hoping for.

Fortunately, there is still plenty of hope for the Oculus Rift. There are plenty of mainstream games that have a sizeable standard monitor audience who can relatively cheaply and easily sneak in Oculus Rift support, for the tiny market that wants it. There’s also a market in video content, simulations, education, etc. These applications will ensure that the Oculus Rift has a healthy market and the opportunity to develop and flourish…  but what I’m really excited about is those experiences that are unique and special to VR. Tailored for it. Refined, polished, immersive, “only for VR” experiences. Perhaps by the time CV2 or CV3 comes out there’ll be enough units on the street to seriously consider making a living by working in that space.