“The Gallery EP1: Call of the Starseed” Oculus Touch Update

Chris, the senior programmer on The Gallery, has written a blog post discussing the Oculus Touch update:

When Denny roped me into working on a VR game with him before the Kickstarter in late 2012, he posited that affordable, practical consumer VR was on the immediate horizon. I was a bit skeptical. To me, the premise of “you wear a low-resolution monitor on your face and it puts you in the game!” came across like a 3AM infomercial solution to a nonexistent problem. My initial reluctance came from a gut feeling that controlling just the rotation of the in-game camera with your head wasn’t enough. For VR to be more than a flash-in-the-pan novelty, I felt we needed control over the in-game hands too, and Denny strongly agreed.

So, before we even had access to the Oculus DK1, I went on Amazon and bought the studio’s first ever set of Razer Hydra motion controllers.

[Image: Razer Hydra motion controllers]

There wasn’t a Unity integration for it at the time, but I was able to roll my own after researching the Sixense SDK that powered the Hydra. The first time I picked up an object and tossed it into the distance with my hands in VR, I knew my gut feeling was correct: for a proper first-person VR experience, you need the minimum level of presence offered by directly controlling both the camera and your avatar’s hands. This was something very few developers had tried before.
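
For the curious, the core of such an integration is just pulling pose data out of the native library every frame and applying it to a transform. Here’s a simplified sketch; the P/Invoke signatures and struct layout are illustrative assumptions, not the actual SDK headers or our shipped code:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Simplified sketch of a hand-rolled Hydra integration. The P/Invoke
// signatures and struct layout below are assumptions for illustration;
// the real Sixense SDK exposes its data differently.
public class HydraHand : MonoBehaviour
{
    [StructLayout(LayoutKind.Sequential)]
    private struct ControllerData
    {
        public Vector3 position;    // assumed millimetres, relative to the base
        public Quaternion rotation;
        public float trigger;       // assumed 0..1
    }

    [DllImport("sixense")] private static extern int sixenseInit();
    [DllImport("sixense")] private static extern int sixenseGetNewestData(
        int controllerIndex, out ControllerData data);

    public int controllerIndex;     // 0 = left hand, 1 = right hand

    private void Start()
    {
        sixenseInit();
    }

    private void Update()
    {
        ControllerData data;
        if (sixenseGetNewestData(controllerIndex, out data) == 0)
        {
            // Convert millimetres to Unity metres and drive this hand's transform.
            transform.localPosition = data.position * 0.001f;
            transform.localRotation = data.rotation;
        }
    }
}
```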

The Gallery’s very first hand system was extremely basic. I used Unity’s built-in Mecanim inverse kinematics to make our full-body avatar’s arms and hands follow the Hydra controllers. The hands didn’t visually respond to button or trigger presses, but they could pick up and throw objects. It worked well enough as a proof of concept that we added it as a first-order feature of the game.
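
In Unity terms, that first pass looked roughly like the following sketch (not the shipped code): a MonoBehaviour that, during the IK pass, pulls the avatar’s hand bones toward transforms driven by the controllers. It assumes an Animator layer with “IK Pass” enabled:

```csharp
using UnityEngine;

// Minimal sketch of driving a full-body avatar's hands with Mecanim IK.
// Requires "IK Pass" enabled on the Animator layer.
[RequireComponent(typeof(Animator))]
public class HandIKFollower : MonoBehaviour
{
    public Transform leftTarget;   // driven by the left Hydra controller
    public Transform rightTarget;  // driven by the right Hydra controller

    private Animator animator;

    private void Awake()
    {
        animator = GetComponent<Animator>();
    }

    private void OnAnimatorIK(int layerIndex)
    {
        ApplyGoal(AvatarIKGoal.LeftHand, leftTarget);
        ApplyGoal(AvatarIKGoal.RightHand, rightTarget);
    }

    private void ApplyGoal(AvatarIKGoal goal, Transform target)
    {
        if (target == null) return;

        // Full weight: the hand snaps to the tracked controller pose.
        animator.SetIKPositionWeight(goal, 1f);
        animator.SetIKRotationWeight(goal, 1f);
        animator.SetIKPosition(goal, target.position);
        animator.SetIKRotation(goal, target.rotation);
    }
}
```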

[Image: Denny]

After the Kickstarter, polishing the hand system became a priority, starting with visual responses to controller input. We wanted the fingers to flex when the trigger was pressed, and we wanted the hands to conform to held items. We didn’t have a dedicated animator on staff, so I designed the initial system to work around that: I cut the hands off our full-body avatar and created new, separate hand objects that the design team could pose directly in Unity.

A set of static reference poses (open hands, closed hands, various item handholds) was created, and I coded a hand pose management system that blended the on-screen hands between a pair of active poses selected from that set. For example, the empty hands consisted of an open, relaxed “from” pose and a closed-fist “to” pose, and the management system read the pressure applied to the controller’s trigger: the visible hand was open and relaxed when no trigger pressure was detected, and closed into a fist when the trigger was fully depressed.
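
Stripped down to its essentials, that blending logic looks something like this sketch, with illustrative names standing in for the real system:

```csharp
using UnityEngine;

// Sketch of static-pose blending: each pose is a saved set of local joint
// rotations, and the visible hand is interpolated between a "from" and "to"
// pose by the controller's trigger pressure. Names are illustrative.
[System.Serializable]
public class HandPose
{
    public Quaternion[] jointRotations; // one entry per finger joint
}

public class HandPoseBlender : MonoBehaviour
{
    public Transform[] joints;   // finger joints on the visible hand
    public HandPose fromPose;    // e.g. open and relaxed
    public HandPose toPose;      // e.g. closed fist, or an item handhold

    // Called every frame with the trigger axis (0 = released, 1 = fully pressed).
    public void Blend(float triggerPressure)
    {
        for (int i = 0; i < joints.Length; i++)
        {
            joints[i].localRotation = Quaternion.Slerp(
                fromPose.jointRotations[i],
                toPose.jointRotations[i],
                triggerPressure);
        }
    }
}
```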

[Image: hands]

That system underwent several minor iterations and improvements over the three-plus years leading up to launch, and when Call of the Starseed shipped, the hands were the second-oldest surviving subsystem after our controller input abstractions.

But we knew it could be better, and we were waiting for a good opportunity to improve it.

That opportunity came after Call of the Starseed shipped and we were well into production of Heart of the Emberstone. We hired Steven Blomkamp as an animator earlier this year, and one of the major things on our hand system wishlist was to transition from the old, static pose blending system to one where the hands could be animated. We felt this would give them more life and polish, and that it would step presence up to a new level.

For these new hands, I redesigned the system almost from the ground up, taking into account the wishlist our designers had been maintaining and working closely with Steven to get the most we could out of our respective software toolsets. All-new hands were modeled by Steven, and an all-new set of hand rigs was developed by Steven and me. We migrated to a fully animated Mecanim state machine setup to control the visual behavior of the hands.
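
The practical upshot of that migration is that the Animator Controller owns the hand states and transitions, and gameplay code only feeds it parameters. A rough sketch, with assumed parameter names rather than the ones we actually use:

```csharp
using UnityEngine;

// Sketch of a parameter-driven hand: the Animator Controller holds the
// states (Empty, Holding, Pointing, ...) and blend trees; this script only
// writes the parameters that drive them. Parameter names are assumptions.
public class AnimatedHand : MonoBehaviour
{
    private static readonly int GripParam = Animator.StringToHash("Grip");
    private static readonly int HoldingParam = Animator.StringToHash("Holding");
    private static readonly int ItemTypeParam = Animator.StringToHash("ItemType");

    public Animator animator;

    public void UpdateHand(float trigger, bool holding, int itemType)
    {
        animator.SetFloat(GripParam, trigger);        // drives open/close blend trees
        animator.SetBool(HoldingParam, holding);      // transitions between states
        animator.SetInteger(ItemTypeParam, itemType); // selects per-item handhold states
    }
}
```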


What that means is that each hand state (empty, holding this item, holding that item, pointing at items, and so forth) can now support subtle idle animations and more complex state behaviors. Hands now respond naturally to nearby items and interactions with an anticipatory reach animation, where the fingers splay out as if preparing to grasp something. The new hands have a lot more life to them, and we love where the new interaction system may take us in future episodes.
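
One simplified way to picture the anticipation logic (a sketch under assumptions, not a verbatim excerpt) is a per-hand script that samples nearby interactables and feeds a smoothed parameter into the Animator:

```csharp
using UnityEngine;

// Sketch of anticipatory reach: sample nearby interactable items and feed
// a smoothed 0..1 "Reach" parameter into the Animator so the fingers splay
// as the hand approaches something graspable. The parameter name, radius,
// and layer setup are illustrative.
public class ReachAnticipation : MonoBehaviour
{
    private static readonly int ReachParam = Animator.StringToHash("Reach");

    public Animator animator;
    public float reachRadius = 0.25f;   // metres
    public LayerMask interactableLayers;

    private readonly Collider[] hits = new Collider[8];

    private void Update()
    {
        int count = Physics.OverlapSphereNonAlloc(
            transform.position, reachRadius, hits, interactableLayers);

        float reach = 0f;
        for (int i = 0; i < count; i++)
        {
            // Closer items produce a stronger splay.
            Vector3 closest = hits[i].ClosestPointOnBounds(transform.position);
            float t = 1f - Vector3.Distance(transform.position, closest) / reachRadius;
            reach = Mathf.Max(reach, Mathf.Clamp01(t));
        }

        // Damped so the fingers ease in and out rather than snapping.
        animator.SetFloat(ReachParam, reach, 0.1f, Time.deltaTime);
    }
}
```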

Another exciting addition is that we’ve made the hands customizable, with more inclusive avatar options. We now have both male and female hands, in both bare and gloved states, each with multiple skin colours to choose from. We used the Fitzpatrick scale as a starting point for deciding which skin colours to develop; it’s the same scale used as the basis for Unicode emoji with selectable skin tones.
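
As a toy illustration, the selection itself can be as simple as indexing into a set of tints; the colour values below are placeholders rather than the ones our artists authored:

```csharp
using UnityEngine;

// Illustrative sketch of skin-tone selection: six tints loosely following
// the Fitzpatrick scale (types I-VI) applied to the hand material.
// The colour values are placeholders, not shipped data.
public class HandSkinTone : MonoBehaviour
{
    public Renderer handRenderer;

    // Placeholder tints ordered from Fitzpatrick type I to type VI.
    public Color[] fitzpatrickTints =
    {
        new Color32(0xF5, 0xD5, 0xC0, 0xFF),
        new Color32(0xE8, 0xC2, 0xA8, 0xFF),
        new Color32(0xD9, 0xA6, 0x82, 0xFF),
        new Color32(0xB0, 0x7B, 0x4F, 0xFF),
        new Color32(0x8D, 0x5A, 0x2B, 0xFF),
        new Color32(0x5C, 0x3A, 0x21, 0xFF),
    };

    public void SetTone(int fitzpatrickType) // 1..6
    {
        handRenderer.material.color =
            fitzpatrickTints[Mathf.Clamp(fitzpatrickType - 1, 0, 5)];
    }
}
```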

Developing these new hands gives us the potential to add more complex use-state animations for items like the flare gun and tape player, and to add per-finger control to support finger tracking on future controllers. The overhaul also let us improve the handholds on several items by making them more responsive to your hands.

On bottles and similar large items, we can now anticipate your incoming hands, predict where you will grasp an item, and adjust the handhold position and orientation accordingly. This part of the system is still a work in progress and will be used to fuller effect in future episodes. Steven and I are constantly discovering and testing things we could do in future iterations, and I’m already planning refinements to extend the anticipation and prediction algorithms, from simple hand position and orientation changes to a more fluid level of adaptiveness at the finger level. I’m hoping these efforts will bear fruit.
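
In simplified form, that kind of prediction might look like the sketch below; the anchor-based approach, the look-ahead window, and every name in it are illustrative assumptions:

```csharp
using UnityEngine;

// Sketch of handhold prediction on large items like bottles: extrapolate
// the hand's position a short time ahead from its velocity, pick the nearest
// of several authored grasp anchors, and ease the handhold toward it.
public class HandholdPredictor : MonoBehaviour
{
    public Transform[] graspAnchors;   // authored grasp points along the item
    public Transform handhold;         // transform the hand snaps to on grab
    public float lookAheadSeconds = 0.15f;
    public float easeSpeed = 10f;

    public void UpdatePrediction(Vector3 handPosition, Vector3 handVelocity)
    {
        // Predict where the hand will be a moment from now.
        Vector3 predicted = handPosition + handVelocity * lookAheadSeconds;

        // Find the authored anchor nearest the predicted contact point.
        Transform best = graspAnchors[0];
        float bestDist = float.MaxValue;
        foreach (Transform anchor in graspAnchors)
        {
            float d = (anchor.position - predicted).sqrMagnitude;
            if (d < bestDist) { bestDist = d; best = anchor; }
        }

        // Ease the handhold toward the chosen anchor's pose so the
        // adjustment reads as natural rather than snapping.
        handhold.position = Vector3.Lerp(
            handhold.position, best.position, easeSpeed * Time.deltaTime);
        handhold.rotation = Quaternion.Slerp(
            handhold.rotation, best.rotation, easeSpeed * Time.deltaTime);
    }
}
```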

It was a challenging, delicate operation to replace the entire hand system, but it was more than worth it in the end! This brings us several steps closer to the system we’ve always wanted, and I can’t wait to see where it ends up a year from now.

The majority of the hand update is scheduled to go live with the release of Call of the Starseed on Oculus Touch, and will be added to the Steam version of Call of the Starseed in a content update near launch.

The Gallery: Call of the Starseed Site