VR + Tobii Eye Tracking SOLUS Demo — Part Three

Part Three of our series on the eye tracking implementation in the SOLUS VR Demo.

Gaze Highlighting & Interaction

Another interesting property of the GTOM (Gaze-To-Object-Mapping) system I talked about in the paragraph about foveated depth of field is that it allows for some interesting new ways for the player to interact with objects in games. In desktop games, we are usually relegated to a crosshair at the center of the screen that the player has to align with the object they want to pick up.

VR has made this a bit more intuitive: the most common interaction now is to simply press the grab button when your virtual hand is close to the object you want.

In SOLUS, though, you spend a lot of your time picking objects up, and most of the stuff you want is on the ground, so Teotl decided to be nice to the player: the game defaults to a selection ray, where you point at an object with a world-space ray, and if you hit it, you can interact with it. It can still be kind of tedious to have to aim this ray at objects every time you want to pick something up, so in the demo we decided to leverage the GTOM system to let you select and pick up objects just by looking at them, up to a reasonable distance.
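To make that concrete, here is a minimal sketch of what distance-limited gaze selection could look like. This is not the demo's actual code, and every name in it is hypothetical; a real GTOM system does much more than compare angles (filtering a noisy gaze signal, testing against actual object geometry, and so on), but the basic shape of the problem is the same: find the candidate closest to the gaze direction within a small cone, and reject anything out of pickup range.

```cpp
#include <algorithm>
#include <cmath>
#include <optional>
#include <vector>

// Hypothetical types; a real engine supplies its own vectors and objects.
struct Vec3 { float x, y, z; };

static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(Vec3 v)            { return std::sqrt(Dot(v, v)); }
static Vec3  Normalized(Vec3 v)        { float l = Length(v); return {v.x / l, v.y / l, v.z / l}; }

struct GazeRay { Vec3 origin; Vec3 direction; };  // direction assumed normalized
struct Pickup  { int id; Vec3 center; };

// Pick the object whose center lies closest to the gaze direction, but only
// if it falls within a small angular cone and within pickup range.
std::optional<int> GazePickTarget(const GazeRay& gaze,
                                  const std::vector<Pickup>& pickups,
                                  float maxAngleRadians,
                                  float maxDistance)
{
    std::optional<int> best;
    float bestAngle = maxAngleRadians;
    for (const Pickup& p : pickups) {
        Vec3 offset = p.center - gaze.origin;
        if (Length(offset) > maxDistance)
            continue;  // too far away to grab by gaze
        float cosAngle = std::clamp(Dot(gaze.direction, Normalized(offset)), -1.0f, 1.0f);
        float angle = std::acos(cosAngle);
        if (angle < bestAngle) { bestAngle = angle; best = p.id; }
    }
    return best;
}
```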

This presented some interesting challenges largely unique to SOLUS, though. The first is that you can inspect objects by targeting them with the interaction ray and then reading about whatever you have in your sights on your PDA.

Now, this becomes very tricky if you tie it to gaze highlighting/selection, since you would have to persist the object selection from the moment you look at the object you want to inspect until your eyes land on the PDA. This works in that scenario, but having object selection persist like this when you are not interested in inspecting the object can feel very sluggish and unresponsive.
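To illustrate the trade-off, here is a hypothetical sketch (again, not the demo's code) of what persisting a gaze selection across a look-away involves. The grace period has to be long enough to survive the glance from the object down to the PDA, and that same length is exactly what makes deselection feel sluggish everywhere else:

```cpp
#include <optional>

struct GazeSelection
{
    std::optional<int> selectedId;
    float secondsSinceLookedAway = 0.0f;
    float gracePeriod = 0.75f;  // assumed value; would need careful tuning

    // Call once per frame with the currently gazed-at object (if any).
    void Tick(std::optional<int> gazedId, float deltaSeconds)
    {
        if (gazedId) {
            selectedId = gazedId;            // eyes are on an object: select it
            secondsSinceLookedAway = 0.0f;
        } else if (selectedId) {
            secondsSinceLookedAway += deltaSeconds;
            if (secondsSinceLookedAway > gracePeriod)
                selectedId.reset();          // grace period expired: deselect
        }
    }
};
```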

The other problem is that SOLUS is rife with world-space crafting. To smash open a can, for example, you get a special crafting marker on the can if you happen to be holding the correct crafting item (such as a rock). Some items in the game have multiple such points, and the screen-space real estate can get quite crowded with them. So crowded, in fact, that it becomes a problem for eye tracking precision: depending on the distance to the object, the area allotted for picking up the can and the area for smashing it can become too small, making it very hard to determine what the user wants to do purely from eye tracking data, even with very advanced GTOM techniques.
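The underlying geometry is easy to see with a back-of-the-envelope calculation. What matters for the tracker is a target's angular size, which shrinks with distance; the marker dimensions and the accuracy figure below are purely illustrative assumptions, not measured values:

```cpp
#include <cmath>
#include <cstdio>

// Angular size (in degrees) of a flat target of width `widthMeters`
// viewed head-on from `distanceMeters` away: 2 * atan(w / 2d).
float AngularSizeDegrees(float widthMeters, float distanceMeters)
{
    return 2.0f * std::atan(widthMeters / (2.0f * distanceMeters))
                * 180.0f / 3.14159265f;
}

int main()
{
    // A hypothetical 5 cm crafting marker on a can:
    std::printf("At 0.5 m: %.2f deg\n", AngularSizeDegrees(0.05f, 0.5f)); // ~5.73 deg
    std::printf("At 3.0 m: %.2f deg\n", AngularSizeDegrees(0.05f, 3.0f)); // ~0.95 deg

    // Once a marker's angular size approaches the tracker's effective
    // accuracy (assumed here to be on the order of a degree), gaze alone
    // can no longer reliably tell adjacent markers apart.
    return 0;
}
```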

For both of these reasons, we decided it would be best to leave the interaction ray as it is and use the eye tracker only for picking up items. Additionally, we made sure the interaction ray always overrides gaze-based highlighting and selection, to minimize confusion.
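In sketch form (names hypothetical), the resulting priority scheme is as simple as it sounds: the ray keeps full interaction rights and always wins, while gaze is only consulted for pickups when the ray hits nothing.

```cpp
#include <optional>

struct InteractionTargets
{
    std::optional<int> rayHitId;   // object under the world-space ray, if any
    std::optional<int> gazeHitId;  // object under the gaze, if any
};

// Which object should be highlighted this frame?
std::optional<int> ResolveHighlight(const InteractionTargets& t)
{
    if (t.rayHitId)
        return t.rayHitId;   // the interaction ray always overrides gaze
    return t.gazeHitId;
}

// Gaze may only trigger pickups, and only when the ray is not in use.
bool CanPickUpByGaze(const InteractionTargets& t)
{
    return !t.rayHitId && t.gazeHitId.has_value();
}
```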

It is always interesting to see how a game's individual mechanics and environment affect the way eye tracking needs to be crafted to fit well.

In the case of SOLUS, this is what we decided to do, while in another game, the correct solution might be something completely different.

For now, I hope you enjoyed the demo and this small look into how we made it.
Best regards,

Fredrik Lindh,
Game Developer, Tobii Eye Tracking

Read Part One

Read Part Two

Interested in integrating eye tracking in VR?

Written by

Tobii Gaming

Hi, we are a bunch of developers, QA Engineers, customer support, product managers, and more (mostly gamers) aiming to combine the wonder, competitiveness, and creativity of gaming with the technology of eye tracking. Over the past decade, we have built eye trackers to revolutionize the way we play, creating a vital tool for competitive gaming, and empowering a new generation of content creators, as well as their followers. We love to game, we love eye tracking.