- Tobii Gaming
- March 30, 2017 | 3 min
VR + Tobii Eye Tracking SOLUS Demo — Part two
Part two of our eye tracking implementation in the SOLUS VR Demo.
Something that is essential to the human experience is eye contact with other people.
Next time you are talking to a friend, colleague, or family member, try to avoid making eye contact for the entire conversation, just to see how they react. Or better yet, maintain eye contact the whole time! As a finisher, try making eye contact only when it doesn’t make sense, or at random, for additional lulz.
As a disclaimer though, I take no responsibility for the ensuing awkwardness and possible repercussions that this behavior might inflict on your personal relationships.
It is a bit unfortunate, then, that even though we all know this is behavior to avoid in real life, solving the problem well for NPCs interacting with the player in a game has simply not been possible. Either you have characters that never engage in eye contact, or characters that hold constant eye contact, neither of which is very attractive. A popular choice these days is to have characters establish eye contact at random. In games with first-person cut scenes, directors sometimes even try to guess when the player might seek eye contact and slot those moments in where they hope they will have the desired impact. But since none of these solutions consider the player’s actual attention, even when done really well (Elizabeth in BioShock Infinite comes to mind), the result is still not 100% convincing. Unless, of course, they have access to an eye tracker!
In the case of SOLUS, there really are no other humanoid NPCs to meet, so we chose to implement this feature in some of the alien flora and artifacts that you encounter instead.
Looking at certain artificial objects, like the camera drones at the start of the demo, will alert you to their presence and make them track you, provided you are close enough.
Teleport @ Gaze
As of late, it has become quite popular to solve the locomotion problem in VR games with teleport mechanics. There are a couple of varieties: some games use player-placed teleport markers on the ground (SOLUS is one of them), others use fixed teleport locations or portals that you step through, while yet others opt for more traditional controls.
Personally, for games that cannot work in a one-room environment (like SOLUS), I tend to prefer teleportation compared to the other options, but I have always felt that teleporting multiple times in a row in the same direction is quite tedious.
Games cannot really offer a good “teleport forward” button either, since that would leave you no ability to course-adjust. Unless they support eye tracking, of course!
The math here is quite simple, although care must be taken so the player cannot accidentally fall through the ground or teleport to places that should be out of reach. The way we decided to do it in SOLUS was to first deproject the gaze point into the world. If the resulting angle was too steep (above the horizon plane), we projected the direction vector onto the horizon plane to prevent the player from teleporting up cliffs, or other silly things like that.
We then did a line cast 5 meters in this direction. If we hit something, we chose that point right away; if we didn’t, we did a second line cast along the world’s down axis to find the ground and picked that point. To avoid clipping into the world, we then moved this point slightly back along the hit normal and used that as our end target.
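The steps above can be sketched roughly as follows. This is not the actual SOLUS code (the demo was not written in Python), and `raycast` here is a hypothetical stand-in for whatever line-cast call the engine provides, assumed to return a hit point and surface normal, or nothing:

```python
import math

def pick_teleport_target(eye, gaze_dir, raycast, max_dist=5.0, clearance=0.1):
    """Pick a teleport target from a gaze direction.

    `raycast(origin, direction, max_dist)` is a hypothetical engine call
    returning ((x, y, z) hit point, (nx, ny, nz) hit normal) or None.
    Coordinates are (x, y, z) with +y as world up.
    """
    dx, dy, dz = gaze_dir
    # If the gaze points above the horizon plane, project the direction
    # onto the plane so the player cannot teleport up cliffs.
    if dy > 0.0:
        dy = 0.0
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0.0:
        return None  # gaze was straight up; no sensible target
    dx, dy, dz = dx / length, dy / length, dz / length

    # First line cast: 5 meters along the (flattened) gaze direction.
    hit = raycast(eye, (dx, dy, dz), max_dist)
    if hit is None:
        # Nothing within range: cast straight down from the far point
        # to find the ground.
        far = (eye[0] + dx * max_dist, eye[1] + dy * max_dist,
               eye[2] + dz * max_dist)
        hit = raycast(far, (0.0, -1.0, 0.0), float("inf"))
        if hit is None:
            return None
    (px, py, pz), (nx, ny, nz) = hit
    # Nudge the target slightly back along the hit normal so the player
    # doesn't clip into geometry.
    return (px + nx * clearance, py + ny * clearance, pz + nz * clearance)
```

A downward-aimed gaze resolves via the first cast; a gaze at or above the horizon falls through to the drop-down cast from the 5-meter point.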