Latest Eye Tracking Developments in AR, VR and MR

Introducing Our New Blog Series for XR Devs 

With exciting leaps in both hardware and software happening at such a rapid pace right now, we know plenty of you devs are keen to get ahead of the curve and understand how eye tracking can enhance virtual interactions, enable innovative new metaverse experiences, and give users greater immersion than ever before. 

This blog is the first in a new series written for XR software and game developers, and today we will be bringing you up to speed with some of the latest awesome projects we’re proud to see our technology power. Read on to find out about our eye tracking for avatars, learn about the first XR game to showcase eye tracking as core functionality, and catch up on some fascinating event presentations ranging from next-gen game development to pilot training in the enterprise world. 

We’ll be talking more and more about developing for AR, VR, and MR devices over the coming months, so be sure to subscribe to our mailing list over on our XR Developers page and you’ll be the first to hear each time. 

Avatars in the Metaverse and Beyond 

From World of Warcraft characters to Second Life alter-egos, the digital avatar has long been a way for online communicators and especially gamers to manifest themselves online in any way they desire. To create an avatar is to express yourself and how you want to be seen, whether that be through replicating how you look in the physical world or designing all sorts of creative new looks, from the whimsical to those expressing how you feel on the inside. 

As we enter the realms of extended and virtual reality, the importance of avatars will reach even greater heights, as we will increasingly find ourselves face to face with the virtual avatars of others, whether in next-gen AAA games for VR headsets or in innovative new tools for communication and collaboration among colleagues. 

Eye tracking adds an extra layer of reality to avatars and virtual characters; eye contact and eye movements both play a hugely significant role in how we communicate with each other, and by enabling them in XR we extend these essential aspects of humanity online. Allowing users to adopt more true-to-life expressions and appear more authentically present adds further to the incredible immersion possible in the metaverse. 

Tobii x LIV x Ready Player Me 

In March, we announced an exciting partnership with LIV, the leader in XR game streaming, and Ready Player Me, an avatar platform which lets users create 3D avatars for use in hundreds of different XR apps and games. "Designing avatars is not just about creating beautiful faces, bodies, and peripherals, but also about reflecting and communicating real-time attention and emotions of a user with others" said Timmu Tõke, Co-Founder and CEO of Ready Player Me at the time – and we couldn’t agree more. 

The first experiment in this collaboration features the game Racket: Nx and provides an early demonstration of how real-time eye movement in a streamer’s avatar can give viewers insight into their gameplay. Check out the video below.

 

Experimenting with MetaHuman 

MetaHuman Creator is a tool for creating hyper-realistic characters in Unreal Engine, and to explore just how well eye tracking adds to that realism, Ajinkya Waghulde – one of our senior engineering managers here at Tobii – blogged about integrating our eye tracking into MetaHuman avatars, with an impressive video example you can see here: Tobii MetaHuman avatar demo. 

Take a look at how Ajinkya did it in that blog post, and then why not dive straight into our Tobii XR Devzone’s Social Use Cases area to learn more? 
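
If you’re curious what driving an avatar’s eyes with gaze data can look like in code, here is a minimal, engine-agnostic sketch that turns a head-local gaze direction into eye rotation angles. Everything in it – the GazeSample type, the coordinate convention, the function names – is a hypothetical illustration on our part, not an excerpt from the Tobii SDKs or from Ajinkya’s MetaHuman integration.

```python
from __future__ import annotations

import math
from dataclasses import dataclass


@dataclass
class GazeSample:
    """A single gaze sample in head-local space (hypothetical type)."""
    direction: tuple[float, float, float]  # unit vector: +x right, +y up, +z forward
    is_valid: bool                         # e.g. False during a blink


def gaze_to_eye_angles(sample: GazeSample) -> tuple[float, float] | None:
    """Convert a head-local gaze direction into (yaw, pitch) in degrees,
    suitable for driving an avatar's eye bones."""
    if not sample.is_valid:
        return None  # keep the previous pose or ease back to neutral
    x, y, z = sample.direction
    yaw = math.degrees(math.atan2(x, z))                   # left/right rotation
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))  # up/down rotation
    return yaw, pitch


# Example: a gaze slightly up and to the right of straight ahead.
print(gaze_to_eye_angles(GazeSample((0.2, 0.1, 0.97), True)))
```

In a real integration, you would typically clamp the angles to the avatar’s eye range of motion and smooth them over a few frames before applying them to the eye bones.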


Eye Tracking at Recent Events 

Starblazer, one of the first VR games to prominently feature eye tracking 

Last week I was at AWE USA 2022 in sunny California with our team, meeting partners, devs, and other XR industry leaders. One of the highlights for me was the presentation I got to do with our good friends at Starcade Arcade, who talked about their exciting game Starblazer and how Tobii eye tracking will be arriving in this summer’s Starblazer: Factions. 

Alexander Clark, Starcade Arcade’s founder, talks through three key eye tracking features that are part of Starblazer’s upcoming refresh, including attentive and reactive UI, natural interactions with objects, and more. He also shares helpful lessons the team learned during development, to give you a head start when it comes to developing your own games and apps.  
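
To make the attentive and reactive UI idea a little more concrete, here is a small sketch of the gaze-dwell pattern that commonly sits behind it: a UI element reacts once the user’s gaze has rested on it for a short threshold. This is a generic illustration built on our own assumptions (the DwellDetector class, the 0.25-second threshold), not Starcade Arcade’s actual implementation or a Tobii SDK API.

```python
from __future__ import annotations

import time

DWELL_THRESHOLD_S = 0.25  # assumed value; tune per interaction


class DwellDetector:
    """Tracks how long gaze has rested on the same target (hypothetical helper)."""

    def __init__(self, threshold_s: float = DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self._target: str | None = None
        self._since = 0.0

    def update(self, gazed_target: str | None, now: float | None = None) -> str | None:
        """Feed the currently gazed-at target each frame; returns the target
        once gaze has dwelled on it long enough, otherwise None."""
        now = time.monotonic() if now is None else now
        if gazed_target != self._target:
            self._target, self._since = gazed_target, now
            return None
        if gazed_target is not None and now - self._since >= self.threshold_s:
            return gazed_target
        return None


# Example: a button highlights after roughly 0.3 s of steady gaze.
detector = DwellDetector()
for t in (0.0, 0.1, 0.2, 0.3):
    hit = detector.update("launch_button", now=t)
    if hit:
        print(f"highlight {hit} at t={t}s")
```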

I’d recommend not closing the video when the presentation finishes, or you’ll miss Alexander’s answers to some fantastic audience questions during the Q&A. Watch the video here.

Unity’s GDC Presentation on Building Games for PSVR2 

This video isn't quite as hot off the presses, but is no less interesting! At GDC 2022 in March, Unity gave a fascinating presentation about building next-gen games for the much-hyped PlayStation VR2. 

The whole presentation is well worth a watch, but you won’t be surprised to know that what caught our attention most was when Jono Forbes, Senior Engineering Manager in the XR Team at Unity, said “I’ve saved what is, to me, the most exciting new design space for last, which is eye tracking of course”. 

It’s always great to see we’re not the only ones excited about eye tracking, and Jono does a great job explaining how Unity sees eye tracking becoming a key part of the coming generation of VR headsets. We’ve embedded the video below starting at the eye tracking section, 31m22s in, but feel free to drag back to the start to catch the full presentation. 

 

Enterprise x Eye Tracking: Pilot Training  

Much of this blog has focused on the gaming side of XR, but eye tracking also has plenty to offer elsewhere, including in the enterprise sector. 

Another highlight from AWE was the presentation given by Rick Parker, CTO of Visionary Training Resources (VTR). Rick talks about how VTR is disrupting the early stages of pilot training to help fill a worldwide shortage of pilots, and how Tobii eye tracking has enabled performance tracking and demonstrated ROI, helping make their business case to major airlines. You can find this Tobii x Visionary Training Resources video here.
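
To give a flavour of what performance tracking with eye tracking can look like under the hood, here is a hedged sketch that totals dwell time per instrument area of interest (AOI) from a stream of timestamped gaze samples. The AOI names and the metric itself are illustrative assumptions on our part, not VTR’s actual methodology.

```python
from __future__ import annotations

from collections import defaultdict


def dwell_time_per_aoi(samples: list[tuple[float, str | None]]) -> dict[str, float]:
    """samples: (timestamp_seconds, aoi_name_or_None) pairs, sorted by time.
    Returns total seconds of gaze spent on each AOI (hypothetical metric)."""
    totals: dict[str, float] = defaultdict(float)
    for (t0, aoi), (t1, _) in zip(samples, samples[1:]):
        if aoi is not None:
            totals[aoi] += t1 - t0
    return dict(totals)


# Example: a four-second scan across two (made-up) cockpit AOIs.
scan = [(0.0, "airspeed_indicator"), (1.5, "altimeter"), (3.0, None), (4.0, None)]
print(dwell_time_per_aoi(scan))  # {'airspeed_indicator': 1.5, 'altimeter': 1.5}
```

Metrics along these lines are what let a training platform show, rather than guess, whether a trainee is scanning the right instruments at the right time.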

 

The Tobii XR Devzone 

As you may already know, we have an entire Devzone dedicated to augmented reality, mixed reality and virtual reality eye tracking integrations – featuring documentation, guides, demos, and more. Check it out here: The Tobii XR Devzone. 

We will shortly be launching a survey to gather feedback and suggestions from any devs using or planning to use the XR Devzone, so don’t forget to register for our newsletter if you haven’t already or check back soon for a link on the Devzone itself.  

Written by

Johan Bouvin

Hi, I am the director of product — XR software and ecosystem at Tobii, which means I work with developers globally to understand how their eye tracking needs are evolving, ensuring that we continue to develop our SDKs, developer tools, and XR platform to align with the rapidly expanding needs of the industry. I get to collaborate with great people in different lines of work. The more varied the individuals, the more interesting the results.