Top three most sought-after technologies in a single headset

  • by Doug Eggert
  • 6 min

Tobii - XR Industry Survey 2019

Which components and functions do you believe are most important to develop, improve, and integrate for XR to become more widely adopted?

According to the 2019 XR Industry Survey, the top three components and functions that will drive XR adoption are 1) headset comfort and usability, 2) field of view, and 3) eye tracking. The design of HP’s latest VR headset, the HP Reverb G2 Omnicept Edition, delivers on all three. With a 114-degree field of view, eye tracking, PPG heart-rate monitoring, and a face camera to capture lip movements, I believe this headset is a clear representation of next-generation VR.

Built on an already fantastic headset — the HP Reverb G2 — the Omnicept Edition is a commercial headset capable of capturing unbiased insights, enabling advanced VR applications in training, wellness, design, and collaboration.

As Tobii’s director of product management for XR OEM, I found the development project for this headset a great experience because it allowed me to work closely with HP. My team’s objective was to integrate our Tobii VR4 Eye Tracking Platform, enhancing the headset with Tobii eye tracking technology and ensuring fusion with the other new sensors.

In this post, I share some of the thinking behind this exceptional new headset and why it’s relevant now, together with thoughts and experiences from the people I collaborated with at HP — Henry Wang (product manager) and Lauren Domingo (field software engineer).

Generating unbiased opinions

One of the essentials for digital transformation and operational optimization (Industry 4.0) is meaningful real-time data. To fulfill the need for data, almost all things are equipped with sensors — discreet devices that quietly generate trillions of unbiased data points. Spearheaded by the HP Reverb G2 Omnicept Edition, the coming generation of VR headsets will include sensors that create new inputs for application development.

Lauren and Henry, how do you see the world of sensors evolving? HP Reverb G2 Omnicept Edition sets a new standard, but how will headsets develop from here?

We believe in the benefits that sensor insights bring to VR. These types of sensor-based features will be standard on enterprise headsets. For example, eye tracking brings us pupillometry data, such as pupil dilation, and the PPG sensor gives us heart rate. The face camera is kind of a first in VR. It allows us to capture the lower part of the face, which, together with the gaze data from eye tracking, will help ISVs develop realistic avatars.
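As a small illustration of the avatar use case (my own sketch, not the SDK’s API), the combined gaze direction an eye tracker reports can be converted into the yaw and pitch angles an avatar rig applies to its eye bones:

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical input: a combined gaze direction in head-local space,
// as an eye tracker might report it (unit vector; +z forward, +y up).
struct Vec3 { double x, y, z; };

// Convert a gaze direction into the yaw/pitch angles (radians) that an
// avatar rig could apply to its eye bones.
void gazeToEyeAngles(Vec3 g, double& yaw, double& pitch) {
    yaw   = std::atan2(g.x, g.z);  // left/right rotation
    pitch = std::asin(g.y);        // up/down rotation (g must be unit length)
}

int main() {
    const double kRadToDeg = 180.0 / 3.14159265358979;
    Vec3 gaze{0.2, 0.1, 0.975};    // looking slightly right and up
    double yaw = 0, pitch = 0;
    gazeToEyeAngles(gaze, yaw, pitch);
    std::printf("eye yaw %.1f deg, pitch %.1f deg\n",
                yaw * kRadToDeg, pitch * kRadToDeg);
}
```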

Acquiring specialized skills

Image courtesy of HP — learning to manage stressful situations

One of the side effects of Industry 4.0, with its productivity optimization and automation, is a constant shift in skills requirements and a persistent need for re-education. A study conducted by the University of Maryland in the US concluded that information-retention rates are higher for virtualized education than for traditional computer-based learning. I can relate to this because the immersion and realism of some VR experiences make learning exciting and memorable. VR immersion allows me to focus on the task, and its authenticity helps me gain a practical understanding of new concepts. By adding physical movement and range of motion into the mix, VR can help me, for example, develop the dexterity required for new tasks.

When it comes to specialized skills, like surgical procedures or managing stressful situations, it seems that people who train with VR make fewer mistakes than people educated through traditional means.

Lauren and Henry, training is one of four key areas identified for the development of this headset. Can you talk about the feedback provided by sensor data and why it is compelling for training in VR?

Each sensor brings its own benefits to VR, but with the HP Reverb G2 Omnicept SDK it is possible to combine inputs at a higher level, giving us a cognitive load output. Cognitive load is best leveraged by training applications, such as pilot training or learning how to deal with emergency situations such as a gas leak. It allows you to adapt training by, for example, adding more context to a situation or letting the user back up and try something simpler. Providing developers with the data to ensure that employee training is successful has been one of our motivating drivers. Sensor feedback and eye tracking provide the data to gauge training performance: to assess whether a person has gained the required knowledge and how they react, making the most of every minute of training and maximizing productivity by putting the human back at the center. Not one-size-fits-all, but personalized.
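To make that adaptive loop concrete, here is a minimal sketch of the pattern a training application might follow. The `SensorClient` and `CognitiveLoadSample` types are hypothetical stand-ins (with a simulated data source so the sketch runs), not the actual HP Omnicept SDK API:

```cpp
#include <iostream>
#include <random>

// Hypothetical stand-in for the SDK's fused cognitive load message;
// the real HP Omnicept SDK defines its own types.
struct CognitiveLoadSample {
    double load;        // normalized: 0.0 (low) .. 1.0 (high)
    double confidence;  // reliability of this estimate, 0.0 .. 1.0
};

// Simulated sensor client so the sketch runs without hardware.
struct SensorClient {
    std::mt19937 rng{42};
    std::uniform_real_distribution<double> dist{0.0, 1.0};
    CognitiveLoadSample nextCognitiveLoad() { return {dist(rng), dist(rng)}; }
};

int main() {
    SensorClient client;
    for (int tick = 0; tick < 10; ++tick) {
        CognitiveLoadSample s = client.nextCognitiveLoad();
        if (s.confidence < 0.5) continue;  // skip unreliable estimates

        if (s.load > 0.8)
            std::cout << "overloaded  -> add context, simplify the task\n";
        else if (s.load < 0.3)
            std::cout << "coasting    -> raise difficulty\n";
        else
            std::cout << "in the zone -> keep the scenario as is\n";
    }
}
```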

Harnessing user responses

Sensor fusion sets the HP Reverb G2 Omnicept Edition apart from all other headsets. Aggregated sensor data will help application developers personalize the VR experience by harnessing the user’s natural responses, removing the barriers and apprehension that some people associate with virtual and mixed reality.

Image courtesy of HP — HP Reverb G2 Omnicept Edition, expanded view

We may not always realize it, but we typically push ourselves beyond our capacity to process new information in unfamiliar situations. Through practice, we develop familiarity, and our cognitive load drops to manageable levels. VR provides a controlled space to practice in — a realistic and safe environment where we can repeat, learn, and minimize errors.

Lauren and Henry, can you give us some insight on why you chose to include eye tracking in this headset?

We were starting to see a surge in new use cases, particularly in the enterprise space, where eye tracking would be a necessary technology to contextualize experiences. And honestly, our customers have been asking for it. Since we announced the HP Reverb G2 back in May, one of the most common questions we’ve been asked is whether there’s going to be an eye tracking version. It’s great that we can now say yes to that question, and more!

XR — a catalyst for digital collaboration

The sudden shift to working from home has helped mitigate the economic impact of COVID-19. The timing of the HP Reverb G2 Omnicept Edition is a testament to the human ability to collaborate under new circumstances. And the past couple of months have highlighted the need to modernize and enhance virtual collaboration. Estimates vary, but up to 40% of US jobs could be home-based. Post-pandemic, it seems likely that about 20–30% of workers (up from about 5% pre-pandemic) will continue to work from home.

This shift has created an opportunity for XR to deliver an advanced set of digital collaboration tools that incorporate facial expressions, body language, and tone of voice. Enhanced with the data to interpret muscle movements and visual cues for natural interaction, the HP Reverb G2 Omnicept Edition puts VR collaboration on a new trajectory.

In a collaboration scenario, where people are working together on a design, say a car model or something, if you can see that you and your colleagues are all looking at the same thing, it makes the experience that much more natural. We rely on body language and eye contact, in particular, to help us read situations, but reading someone’s face hasn’t really been possible in VR until now. Today we are enabling developers to take that first step toward emulating expression, potentially making VR the best way to collaborate outside of face-to-face.
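As an illustration of how that shared-focus signal could be derived (my own sketch, not HP’s or Tobii’s implementation), an application can cast each participant’s gaze ray against the scene and flag when two people land on the same object. The types and the simple bounding-sphere test below are assumptions; a real application would use its engine’s raycast:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Decide which object each participant is looking at by casting their
// gaze ray against bounding spheres, then flag shared focus.
struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Object { std::string name; Vec3 center; double radius; };

// Returns the name of the first object hit by the ray, or "" if none.
// dir must be a unit vector.
std::string gazeTarget(Vec3 origin, Vec3 dir, const std::vector<Object>& scene) {
    for (const auto& obj : scene) {
        Vec3 oc = sub(obj.center, origin);
        double t = dot(oc, dir);              // closest approach along the ray
        if (t < 0) continue;                  // object is behind the viewer
        double d2 = dot(oc, oc) - t * t;      // squared miss distance
        if (d2 <= obj.radius * obj.radius) return obj.name;
    }
    return "";
}

int main() {
    std::vector<Object> scene = {{"car-door", {0, 1, 2}, 0.5}};
    // Two participants' head positions and unit gaze directions.
    std::string a = gazeTarget({0, 1, 0}, {0, 0, 1}, scene);
    std::string b = gazeTarget({1, 1, 0}, {-0.447, 0, 0.894}, scene);
    if (!a.empty() && a == b)
        std::printf("shared focus: both looking at %s\n", a.c_str());
}
```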

Convenient development

Image courtesy of HP — assessing cognitive load

To get eye tracking and other sensors to communicate and generate unbiased raw data, it has not been unusual for engineers to implement workarounds by manually combining existing technologies. The HP Reverb G2 Omnicept Edition means they no longer need to do this: it makes it possible to get these insights from a single commercial device, without such workarounds.

Lauren and Henry, can you tell us about how easy it is to start developing applications that leverage HP Reverb G2 Omnicept Edition headset functionality?

Feedback from our early access developers shows that it takes about 20 minutes to set up and start getting data back from the platform. Of course, it is going to take a little longer for developers to create new applications. So, we have put together a knowledge base, blog posts, and a development portal to help developers understand what they can do with cognitive load, but we also want them to be creative. We are excited to see what developers are going to do with our new headset and the insights it offers.

Dynamic foveated rendering

Right now, we are benchmarking Tobii Spotlight Technology for Foveated Rendering on the HP Reverb G2 Omnicept Edition. The initial trials show remarkable results, with average savings on shading load approaching 66%¹. This level of savings presents developers with an opportunity to raise visual clarity and application performance where it is needed most.

This benchmark also underpins our assertion that the benefit of foveated rendering is closely coupled to rising screen resolution.
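As a back-of-the-envelope sketch of where savings of that magnitude can come from (the numbers below are illustrative assumptions, not data from the benchmark): if the full-rate foveal region covers roughly 12% of the screen and the periphery is shaded at a quarter rate, only about a third of the full shading work remains. And because the fovea covers a roughly fixed visual angle, that full-rate fraction stays small as panel resolution rises, which is why the benefit grows with resolution.

```cpp
#include <cstdio>

// Back-of-the-envelope model of foveated shading cost.
// foveaFraction: share of screen pixels shaded at full rate (illustrative).
// peripheryRate: relative shading rate outside the fovea (0.25 = quarter rate).
double shadingLoad(double foveaFraction, double peripheryRate) {
    return foveaFraction + (1.0 - foveaFraction) * peripheryRate;
}

int main() {
    double load = shadingLoad(0.12, 0.25);
    std::printf("relative shading load: %.0f%% (saving ~%.0f%%)\n",
                load * 100, (1 - load) * 100);
    // prints: relative shading load: 34% (saving ~66%)
}
```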

Lauren and Henry, what’s your take on our foveated rendering technology?

Well, PC-based VR is still the best place to get the most immersive VR experience, at full fidelity, with all the textures and the sort of polygon counts you’d want. We can now take that experience and make it even better with foveated rendering. And, of course, eye tracking and Tobii Spotlight Technology are the enablers here. They allow us to free up resources on the user’s GPU, and developers can then take those resources and apply them where they matter most. We’re going to have a plugin in our SDK to help developers use foveated rendering.

While this post is not a review of the HP Reverb G2 Omnicept Edition, I didn’t think it would be complete without a few facts. In terms of form, this headset shares the same core components as the HP Reverb G2, with the same ultrasharp display and industry-leading lenses by Valve. What sets it apart is its sensor fusion, which aggregates raw data from the integrated Tobii VR4 Eye Tracking Platform with the PPG and face camera sensors to deliver cognitive load analysis.

Final comments from Lauren and Henry:

Next-gen XR is all about adding more to the VR experience, above and beyond immersion. We are moving into two-way interaction and adaptive experiences, giving feedback to the enterprise and providing insight into what’s happening during the experience. And we are setting a precedent in how you do this with security and privacy in mind.

Find out more about Tobii eye tracking and the HP Reverb G2 Omnicept Edition.

Written by

  • Doug Eggert

    VP of XR, Tobii

    In my role, I get to work directly with headset manufacturers, helping them integrate eye tracking into their hardware. My focus is the introduction of eye tracking for effortless interaction and immersion in virtual and mixed reality, as well as enabling more capable devices with solutions such as foveated rendering and analytics. Personally, I am excited about the future of spatial computing, and I am passionate about working closely with our customers and our engineering team to drive the widespread adoption of eye tracking in XR.

In collaboration with

  • Lauren Domingo

    Field software engineer, HP

    Lauren is an HP field software engineer based in Fort Collins, Colorado, supporting XR software developers. You can find her at hp.io/xr or hp.io/omnicept — the HP XR and HP Omnicept developer portals that host the Omnicept SDK, forum, XR blog, documentation, and more.

  • Henry Wang

    Product manager in the XR group, HP

    Henry is a product manager in the XR group at HP, based in Fort Collins, Colorado. He can be found replaying levels of Half-Life: Alyx or discovering the latest innovative enterprise use cases of VR in his HP Reverb G2 headset.
