Sunday, June 11, 2017

How does eye tracking work?

Eye tracking could become a standard peripheral in VR/AR headsets. Tracking gaze direction can deliver many benefits. Foveated rendering, for instance, optimizes GPU resources by using eye tracking data. Higher-resolution images are shown at the central vision area and lower-resolution ones outside it. Understanding gaze direction can lead to more natural interaction. Additionally, people with certain disabilities can use their eyes instead of their hands. Eye tracking can detect concussions in athletes and can even help people see better. Eye tracking can also help advertisers understand what interests customers.

Eye tracking is complex. Scientists and vendors have spent many years perfecting algorithms and techniques.

But how does it work? Let's look at a high-level overview.

Most eye tracking systems use a camera pointing at the eye and infrared (IR) light. IR illuminates the eye and a camera sensitive to IR analyzes the reflections. The wavelength of the light is often 850 nanometers. It is just outside the visible spectrum of 390 to 700 nanometers. The eye can't detect the illumination but the camera can.

We see the world when our retinas detect light entering through the pupil. IR light also enters the eye through the pupil. Outside the pupil area, light does not enter the eye. Instead, it reflects back towards the camera. Thus, the camera sees the pupil as a dark area - no reflection - whereas the rest of the eye is brighter. This is "dark pupil" eye tracking. If the IR light source is near the optical axis, it can reflect from the back of the eye. In this case, the pupil appears bright. This is "bright pupil" eye tracking. It is like the "red eye" effect in flash photography. Whether we use a dark or bright pupil, the key point is that the pupil looks different from the rest of the eye.
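
In a dark-pupil image, a rough first step can be sketched as thresholding the frame and taking the centroid of the dark region. The toy example below uses a synthetic grayscale grid and an assumed threshold of 50; real trackers add ellipse fitting, filtering and outlier rejection.

```python
# Toy dark-pupil detection: threshold a grayscale IR frame and take the
# centroid of the dark (pupil) pixels. Illustrative sketch only; real
# trackers use robust ellipse fitting and temporal filtering.

def find_pupil_center(frame, threshold=50):
    """frame: 2D list of brightness values (0-255).
    Returns the (row, col) centroid of pixels darker than threshold."""
    dark = [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value < threshold]
    if not dark:
        return None  # no pupil found in this frame
    row_mean = sum(r for r, _ in dark) / len(dark)
    col_mean = sum(c for _, c in dark) / len(dark)
    return (row_mean, col_mean)

# Synthetic 5x5 frame: bright iris/sclera (200) with a dark pupil (10)
frame = [
    [200, 200, 200, 200, 200],
    [200, 200,  10, 200, 200],
    [200,  10,  10,  10, 200],
    [200, 200,  10, 200, 200],
    [200, 200, 200, 200, 200],
]
print(find_pupil_center(frame))  # (2.0, 2.0): center of the dark region
```

Bright-pupil tracking works the same way with the comparison inverted: the pupil pixels are the bright ones.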

The image captured by the camera is then processed to determine the location of the pupil. This allows estimating the direction of gaze of the observed eye. Processing is sometimes done on a PC, phone or other connected processor. Other vendors have developed special-purpose chips that offload the processing from the main CPU. If the eye tracking cameras observe both eyes, one can combine the gaze readings from both. This allows estimating the fixation point of the user in real or virtual 3D space.
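
One way to combine the two eyes' readings is triangulation: given each eye's position and gaze direction, find where the two gaze rays meet. A minimal 2D sketch, with assumed eye positions on the x axis and a typical 6.4 cm interpupillary distance:

```python
import math

# Toy 2D triangulation of a fixation point from two gaze rays.
# Eyes sit on the x axis; gaze angles are measured from straight ahead
# (the +y axis), positive toward +x. Each ray: x = eye_x + y * tan(angle).

def fixation_point(left_eye_x, left_angle, right_eye_x, right_angle):
    tl = math.tan(left_angle)
    tr = math.tan(right_angle)
    if tl == tr:
        return None  # parallel rays: gaze at infinity
    y = (right_eye_x - left_eye_x) / (tl - tr)
    x = left_eye_x + y * tl
    return (x, y)

# Both eyes converge on a point 50 cm straight ahead of the midpoint
ipd = 6.4    # cm
depth = 50.0 # cm
left_angle = math.atan2(ipd / 2, depth)    # left eye rotates inward
right_angle = math.atan2(-ipd / 2, depth)  # right eye rotates inward
x, y = fixation_point(-ipd / 2, left_angle, ipd / 2, right_angle)
print(round(x, 3), round(y, 3))  # fixation near (0, 50)
```

In 3D, the two rays rarely intersect exactly, so real systems take the midpoint of the shortest segment between them instead.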

There are other eye tracking approaches that are less popular. For instance, some have tried to detect movements of the eye muscles. This method provides high-speed data but is less accurate than camera-based tracking.

How often should we calculate the gaze direction? The eyes have several types of movements. Saccadic movements are fast and happen when we need to shift gaze from one area to another. Vergence shifts are small movements that help in depth perception. They aim to get the image of an object to appear on corresponding spots on both retinas. Smooth pursuit is how we move when we track a moving object. To track saccadic movements, one needs to sample the eye hundreds of times per second. But saccadic movements do not provide a stable gaze direction. Thus, they are interesting for research applications but not for mass-market eye tracking. Vergence and smooth pursuit movements are slower. Tens of samples per second are often enough. Since many VR applications want the freshest data, there is a trend to track the eyes at the VR frame rate.
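
A common way to separate fast saccades from slower movements in a gaze trace is a simple velocity threshold (known in the literature as I-VT). A simplified sketch with made-up sample values and an assumed 100 deg/s threshold:

```python
# Simplified velocity-threshold (I-VT style) classification: intervals
# where angular velocity exceeds a threshold are labeled saccades, the
# rest as slow movement (fixation, vergence or pursuit). Sample data
# and threshold are illustrative only.

def classify(angles_deg, sample_rate_hz, threshold_deg_per_s=100.0):
    """angles_deg: 1D gaze angles sampled at sample_rate_hz.
    Returns one label per inter-sample interval."""
    labels = []
    for a, b in zip(angles_deg, angles_deg[1:]):
        velocity = abs(b - a) * sample_rate_hz  # degrees per second
        labels.append("saccade" if velocity > threshold_deg_per_s else "slow")
    return labels

# A 500 Hz trace: steady fixation, a rapid 10-degree jump, steady again
trace = [0.0, 0.01, 0.02, 5.0, 10.0, 10.01, 10.02]
print(classify(trace, 500))
```

Note how the sample rate matters: at tens of hertz, the entire 10-degree jump could fall between two samples and the saccade itself would be invisible.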

Eye tracking systems need to compensate for movements of the camera relative to the eye. For instance, a head-mounted display can slide and shift relative to the eyes. One popular technique is to use reflections of the light source from the cornea. These reflections are called Purkinje reflections. They change little during eye rotation and can serve as an anchor for the algorithm. Other algorithms try to identify the corners of the eye as an anchor point.
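
A common way to use the corneal reflection is to compute the vector from the glint to the pupil center. Because both features shift together when the camera slips, their difference is more stable than the raw pupil position. A minimal sketch with made-up coordinates:

```python
# Pupil-minus-glint feature: the corneal (Purkinje) reflection acts as
# an anchor. If the headset slips, pupil and glint shift by roughly the
# same amount in the image, so their difference changes little.

def gaze_feature(pupil, glint):
    """pupil, glint: (x, y) image coordinates. Returns their difference."""
    return (pupil[0] - glint[0], pupil[1] - glint[1])

before_slip = gaze_feature(pupil=(120, 80), glint=(100, 75))
# Headset slips 10 px right and 4 px down: both features move together
after_slip = gaze_feature(pupil=(130, 84), glint=(110, 79))
print(before_slip, after_slip)  # (20, 5) (20, 5): feature is unchanged
```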

There are other variables that an algorithm needs to compensate for. The eye is not a perfect sphere. Some people have bulging eyes and others have inset eyes. The location of the eye relative to the camera is not constant between users. These and other variables are often addressed during a calibration procedure. Simple calibration presents a cross on the screen at a known location and asks the user to fixate on it. By repeating this for a few locations, the algorithm calibrates the tracker to a user.
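
The calibration step amounts to fitting a mapping from pupil measurements to screen positions. A toy 1D least-squares fit is sketched below; real systems typically fit 2D polynomial or model-based mappings from several calibration targets, but the idea is the same.

```python
# Toy 1D calibration: fit screen_x = a * pupil_x + b by least squares
# from a few fixations on targets at known screen positions.
# Numbers are illustrative.

def fit_linear(pupil_xs, screen_xs):
    n = len(pupil_xs)
    mean_p = sum(pupil_xs) / n
    mean_s = sum(screen_xs) / n
    cov = sum((p - mean_p) * (s - mean_s)
              for p, s in zip(pupil_xs, screen_xs))
    var = sum((p - mean_p) ** 2 for p in pupil_xs)
    a = cov / var
    b = mean_s - a * mean_p
    return a, b

# The user fixates crosses at known positions; the tracker records pupil x
pupil_xs = [10.0, 20.0, 30.0]      # pupil feature (pixels)
screen_xs = [0.0, 500.0, 1000.0]   # target position (pixels)
a, b = fit_linear(pupil_xs, screen_xs)
print(a, b)          # 50.0 -500.0
print(a * 25.0 + b)  # maps a new pupil reading to screen x: 750.0
```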

Beyond the algorithm, the optical system of the tracker presents extra challenges. It aims to be lightweight. It tries to avoid imposing constraints on the optics used to present the actual VR/AR image to the user. It needs to work with a wide range of facial structures. For a discussion of optical configurations for eye tracking, please see here.

Eye trackers used to be expensive. This was not the result of expensive components, but rather of a limited market. When only researchers bought eye trackers, companies charged more to cover their R&D expenses. As eye trackers move into the mainstream, they will become inexpensive.

Monday, May 22, 2017

Understanding Relative Illumination

Relative illumination, in the context of optical design, is the phenomenon of image illumination rolling off (i.e. decreasing) towards the edge of an eyepiece. It manifests as an image that is brighter at the center of the eyepiece than at the edge.

Relative illumination is usually shown in a graph such as the one below

This particular graph is from an eyepiece with a 60-degree horizontal field of view designed by Sensics. The graph shows how the illumination changes from the center of the lens (0 degrees) to the edge of the lens (30 degrees). The Y axis shows the relative illumination, where the center illumination is defined as "1". In this particular eyepiece, the illumination at the edge is just over 70% of the illumination at the center.
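
As a point of comparison, a classic first-order model for this falloff is the "cosine-fourth" law, under which illumination drops as cos⁴ of the field angle. A real design such as the eyepiece described above (about 70% at 30 degrees) can do better or worse than this model:

```python
import math

# Cosine-fourth falloff: a classic first-order model of relative
# illumination versus field angle. Real eyepieces deviate from it.

def cos4_falloff(angle_deg):
    return math.cos(math.radians(angle_deg)) ** 4

for angle in (0, 10, 20, 30):
    print(angle, round(cos4_falloff(angle), 3))
```

At 30 degrees the model predicts about 56%, so an eyepiece holding just over 70% at the edge is doing noticeably better than the simple cos⁴ prediction.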

This effect can also be viewed in simulations. The first image below shows a simulated image through this eyepiece when ignoring the impact of relative illumination:

Simulated image while ignoring the effect of relative illumination

The second image shows the impact of relative illumination, which can be seen at the edges:

Simulated image with relative illumination

Relative illumination is perfectly normal and to be expected. It exists in practically every eyepiece and every sensor. It is often the result of vignetting: some light rays traveling from the display through the eyepiece to the eye are blocked by a mechanical feature of the eyepiece. This can be an internal mechanical structure or simply the edge of a particular lens. Light rays from the edge of the display are easier to block and thus typically suffer more vignetting.

When we look at an optical design, we check that the relative illumination graph is monotonic, i.e. always decreasing. A non-monotonic curve (e.g. a sudden increase followed by a decrease) would manifest itself as a bright ring in the image, which is usually not desired.
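
A quick way to screen a relative-illumination curve for such rings is to check that the sampled values never increase as the field angle grows. An illustrative check, with made-up sample values:

```python
# Check that a relative-illumination curve is monotonically
# non-increasing from center to edge. A rise mid-curve would show up
# as a bright ring in the image. Sample values are illustrative only.

def is_monotonic_decreasing(values):
    return all(b <= a for a, b in zip(values, values[1:]))

# Relative illumination sampled at 0, 5, ..., 30 degrees
good_curve = [1.00, 0.98, 0.94, 0.89, 0.83, 0.77, 0.71]
ring_curve = [1.00, 0.95, 0.88, 0.92, 0.84, 0.76, 0.70]  # bump at 15 deg

print(is_monotonic_decreasing(good_curve))  # True
print(is_monotonic_decreasing(ring_curve))  # False: would show a ring
```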

Tuesday, May 2, 2017

A Visit to the IMAX VR Center

A few weeks ago, I was in Los Angeles and decided to visit the newly-opened IMAX VR center. I went there as a regular paying customer - no "behind the scenes" tour - to see and learn. I've experienced Zero Latency, The Void and many others, so I could not resist trying IMAX.

The lobby of the attraction is reminiscent of a small movie theater lobby. Vertically-oriented monitors on the walls announce the available VR experiences. A small reception area sells $10 tickets for the attractions. A display shows the available time slots for each 10-minute experience. After purchasing the tickets, a friend and I were asked to wait for our scheduled time. When the time came, an attendant escorted us to the VR area.

If I remember correctly, there were eight available experiences. Seven of them were based on the HTC VIVE. One - the John Wick Chronicles - was shown on a Starbreeze headset. The HTC VIVE experiences did not appear to be specially made for this venue. For instance, one experience was Trials on Tatooine, which can be freely downloaded from the Steam store. I think people come to movie theaters for an experience that they can't get at home. One would expect VR to be the same.

I have an HTC Vive (as well as many other headsets) at home. Using them is part of my job. However, most folks don't have easy access to PC-based VR equipment. For now, stock experiences might be just fine to get people exposed to VR.

Inside the VR area, each headset was in a space separated by low walls, a bit like an open-plan office. Headset cables were suspended from the ceiling. The HTC VIVE units had a leather face mask, which is probably easier to clean. An operator administered each experience - one operator per headset. Operators were friendly and enthusiastic about the VR equipment. I think their enthusiasm was contagious, which was nice.

Speaking of contagious, the operators told me that they wipe the face masks between users. Masks also get replaced every couple of weeks. I was told that visitors did not often complain about wearing a VR goggle that was used by many people before them.

I couldn't help but wonder about the economics. 15-minute timeslots: 10 minutes of usage plus some time to get people in and out of the experience. $40 an hour per station. One full-time operator per station. Now add rent, equipment, content fees, ticket sales, credit card fees, etc. Can you make money? Maybe making money is not the goal in this first location. Instead, the goal could be a "concept store" working towards inclusion in the lobbies of regular movie theaters.

Since I don't have a Starbreeze headset at home, I opted for the John Wick experience. It's a shooter game that encourages you to move in a space while holding a weapon. As expected, virtual soldiers try to kill you. The headset was fairly light and the weapon comfortable. The experience was immersive, though both the image and graphics quality could have been better. I can see why a person with little VR experience could enjoy these 10 minutes.

My friend had not had many VR experiences before this visit. He chose "Trials on Tatooine", which he seemed to thoroughly enjoy.

In all - a nice start to what could be the next big thing in entertainment.

Have you tried IMAX VR too? What did you think?

Sunday, April 9, 2017

The Unique Requirements of Public VR

A good treadmill for home use costs around $1000. A treadmill for use in a health club could be ten times more expensive. Why would club owners agree to pay so much more? Because they understand that the home equipment would not withstand the heavy use in a club.

The same is true for VR goggles. Goggles for home use are not suitable for sustained use in arcades and amusement parks. Both VR vendors and attraction operators need to understand and address these differences.

Durability is one key issue. A goggle in an amusement park is subject to both accidental and intentional abuse. A kid might try to see if a lens can pop out. Someone else might wish to take a component as a souvenir.

Hygiene is also important. VR experiences can be intense, and parks can be in hot and humid areas. Whether it is sweat, suntan lotion or something else, a guest does not wish to wear a soaked or dirty goggle. This is true not only for the face mask, but for any built-in headphones.

Sweat and humidity might also cause fogging on the lenses. If a guest has to take off the goggles to defog them, the break in VR immersion degrades the experience.

Depending on the attraction, visitors can be of all ages. A good physical fit is important, regardless of whether the visitor is a small kid or a large man. Goggles must accommodate a wide range of head sizes and eye separations. Eyeglasses also present a design challenge. Visitors prefer to keep them on, so goggle vendors try to make room for them.

Beyond the customer-facing experience, there are important operational considerations. Attractions make money by providing a unique experience to a large number of visitors. Every minute a guest spends adjusting straps on a roller coaster is a minute lost. Some have reported a 30% decrease in guest throughput after upgrading a roller coaster to VR. A larger park crew assisting guests or maintaining headsets means larger operational costs.

There are several different types of public VR experiences, each with unique challenges. A free-roam experience (e.g. Zero Latency) needs to address backpack PCs and controllers. Themed experiences such as The Void have accessories that support the story. These accessories have many of the same challenges. A small attraction in a shopping mall cannot afford a large operating crew. A VR roller coaster might need a chin strap to keep the goggles on the head. If a VR roller coaster relies on standard phones, they might overheat or need frequent recharging.

Often, the first instinct of those building VR attractions is to do everything themselves. They might try to build their own goggles, create their own content, or even develop their own tracking system. Over time, they focus on their core competencies and bring in external vendors for everything else.

The first generation of these solutions shows the immense promise of public VR. VR Coaster, for instance, has deployed GearVR-based roller coaster experiences in over 20 parks. These use a special tracking system to determine the position of each car at any time. The Void and Zero Latency use backpack PCs to allow guests to explore unique spaces. Talon Simulations provides flight and driving simulations in malls. IMAX opened a VR center where guests can try a variety of 10-minute VR experiences. Most of these first-generation solutions use consumer-grade hardware. Operators realize that many problems still need solutions. At the same time, consumers learn what they like and dislike in current solutions.

HMD vendors are also rising to the challenge. ImmersiON-VRelia has developed phone-based goggles that feature a reinforced plastic construction. Sensics released goggles with a detachable face mask to address both hygiene and operational efficiency.

What's missing? Low-latency wireless video solutions to get rid of cables. Faster ways to adjust goggles for each guest. Better methods to clean goggles between guests. Phones that don't overheat. Multi-user experiences with a stronger social aspect. The passage of time to see what works and what doesn't. VR standards to help integrate new devices into compelling experiences.

I am excited about what public VR experiences could become. My excitement comes both from a user's standpoint - these experiences are fun! - and from a problem-solver's view - the problems are challenging.

Try these experiences next time you can. Going to a movie theater provides a different experience than watching at home. Going through a good public VR experience is beyond what VR at home provides.

Monday, April 3, 2017

Suffering, Art and VR Standards

Think about a great work of art: a classic book, a timeless painting, a symphonic masterpiece. What's common to many of these creations?

They were all the result of great suffering.

Tolstoy, Van Gogh, Mozart - they did not have easy lives. Many of the greats suffered from oppression, mental or physical illness, or hunger.

If you don't have drama in your life, how could you summon drama for your art?

People ask me "what made you want to work on VR standards?" My answer: it's the suffering.

No, not my personal suffering. I'm no Amadeus, and I have never considered cutting off my earlobe to express love.

But in many years of working with customers on their VR systems, I saw a lot of technical suffering:

The suffering of integrators that need to chase the latest API again and again, and that don't know if the equipment they design for today will be available to buy in a year.

The suffering of device manufacturers that need just one more driver to support them.

The suffering of end-users that wonder if today's software will work on tomorrow's devices.

That's why we need efforts like OSVR or OpenXR to make it easy for everyone to work together. They won't be as timeless or profound as "War and Peace", but they will help a lot of people.

Saturday, April 1, 2017

Unholy Alliance

A few weeks ago, we were approached by a VR porn site seeking a partnership.

It is no secret that adult entertainment is an early adopter of many new technologies and virtual reality is no exception.

We respectfully declined as we decided a long time ago that we won't participate in this market.

Besides, I'm no longer as good looking as I used to be.