Virtual reality headsets are poised to be the next big technology that changes the way the world works. But they are still early in their development, and new features are being added regularly.
One of these features is eye tracking: the ability to follow exactly where the user is looking at any moment. This has many benefits, such as rendering more detail near the center of the visual field and less in the periphery, where the reduction is far less noticeable.
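That center-versus-periphery idea (often called foveated rendering) can be sketched in a few lines. The function below is a hypothetical illustration, not any vendor's actual implementation: it maps a pixel's distance from the tracked gaze point to a level-of-detail factor, with full detail inside an inner radius and reduced detail beyond an outer one. The radii and falloff are invented for the example.

```python
import math

def shading_rate(pixel, gaze, inner_radius=0.1, outer_radius=0.3):
    """Return a level-of-detail factor (1.0 = full detail) for a pixel,
    based on its distance from the tracked gaze point.
    Coordinates are normalized to [0, 1] screen space.
    The radii and falloff curve here are illustrative assumptions."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= inner_radius:
        return 1.0           # foveal region: render at full resolution
    if dist >= outer_radius:
        return 0.25          # far periphery: heavily reduced detail
    # Linear falloff between the two radii.
    t = (dist - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t

# A pixel right under the gaze gets full detail; a corner pixel gets less.
print(shading_rate((0.5, 0.5), gaze=(0.5, 0.5)))    # 1.0
print(shading_rate((0.05, 0.05), gaze=(0.5, 0.5)))  # 0.25
```

A real renderer would apply something like this per tile rather than per pixel, and would use a perceptually tuned falloff, but the principle is the same: spend GPU effort where the eye can actually see it.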
But a less obvious and more impressive use of eye tracking comes from Eyefluence, whose software can actually tell when users are getting bored.
By monitoring eye movements and following the user’s gaze during interactive VR moments, a user’s level of engagement can be measured. This helps content creators learn which parts of their software hold users’ interest, and which parts are falling flat.
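One simple way to turn a gaze stream into an engagement signal is to label what the user is looking at each frame and then measure how much time lands on the content the creator intended to be the focus. The sketch below is a rough illustration under that assumption; the region labels, the 60 Hz sample rate, and the scoring are all hypothetical, not Eyefluence's actual method.

```python
from collections import defaultdict

SAMPLE_DT = 1 / 60  # assumed gaze sample interval (60 Hz), in seconds

def dwell_times(samples):
    """Total time spent looking at each labeled region of the scene.
    `samples` is one region label per gaze sample."""
    totals = defaultdict(float)
    for region in samples:
        totals[region] += SAMPLE_DT
    return dict(totals)

def engagement(samples, target_region):
    """Fraction of gaze samples that landed on the scene's intended
    focal region -- a crude engagement score in [0, 1]."""
    if not samples:
        return 0.0
    hits = sum(1 for region in samples if region == target_region)
    return hits / len(samples)

# Hypothetical gaze log: which labeled scene region each sample hit.
log = ["door", "door", "window", "door", "floor", "door"]
print(dwell_times(log))
print(round(engagement(log, "door"), 2))  # 0.67
```

A per-scene score like this would let a developer see, for example, that viewers stopped watching the key event halfway through a moment and started looking elsewhere.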
Developing VR software is a challenge in many ways, as it is so vastly different from the forms of media most people are familiar with. It can be compared to the early years of film, when people were first learning how to tell stories in a visual medium. Some elements of radio broadcasting carried over, as did some from stage theater.
This can be seen in the way many early movies used only one or two stationary cameras and were acted on sets much like those of a play. In today’s movies and television, by contrast, the camera moves dynamically and the methods of storytelling have evolved.
But in VR, the “director” loses control over the exact framing of a scene: the viewer is free to look in any direction they choose, which may or may not be toward the key events. Knowing where users actually look can shed light on what catches people’s eyes, allowing developers to better hold their attention when and where they want it.