Spotlight on: Tobii Dynamic Foveated Rendering (DFR). Will it work in all games?

Can Pimax/Tobii grab this bull by the horns and make it work for all games? If so, how?

Will it work well enough that we won’t notice the DFR visually?

What performance increase do you expect?


Some thoughts I had include:

Historically, Pimax has struggled to get games to support its niche features. Case in point: MSFS 2020. Wide-FOV support (i.e., running with “Parallel Projections” off) isn’t possible, despite Pimax reaching out on multiple occasions.

Or, for another example, FFR in PiTool fails to work in most games. In each case the promise was increased performance, but the lack of game support made full use of the feature spotty.

Open-source (non-Pimax) efforts to make eye-tracked DFR work rely on games exposing certain information, such as which eye is being rendered, combining that info with the eye-tracking data, then mapping the VRS (Nvidia’s Variable Rate Shading, part of VRWorks) profile into the renderer prior to rendering. Tobii also suggests that their method will blur the outer edges to reduce the noticeability of the effect.
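As a rough illustration of that last mapping step (my own sketch, not Pimax’s, Tobii’s, or any injector’s actual code), building a VRS-style shading-rate image from a gaze point boils down to filling a tile grid with coarser rates the farther a tile sits from the gaze. The tile size, radii, and rate values below are made-up placeholders:

```python
# Illustrative sketch only: builds a VRS-style shading-rate grid from a gaze
# point. Tile size, radii, and rate values are invented for the example.
import math

# D3D12-style coarse shading rates: 1 = full rate, 2 = 2x2 blocks, 4 = 4x4 blocks
FULL, HALF, QUARTER = 1, 2, 4

def shading_rate_grid(width, height, gaze_x, gaze_y, tile=16,
                      inner_radius=0.15, outer_radius=0.35):
    """Return a 2D grid of shading rates centred on the gaze point.

    gaze_x/gaze_y are normalised [0, 1] coordinates; the radii are
    fractions of the image diagonal (arbitrary choices for this sketch).
    """
    diag = math.hypot(width, height)
    cols = (width + tile - 1) // tile
    rows = (height + tile - 1) // tile
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Tile centre in pixels, then distance to the gaze point
            cx, cy = (c + 0.5) * tile, (r + 0.5) * tile
            d = math.hypot(cx - gaze_x * width, cy - gaze_y * height) / diag
            if d < inner_radius:
                row.append(FULL)       # fovea: shade every pixel
            elif d < outer_radius:
                row.append(HALF)       # transition ring: 2x2 coarse shading
            else:
                row.append(QUARTER)    # periphery: 4x4 coarse shading
        grid.append(row)
    return grid
```

In a real injector, a grid like this would be uploaded as a shading-rate image for the correct eye’s render target before the game draws, which is exactly why knowing “which eye” matters.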

I believe so. I have been using FFR on the 8KX pretty much exclusively, and while I was initially skeptical, I was surprised at how well it works, both in terms of performance increase and lack of harm to visual quality and immersion. And that’s with it centered on just a fixed position. With eye tracking to dynamically update it, I believe it will be difficult to tell in most cases, just by visual quality, whether the feature is even enabled. And the performance improvement will be greater still, because the foveation can be much more aggressive.

On the 8KX with FFR, I measured a performance increase in a heavy VRChat world from 46 fps (no FFR) to 73 fps (with FFR), roughly a 59% gain. And that improvement is typical of my other tests as well.

My testing also showed that the performance effect of foveated rendering increases substantially the higher in resolution and FOV you go. It’s very basic and non-aggressive in my main test scenario with the 8KX, and yet is already producing substantial gains. In the context of the 12K, my numbers suggest it can be expected to have quite a large effect, possibly increasing FPS by a factor of 2-3X.

No, they can’t. These companies simply don’t have the clout in the industry to cause mass adoption of this technology to become a high priority for game developers.

But Sony does.

The PSVR2 will be reliant on DFR for the same reasons as the 12K. Because of this, regardless of whether or not you are personally interested in the PSVR2, you should be very interested in its release because of the secondary effects it’s going to have on game development across the VR industry.

I think Pimax intends to ride this wave. Since developers will already be supporting DFR for the PSVR2, it will be much easier for them to also support Pimax’s DFR while they’re at it. And, in fact, it may become unnecessary for developers to specifically support Pimax DFR as it eventually turns into a general feature that’s generically supported by game engines.

FFR already works in most of the games that I play now; Pimax already has significant penetration for the feature. However, there are also a lot of games that don’t support it. I think that gets a little overrepresented, because there are some very high-profile games for flight simmers (such as MSFS) which don’t even support canted displays without parallel projections, much less foveated rendering.


From what Pimax said, at least a while back in earlier phases of testing, it was an almost 40% average performance gain across 14 (?) games, using RTX 3080s as test machines. I 100% believe those numbers because of my experience with FFR on the 8KX. That would presumably be using the current jury-rigged implementation via things like VRS that PiTool (and the vrperfkit) uses right now, as I recall something like Skyrim being mentioned, and that’s definitely not a game built with gaze tracking in mind.

A more optimized implementation, with things like UE5 having plugins specifically to support gaze-based rendering, could gain even more. And the industry is definitely moving in that direction.

Hell, with conservative use of it via the vrperfkit, you can barely notice it even in recordings unless you are seriously looking for it, while still getting a good 15–20% boost. I’m sure they’ll be able to make DFR similarly unnoticeable; it wasn’t very noticeable to me when I briefly tried eye tracking with it. The super-large FOV is actually a serious benefit for DFR on something like that, because the foveated rendering has room to be more granular and transition more smoothly between the rings, while also lowering quality even further in the larger peripheral areas. Compare that with small-FOV HMDs, where you can barely use your eyes to look around at all.

I do hope Pimax will give us more control over the foveated rendering intensity, like the vrperfkit does, because that granularity would be a real help to content creators, so it doesn’t obliterate recording quality.


Tbh, it will depend on what is required for it to work. For example, Pimax’s FFR relies, IIRC, on Nvidia methods: if I’m not mistaken, games need to support VRS (Variable Rate Shading). Now that both Intel and AMD are implementing a form/version of VRS, that may help. But I believe these still require in-game support.

Hence why FFR and DFR are hit and miss depending on title.

That being said, with the 12K using a better-established company like Tobii, and with Qualcomm as a partner, there is a much higher chance of drastically better results in performance and support, even if a game does not have native support.

We just need to wait and see how well these fare once released.


This is in part due to relying on Nvidia RTX features in games that also need in-game support for best results. Some game engines just don’t play well with others.

It also depends on where the game is bottlenecked: CPU-bound, GPU-bound, balanced, etc.


Well, on that front, hopefully Pimax can catch up on OpenXR support, where they’re lagging behind at present.


This is not really possible without support included in each game.

What we do in OpenXR Toolkit, and what fholger does in vrperfkit, is inspect what the game is rendering, try to guess whether it’s rendering the VR image that you will see on your headset, and then guess whether that is the left eye or the right eye.

But this process is incredibly unreliable. I said “try” because, indeed, you know how many games do not work well, or do not work at all, with these techniques.

These technologies are not meant to be added externally to applications. They rely on applications doing some work.
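A toy version of that guessing game (my own paraphrase, not the actual OpenXR Toolkit or vrperfkit code): compare each render target the game binds against the headset’s known per-eye resolution, and fall back to submission order to label left versus right. The tolerance and the “first match is the left eye” rule are illustrative assumptions, and they capture exactly why the process is fragile:

```python
# Toy sketch of the "which render target is a VR eye?" heuristic.
# Real injectors inspect live graphics-API state; the 10% size tolerance
# and the bind-order rule here are illustrative assumptions only.

def classify_targets(targets, eye_width, eye_height, tolerance=0.1):
    """targets: list of (width, height) render targets in bind order.

    Returns one label per target: 'left', 'right', or 'other'. The first
    size match is assumed to be the left eye and the second the right,
    which is precisely the kind of guess that breaks in many games.
    """
    labels, eyes_seen = [], 0
    for w, h in targets:
        w_ok = abs(w - eye_width) <= eye_width * tolerance
        h_ok = abs(h - eye_height) <= eye_height * tolerance
        if w_ok and h_ok and eyes_seen < 2:
            labels.append('left' if eyes_seen == 0 else 'right')
            eyes_seen += 1
        else:
            labels.append('other')  # shadow map, UI pass, mirror window, ...
    return labels
```

A game that renders a shadow map at eye resolution, or both eyes into one wide target, defeats this immediately, which is why per-game support in the engine itself is the robust path.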


Hey @mbucchia, how’s your coding on that sweet DFR functionality for OpenXR coming along?

Also, do you know if this is something holger is looking at for OpenVR?

Love and appreciate your work massively! :heart:


It’s not only Sony: Meta’s Cambria will feature eye tracking too, and I assume they will definitely utilize it for their stand-alone apps & games, as it will allow for substantially higher graphical fidelity compared to Quest 2 apps & games, although it will likely share the same Snapdragon XR2 chip.
However, this may also push VR devs to consider enabling DFR for their app/game, especially where they develop a Cambria stand-alone version too, so they are familiar with how it generally functions. And, over time, it may increase the percentage of eye-tracking-enabled headsets used for PC VR, if it is comparably successful to the Quest 2 (although the expected, substantially higher price point will probably slow adoption down).

And for any other future VR headset that is not aiming solely at the entry level, eye tracking seems to be almost a lock.

But this may still mean we have to wait 2–3 years to see it spread across apps & games, depending on how much work it takes and how early you need to embed it in your architecture/render-pipeline design.


The interesting thing is that the Oculus SDK has several foveated rendering techniques and has eye tracking as well. So, in theory, some of the games already using fixed foveated rendering could get an update to add DFR.


We’ve got something that gets eye tracking from the Droolon sensor and moves the foveated region with it, but it does not align well with the actual eye gaze. The aSeeVR SDK does not have adequate documentation to really understand what the values we are getting from the sensor mean and how we should use them.

Pimax should really talk to these 7invensun folks, because this SDK is extremely low quality. Even their own sample code does not work properly. I think we’re going to be able to do something, but probably not for the day-1 release. Between not having a device and not having a workable SDK, at this point this is blind experimenting until we magically guess the correct math :frowning:
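To make the “blind experimenting” concrete, here is the kind of guess one ends up coding (entirely hypothetical, since the SDK is undocumented): *if* the sensor turns out to report normalised [0, 1] per-eye gaze coordinates with a top-left origin, combining the two eyes into one centred gaze point might look like this. Every convention below (the range, the origin, the y-flip, treating all-zero data as an invalid eye) is an assumption, not documented aSeeVR behavior:

```python
# Pure guesswork, as the post says: IF the sensor reports normalised
# [0, 1] per-eye gaze coordinates (origin top-left), combining them into a
# single centred gaze point might look like this. All conventions invented.

def combine_gaze(left, right):
    """left/right: (x, y) tuples in [0, 1], or None when that eye only
    returns zeros (the common failure mode described above).

    Returns (x, y) in centred [-1, 1] coordinates with +y up, or None
    if neither eye produced usable data.
    """
    valid = [g for g in (left, right) if g is not None]
    if not valid:
        return None
    x = sum(g[0] for g in valid) / len(valid)
    y = sum(g[1] for g in valid) / len(valid)
    # Re-centre: [0, 1] top-left origin -> [-1, 1] centre origin,
    # flipping y so +y is up (one of several plausible conventions).
    return (2.0 * x - 1.0, 1.0 - 2.0 * y)
```

If the real values turn out to use a different range, origin, or a 3D gaze vector instead, every line of this changes, which is exactly why undocumented coordinate systems are so painful.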


I think he has officially announced that he is not:


I’m not actually using Pimax HMDs anymore and have switched to a Varjo VR-3.

Any news on that front in terms of DFR implementation?

Shame about holger but it is what it is.

Shame ?! :joy: Send him 5000 EUR, I’m sure he’ll take care of it. :joy: :rofl:

All the code for Varjo and Omnicept support is ready, we’re now polishing other stuff for the upcoming release.

1 Like

Yes, shame, as in “that’s a shame”, which as an idiom would translate into the equivalent of “ach, schade”.

Don’t be yet another literal G-English dimwit.

The extra difficulty here is that (especially for fholger) the vendors aren’t using a standard way to expose eye tracking. So you have to implement support for each headset separately.

Varjo and HTC Vive support it through OpenXR, which in our case helped (because it is very well documented), but without a device it was very difficult to finish up the math (the same problem I have with Pimax so far). In the end, I resorted to coding the math on HoloLens 2 (!), which also supports OpenXR, then backported the code, and it worked on Varjo!
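The math in question is essentially projecting a gaze direction into each eye’s image. A minimal sketch of why canted displays (mentioned below) make this harder: the gaze vector lives in head space, but the foveation centre is needed in each eye’s rotated view space, so you rotate by the cant angle before the perspective projection. The 60° half-FOV and the symmetric frustum here are placeholder assumptions, not any specific headset’s calibration:

```python
# Sketch of gaze-to-screen math for a canted display. The cant angle,
# half-FOV, and symmetric frustum are placeholder assumptions.
import math

def gaze_to_eye_uv(gaze_dir, cant_deg, half_fov_deg=60.0):
    """gaze_dir: (x, y, z) unit vector in head space, -z forward.

    Rotates the gaze about the vertical axis by the eye's cant angle,
    then perspective-projects onto that eye's image plane. Returns
    (u, v) in [0, 1], or None if the gaze falls outside the frustum.
    """
    a = math.radians(cant_deg)
    x, y, z = gaze_dir
    # Rotate about +y by the cant angle to move into eye view space
    ex = x * math.cos(a) + z * math.sin(a)
    ez = -x * math.sin(a) + z * math.cos(a)
    if ez >= 0:           # gaze points behind the eye's image plane
        return None
    t = math.tan(math.radians(half_fov_deg))
    u = (ex / -ez) / t    # perspective divide, scaled by tan(half-FOV)
    v = (y / -ez) / t
    if abs(u) > 1 or abs(v) > 1:
        return None       # outside this eye's view
    return (0.5 * (u + 1.0), 0.5 * (v + 1.0))
```

With a cant of 0° a straight-ahead gaze lands dead centre; with Pimax-style outward-canted panels the same gaze lands off-centre in each eye, and getting the sign and axis of that rotation wrong is exactly the sort of thing that is nearly impossible to debug without the physical device.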

The G2 Omnicept has its own SDK, and I have to say they have the best developer experience to date because of one simple thing: they have a software simulator. So you can implement support without a headset and try it with the simulator. I got this done in just a few hours, and once I got my G2 Omnicept, it worked right off the bat.

Pimax has their own SDK too (the 7invensun aSeeVR one), and out of them all, it is the worst development experience (polite phrasing :slight_smile:). No real documentation, and the sample code has comments in Mandarin only and does not even work on a real device. The majority of the sensor fields return 0. The one that does return some actual eye-tracking data isn’t documented (coordinate system? Z projection distance?). So it’s definitely challenging, but we’ve beaten that before. Of course, canted displays are upping the difficulty even more :smiley: :smiley: :smiley:

If anybody knows of someone who’s used the aSeeVR successfully for gaze tracking I’d be happy to get in touch with them!


Sounds like it is in part an issue of not having hardware to utilize. Considering the work he has invested for free, quite understandable.

ahh ok …