MRTV: StarVR One Costs $3200 / OpenVR Compatible

Dear VR community,

This is Sebastian from MRTV. I had a call with StarVR headquarters this morning and got some interesting news about official pricing, compatibility, and availability.

The official price is unchanged from what they asked for the device when they first attempted to launch it in 2019:

USA: 3200 USD excl. tax, incl. shipping

Europe: 2800€ excl. tax, incl. shipping

I also asked about compatibility and learned that the company has been working on the StarVR One drivers to make the headset OPENVR COMPATIBLE. That is truly good news, because it means that most existing SteamVR software will run on the device with its extreme FOV.

I also asked whether that means all SteamVR games would automatically be compatible with the headset. The answer was that StarVR cannot guarantee that; depending on the game, the developer might need to do some fine-tuning to make everything work on the StarVR One.

The interesting part is that the StarVR drivers manage to translate the 2-viewport rendering of existing OpenVR content to the 4 viewports that the StarVR One actually expects. How that works in detail I could not find out from the call, but we will surely learn more about it in the future.
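How that 2-to-4 viewport translation works is not public, but one plausible (entirely speculative) approach is depth-based reprojection: unproject each rendered pixel to a 3D point using the depth buffer, then reproject it into a virtual camera canted further outward. A minimal sketch under a pinhole-camera assumption, with a made-up focal length and cant angle:

```python
import numpy as np

def unproject(u, v, depth, f):
    """Pixel (u, v) with depth -> camera-space point (pinhole model, focal f, centered)."""
    return np.array([u / f, v / f, 1.0]) * depth

def rotate_y(p, deg):
    """Rotate a point about the vertical axis (camera canting)."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([c * p[0] + s * p[2], p[1], -s * p[0] + c * p[2]])

def reproject(p, f):
    """Camera-space point -> pixel position in a virtual viewport with focal f."""
    return f * p[0] / p[2], f * p[1] / p[2]

# A pixel rendered by the original forward-facing camera...
p = unproject(u=200.0, v=0.0, depth=5.0, f=600.0)
# ...lands at a new pixel position in a virtual camera canted 25 deg outward.
u2, v2 = reproject(rotate_y(p, -25.0), f=600.0)
```

A real driver would have to do this per pixel, fill disocclusion holes, and handle the depth buffer's projection conventions; this only illustrates the geometry.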

Anyway, since the device is meant for enterprise customers rather than consumers, this means their current content either already runs well on the device or needs only minor changes to be fully compatible with the StarVR One.

As far as availability is concerned, the company has been flooded with requests from enterprise customers and end consumers alike. StarVR told me that they will not be able to cater to individual customers who want to purchase the device right now; it is strictly meant for enterprise.

Enterprise customers can complete the order form. To save time and make the process more efficient, the company asks potential customers to be as specific as possible with their requests: which engine will be used for development, which accessories are in use, and so on. That way, the company knows whether the StarVR One can already serve a customer's project right now.

You can watch my whole video about it here:

Sincerely, Sebastian


Honestly, I can't see why the headset driver couldn't cut the image into two per eye, then apply distortion profiles and restitch the image. Just with more work, as you said, for camera angles and object culling due to the extreme FoV.
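As a toy illustration of that cut / distort / restitch idea (not any actual driver's pipeline), here is a sketch that splits one eye's image in half and warps each half with its own distortion coefficient; the polynomial and the coefficients are invented for illustration:

```python
import numpy as np

def barrel(img, k):
    """Toy radial distortion: remap each column by a cubic polynomial in
    normalized x. Nearest-neighbour sampling keeps the sketch short."""
    h, w = img.shape
    xs = np.linspace(-1.0, 1.0, w)
    src = xs * (1.0 + k * xs ** 2)          # distorted sample position
    idx = np.clip(((src + 1.0) / 2.0 * (w - 1)).round().astype(int), 0, w - 1)
    return img[:, idx]

def split_warp_stitch(eye_img, k_inner, k_outer):
    """Cut one eye's image in half, warp each half with its own
    distortion coefficient, and stitch the result back together."""
    h, w = eye_img.shape
    inner, outer = eye_img[:, : w // 2], eye_img[:, w // 2 :]
    return np.hstack([barrel(inner, k_inner), barrel(outer, k_outer)])

eye = np.tile(np.arange(8, dtype=float), (4, 1))   # 4x8 toy "render"
out = split_warp_stitch(eye, k_inner=0.0, k_outer=0.2)
```

With `k_inner=0.0` the inner half is passed through untouched, while the outer half gets its own (stronger) correction, which is the whole point of treating the regions separately.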


MRTV: I want this headset!

StarVR One is only available to enterprise customers.

MRTV: * makes MRTV Business channel *
MRTV: Please send me a review headset!


I’m quite disappointed. Obviously they’re not locking out gamers because they hate us; it’s because they know that not a lot of games will work well. That’s also why they want to know EXACTLY which software you’re going to run with it, so they can see if it makes sense for you to buy the headset.

So, in short, this is really not interesting for us right now.


Haha, damn, it was too obvious, wasn’t it? LOL!


Their task of fostering broad compatibility with that type of solution will be herculean. We tried it, and even simple patches broke games daily. Our engineers finally went in a different direction. The amount of manpower required to stay on top of that would be mind-boggling.


I want one for sure… RGB OLED is the way to go!

If StarVR can implement quadport rendering for SteamVR titles designed for 2 viewports, Pimax should be able to do this too. I noted this a long time ago. It’s probably accomplished using the Z buffer to simulate two extra viewports, or they inject an extra set of cameras and then warp the result.

Here is a post from May 2018 where I suggested software-agnostic quadport rendering:

"@mmorselli So, my new rendering idea.

You know Vorpx?

So, it has a fake SBS mode, where it uses the Z buffer from 1 camera to generate a second image for an SBS image.

My thought was, you can use the standard left and right camera views, but using the Z buffer from the left eye, and right eye, you would generate 4 virtual viewports.

This way, you can eliminate distortion without per-application programming, and without eye tracking.

If you wanted to get clever, you could have the left eye render at 1 FOV (say 90) and the right eye at another FOV (say 150), and then, using the Z buffers from the L and R images, generate 4 viewports that were corrected for their area of the lens.

The other thing is to have a monoscopic rendering option."


Pimax should be able to do it, as they don’t need extra camera views with only 160° wide, vs. StarVR with 220° wide.

Everything is still in front of you on Pimax. Dividing into 2 views per eye can be used to further optimize lens distortion, by having more control over regional distortion instead of treating it as one big area.


Yes, that’s why they can use the depth buffer. It would be like single-pass stereo, just with two more viewports.


When I think about it, I begin to suspect there is not enough software magnification in the distortion correction out toward the far edges for this 8kX. Achieving such magnification might require rendering an FOV in which everything is in fact not in front.


The Pimax outer FoV has no stereo overlap, so to improve the picture it likely needs more supersampling to compensate.

Pimax has not invested the time in developers necessary to actually make use of what really needs to happen, i.e. native quadport rendering, and I doubt the plague is making things any easier. I think there must be a way to enable quadport rendering in software, as StarVR is now doing. Pimax is already relatively affordable, lol (at least it was before this whole mess went down), but every little bit helps.

I am at a complete loss as to what would be gained by doing that.

The lens distortion compensation benefits in no way that I can envisage, and splitting two views into four, and then back again, would only constitute an unnecessary extra step, bringing its own generation of image degradation.

The benefit of using four viewplanes, for render optimisation purposes (performance and rendering distortions, both due to the oblique projection), lies entirely in doing that rendering to more segments in the first place, as far as I can tell.

What Pimax could conceivably do (…although I haven’t fully thought this through (it just occurred to me), and would probably get a headache if I tried) would be to render with a symmetrical frustum when in “large” FOV - i.e. with the camera facing 20° out to the side, with as many degrees of FOV to the left as to the right for each eye, instead of the usual 10°, and then reproject to the native 10° canting – just like with parallel projections, but in the opposite direction. This should potentially yield a 10-ish percent reduction in shading load, from getting rid of that long stretching in the last 10° of the periphery (you need to render as many degrees more (still relative to the camera) “across the nose” as you took away in the periphery, but that is nothing next to the previous “side-heavy overshoot”).
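That claimed saving can be sanity-checked with a back-of-the-envelope calculation: for a planar projection, the render-target width scales with the sum of the tangents of the view angles on either side of the camera axis. The per-eye coverage numbers below (45° nasal, 85° temporal) are invented for illustration, not Pimax's actual frustum:

```python
import numpy as np

def target_width(nasal_deg, temporal_deg, cant_deg):
    """Relative render-target width for a planar projection: the sum of
    the tangents of the view angles on either side of the camera axis."""
    return (np.tan(np.radians(temporal_deg - cant_deg))
            + np.tan(np.radians(cant_deg + nasal_deg)))

# Hypothetical per-eye coverage: 45 deg across the nose, 85 deg outward.
w_10 = target_width(45.0, 85.0, cant_deg=10.0)   # native 10 deg canting
w_20 = target_width(45.0, 85.0, cant_deg=20.0)   # camera canted 20 deg out
saving = 1.0 - w_20 / w_10
```

With these toy numbers a 20° cant happens to make the frustum symmetric (65° on each side) and saves roughly 17% of the width, in the same ballpark as the 10-ish percent guessed above; the exact figure depends entirely on the real coverage angles.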

…or even flip the asymmetric frustum around, so that the camera faces even farther out; that should actually begin to counter the stretching effect that made us consider quad-or-more-view rendering in the first place, and begin to render more detail in front of you than to the side, so to speak - both in terms of width and height, so that one could conceivably reduce the render target size a bit.

The big caveat, other than how loosely considered a thought this is, regards any distortions the rendering algorithms may produce in the camera’s periphery (perhaps from floating point precision issues, perhaps from screen-space shaders expecting a certain orientation, perhaps something else), that is at that point being used for the view straight ahead, instead of the viewer’s periphery… :stuck_out_tongue:

(By the way… I know a joke suffers if one has to explain it, but am I missing some great pun with the “DeVs” (…or suchlike) random dips into the upper and lower type cases? :7)

(PS… Wow, I must have been logged out ten times whilst typing this - good thing the forum autosaves the edit box contents. :7)


Did they agree to send you a review headset?

He mentioned in his prior video that they achieve dynamic distortion adjustment through eye tracking, so they are able to implement either multiple cycling distortion profiles or to recalculate distortion on the fly. Pimax, by contrast, is stuck with a one-size-fits-all solution for the time being, due to lens tech limitations and the current lack of eye tracking.

Something else to remember is that the StarVR screens are closer to the end user’s face, as they use a stacked multi-lens approach; this was discussed on the MeantobeSeen forums during their InfinitEye development stage and touched upon briefly in interviews with both Seb and VoodooDE. This method also helps limit distortion, as more of the view is within the sweet spot, which means you do not have to do as much image manipulation to make it look believable to the eye. I am not familiar with the lenses in the Pimax, but it’s clear that, due to the screen-lens distance and the lack of pupil tracking, they are behind in this area. I would be curious whether, if they went with AdHawk, Tobii, or perhaps even 7invensun (who previously developed the VIVE eye-tracking add-on), they could achieve a perceived distortion reduction similar to StarVR’s.
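The "multiple cycling distortion profiles" idea can be sketched as a simple lookup-and-interpolate scheme: precalibrate radial-distortion coefficients for a few pupil positions and blend between the nearest two at runtime. Every number here is invented for illustration; no headset actually ships these values:

```python
import numpy as np

# Hypothetical radial-distortion coefficients precalibrated for a few
# horizontal pupil positions (mm from lens centre).
PROFILES = {-4.0: np.array([0.22, 0.05]),
             0.0: np.array([0.18, 0.03]),
             4.0: np.array([0.25, 0.07])}

def coeffs_for_gaze(pupil_x):
    """Linearly interpolate distortion coefficients between the two
    nearest precalibrated pupil positions (clamped at the extremes)."""
    xs = sorted(PROFILES)
    pupil_x = min(max(pupil_x, xs[0]), xs[-1])
    for lo, hi in zip(xs, xs[1:]):
        if lo <= pupil_x <= hi:
            t = (pupil_x - lo) / (hi - lo)
            return (1 - t) * PROFILES[lo] + t * PROFILES[hi]

c = coeffs_for_gaze(2.0)   # halfway between the 0 mm and 4 mm profiles
```

Recalculating the distortion mesh fully on the fly would replace this table with an analytic lens model, but the runtime plumbing (eye tracker in, warp coefficients out, every frame) is the same.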


I have been trying for two years, and I am truly a persistent guy. So far to no avail, but I am still on it. I tried to purchase one, but they would not let me.


They might fear your honest reviews. :beers::smirk::+1::sparkles:

Wow I’m a bit surprised to hear this. You appear to have a fairly intimate relationship with StarVR/Acer, having been to their offices several times and having had several lengthy phone conversations with them…and yet they’re reluctant to give you a headset to test/review? That seems pretty messed up and unfair.


But if you were them, would you do it? I’m not sure what they have to gain. They know MRTV has a base of primarily gamers. Most likely game support is currently very bad. They don’t need a video out there confirming that; it can only cause harm.