The 12K purportedly displays 6K x 2
And is advertised as:
“Reality 12K” = 5,760 x 3,240 per eye x 2 = 37.32 Megapixels
8K = 7680 x 4320 = 33.18 Megapixels, so let’s use 8K as an example for the sake of argument. If the numbers come within 10%, we’ll come back to this.
Per bandwidth calculations:
8K, chroma 4:4:4 at 60 Hz = 64 Gbps.
8K, chroma 4:4:4 at 120 Hz = 128 Gbps.
8K, chroma 4:4:4 at 200 Hz = 210 Gbps. (Purported display capability of the Pimax Reality 12K)
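These figures can be ballparked with simple pixel-rate arithmetic. A rough sketch, assuming 10 bits per channel (30 bits per pixel at 4:4:4); the quoted numbers above also include blanking/timing overhead, so raw pixel rates come out somewhat lower:

```python
def raw_bandwidth_gbps(width, height, fps, bits_per_pixel):
    """Raw (pre-overhead) video bandwidth in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

# 8K, chroma 4:4:4, 10 bits per channel = 30 bits per pixel (assumption)
print(raw_bandwidth_gbps(7680, 4320, 60, 30))   # ~59.7 Gbps raw
print(raw_bandwidth_gbps(7680, 4320, 120, 30))  # ~119.4 Gbps raw
print(raw_bandwidth_gbps(7680, 4320, 200, 30))  # ~199.1 Gbps raw
```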
Per WiGig Whitepaper:
WiGig maxes out at 7 Gbps (beam-formed signal, no compression), 10 Gbps compressed.
Now you could say: what if we drop chroma 4:4:4 (text would be blurry, but I don’t care) and keep a 60 Hz input, then do some sort of frame interpolation and rotational reprojection on the headset itself (on-board motion smoothing)? Keep in mind this hasn’t been done before, but we love Pimax, and they are putting Snapdragon chips in the HMD, so…
-EDIT - It appears Pimax may be doing exactly this with “split rendering,” but only for the wireless mode? According to Kevin Henderson, they are not using split rendering but “hybrid rendering.”
8K, chroma 4:2:0 at 60 Hz = 32.08 Gbps
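The halving follows from the average bits per pixel of each chroma format (a sketch, assuming 8-bit samples purely to show the ratio; 4:2:0 carries a full-resolution luma plane plus two quarter-resolution chroma planes):

```python
def bits_per_pixel(luma_bits, chroma_format):
    # Average bits per pixel for common chroma subsampling schemes:
    # 4:4:4 = 3 full planes, 4:2:2 = 1 + 2 half planes, 4:2:0 = 1 + 2 quarter planes
    factors = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
    return luma_bits * factors[chroma_format]

print(bits_per_pixel(8, "4:4:4"))  # 24.0
print(bits_per_pixel(8, "4:2:0"))  # 12.0 -> exactly half the bandwidth of 4:4:4
```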
Aha! Then you say: What about DSC Compression? If we can do it for video, why not games?
8K, chroma 4:2:0 at 60 Hz = 20 Gbps with DSC, right?
Except DSC compression is precisely what I suspect caused the blanking in the original version of the 8KX at 90 Hz, since the data rate is no longer continuous. I believe it also made cable extensions fail.

Furthermore, you cannot stack compression algorithms. As a quick example, you can’t compress a .bmp to a .jpg, then put it in a .zip file, and expect the product of the two compression ratios as the final result. The second compression offers diminishing returns (none if the algorithm is the same). So DSC lowers our bandwidth, but it reduces the compression WiGig itself can achieve.

Moreover, DSC introduces a latency that grows with resolution. If that latency exceeds 4.5 ms at 60 Hz, it results in noticeable lag, which may be unacceptable in VR.
EDIT: See reply by SSJ3 below about Foveated Compression Codec
Finally: what about WiGig 2.0 or 802.11ay?
Ah yes, reportedly it will achieve 20–40 Gbit/s with MIMO. But…
802.11ay requires absolute line-of-sight transmission, and whether you can maintain that with a dongle on a headset is questionable. To achieve MIMO, I believe you would need, say, three transceivers around the room (like base stations), and the room would have to be small or enclosed to allow beamforming as well. This certainly sounds as difficult as a wire (if not more so), and it still won’t put us in the bandwidth range we really want.
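For intuition on why 60 GHz links need line-of-sight and beamforming: the free-space path loss alone is brutal at these frequencies. A standard FSPL calculation (the 5 m room-scale distance is my assumption, and this ignores wall/body occlusion, which is even worse at 60 GHz):

```python
import math

def fspl_db(distance_m, freq_ghz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45"""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_ghz) + 92.45

print(round(fspl_db(5, 60), 1))  # ~82.0 dB at 60 GHz (WiGig) across 5 m
print(round(fspl_db(5, 5), 1))   # ~60.4 dB at 5 GHz (ordinary Wi-Fi), ~22 dB better
```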
DisplayPort 1.4a can deliver 32.4 Gbps.
DisplayPort 2.0 can deliver 80 Gbps (77.37 Gbps effective).
All you have to do is look at the cables on the 12K as they connect to the headset; I believe there are two. Current DisplayPort bandwidth literally wasn’t enough for Pimax’s vision. To tie into current GPUs we need to stick with DP 1.4, which offers 32.4 Gbps each. DP 2.0 would be nice, but then you couldn’t use any current GPU. (There’s some debate over whether 30-series cards could be made to do DP 2.0 by the vendors, but current ones can’t.) I’m not sure about the two DisplayPort cables; merging them back together on the PC software side could present another challenge for Pimax. (The first 4K monitors used two DisplayPort connections to get the bandwidth they needed.)
EDIT: 2xDP1.4 present in the promo on the website.
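A back-of-the-envelope check of why two DP 1.4 links plus DSC are plausible for this headset while uncompressed video isn’t. My assumptions: dual 5,760 x 3,240 panels, 90 Hz, 10-bit color (30 bpp), and a typical 3:1 DSC ratio; none of these figures are confirmed by Pimax.

```python
def required_gbps(width, height, panels, fps, bpp, compression=1.0):
    """Video bandwidth in Gbit/s for `panels` displays, after `compression`:1."""
    return width * height * panels * fps * bpp / compression / 1e9

uncompressed = required_gbps(5760, 3240, 2, 90, 30)                   # assumed specs
with_dsc = required_gbps(5760, 3240, 2, 90, 30, compression=3.0)      # assumed 3:1 DSC
two_dp14_links = 2 * 32.4                                             # Gbps

print(round(uncompressed, 1))  # ~100.8 Gbps: far beyond two DP 1.4 links
print(round(with_dsc, 1))      # ~33.6 Gbps: fits within 2 x 32.4 = 64.8 Gbps
```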
Thus I hang my head wishing “12K” VR could be wireless, but alas:
Wireless VR games seem forever doomed either to be limited by the GPU inside the headset (which will perform badly, or run hot and heavy) or to be tethered via a cable (or two).
See: PIMAX 12K QLED - THIS IS PIMAX GONE WILD! Everything You Need To Know! Incl. Pimax COO Interview!! (YouTube)