Pimax 8k and 5K+ reviews are up. And it's looking good!

Wow, those illustrations are so misguided and fallacious it's hard to know where to start. The FOV of a game's camera is not related to the resolution of the display device. They are entirely independent quantities, as you can tell from any game that has an FOV slider. The resolution has the same advantages (and disadvantages) in both cases, i.e., increasing the detail you can resolve at the expense of computing cost. But an HMD with a 170-degree FOV is absolutely rendering a wider view of the scene than anyone (in their right mind) would choose to render on a regular monitor or TV.

Sigh. The illustrations weren't meant to convey FOV silly. They show what is being used from a scene to construct a 3D image vs a flat 2D image.
 
the 8k flat scene has a larger vista and provides you with more of the landscape view.
Wait, I thought your argument was that people get confused thinking this headset is like viewing 8K UHD; now you suddenly use a different aspect ratio? Do they make 8K TVs like that?
 
Also, you should note that 3D TVs halved resolution as well, so those 4K 3D TVs are kinda like two separate 2K images.
 
Wait, I thought your argument was that people get confused thinking this headset is like viewing 8K UHD; now you suddenly use a different aspect ratio? Do they make 8K TVs like that?

My argument from the start is that combining/stacking panel resolution horizontally (as in 4K x 2 = 8K) for a VR HMD, and then using that as a spec to compare its resolution to the 8K spec of a 2D monitor, makes no sense.

I didn't mention anything about aspect ratios.
 
No, I'm not missing the point. You just keep repeating yourself over and over, acting like I don't get it and that you do. The two images very much do overlap as to what they are viewing of the scene, the 3D scene - again, just viewed from two different perspectives - and yes, they are not synthesized from the same flat 2D image.

When you look at an apple held in front of you, you are in effect saying that you are viewing two apples, which is bullshit. You are viewing the same object from two slightly different perspectives, forming a single image of that apple in your brain, with your brain providing the depth perception.

Take away the 3D synthesis that's occurring (just for argument's sake) and look at it as a 2D image again. Yes, now you can compare that 2D image and its resolution to a monitor image if you want, but in the case of the Pimax 8k, comparing that 4K-sized image is not the same thing as an 8k monitor image. They are simply not comparable in that the 8k flat scene has a larger vista and provides you with more of the landscape view. 3D images and 2D images can't be lumped into the same single resolution spec; they are so different that the comparison makes no sense.

And now you are getting horsepower all confused. 200hp is 200hp regardless of whether it is being produced by a car or a motorcycle. It is a fundamental spec that means the same thing; there is no difference as to what produces it. They both will turn that dyno and produce the same value in horsepower if both are rated the same. WTF are you on about with horsepower here? HP is not a spec about "perceived" performance. It's a measure of the work output capacity of a machine/animal.

You've turned this around full circle here. A resolution spec like 8K is going to tell you fuck all about the 3D performance of a VR HMD. Especially if that 8K is split across two panels that are upscaling a 1440p input signal.
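To put some rough numbers on why I keep saying these specs aren't comparable, here's a quick back-of-the-envelope sketch (Python; the panel figures are my own assumptions about what "8K" means for the headset, not an official spec sheet):

```python
# Rough sketch: total pixel count behind the headset's "8K" label vs. a
# single 8K UHD monitor panel. Panel figures are assumptions, not specs.
pimax_8k_per_eye = (3840, 2160)      # two 4K-class panels, one per eye
uhd_8k_monitor   = (7680, 4320)      # one 8K UHD panel

pimax_total   = 2 * pimax_8k_per_eye[0] * pimax_8k_per_eye[1]   # ~16.6 MP
monitor_total = uhd_8k_monitor[0] * uhd_8k_monitor[1]           # ~33.2 MP

print(f"Headset '8K' total pixels : {pimax_total:,}")
print(f"8K UHD monitor pixels     : {monitor_total:,}")
print(f"Monitor / headset ratio   : {monitor_total / pimax_total:.1f}x")
```

Same horizontal number on the box, half the total pixels actually lighting up, before you even get into how they're used.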

Your eyes, both in the real world and in a vr headset, do not see the exact same thing from 2 different perspectives. There's only 80 degrees of overlap out of the roughly 200 degrees of fov (40%); the other 120 degrees is unique to each eye. That's called peripheral vision, which your illustration ignores. Stick your hand parallel to your nose and you'll see how little of your vision is actually seen by both eyes at the same time. Your illustration shows 100 percent overlap in both eyes, which doesn't even happen in the 110fov gen1 headsets, let alone a 200fov one. I don't know how else to explain it, but you obviously don't get it even though you say you do. I can make a much more correct illustration if you like, but yours is not close to correct.

Horsepower is one measure, usually used in advertising, of the performance of a vehicle, just like pixels are one measure of a display device. In both instances you need the context for it to be at all meaningful. Substitute number of cylinders in an engine if you like. Also meaningless by itself, though you wouldn't complain about a motorcycle company advertising a 4 cylinder bike because it's not comparable to a 4 cylinder car, would you? As long as you know it's a motorcycle or a car, you can put it in context, just like knowing it's a vr headset and not a tv lets you put it in proper context.
 
"Your eyes, both in the real world and in a vr headset, do not see the exact same thing from 2 different perspectives."

Yup, never said they saw the exact same thing.

"There's only 80 degrees of overlap out of the roughly 200 degrees of fov (40%); the other 120 degrees is unique to each eye. That's called peripheral vision, which your illustration ignores."

My illustration was meant to convey that the two Pimax 4K panels were not presenting a side-by-side view of a given scene, but rather a composite to provide the 3D depth. I was never illustrating perceived FOV or trying to convey any exact percentages.

"Stick your hand parallel to your nose and you'll see how little of your vision is actually seen by both eyes at the same time."

This only works for close-in objects.

"Your illustration shows 100 percent overlap in both eyes, which doesn't even happen in the 110fov gen1 headsets, let alone a 200fov one."

No, my illustration shows that there is overlap as to the perceived scene - the panels aren't taking in a left-side-only and a right-side-only image where the scene being viewed is completely separate, i.e. there are elements in the scene (objects: trees, mountains, sky, etc.) that appear in both panels.

"I don't know how else to explain it, but you obviously don't get it even though you say you do. I can make a much more correct illustration if you like, but yours is not close to correct."

Please do - my illustration was just to show that the panels are used to synthesize a 3D image by compositing a scene. Also, with the scene being far off in this example, a landscape, the amount of uniqueness or difference between the actual image information/data isn't very dramatic here - it's actually very small.

"Horsepower is one measure, usually used in advertising, of the performance of a vehicle, just like pixels are one measure of a display device. In both instances you need the context for it to be at all meaningful. Substitute number of cylinders in an engine if you like. Also meaningless by itself, though you wouldn't complain about a motorcycle company advertising a 4 cylinder bike because it's not comparable to a 4 cylinder car, would you? As long as you know it's a motorcycle or a car, you can put it in context, just like knowing it's a vr headset and not a tv lets you put it in proper context."

You are the one bringing more and more into this horsepower analogy: type of vehicle, number of cylinders, importance of context, etc. This completely refutes your stance that a single spec of horizontal resolution is plenty and all that is needed when marketing a VR HMD. You are trying to have it both ways here...
 
your stance that a single spec of horizontal resolution is plenty and all that is needed when marketing a VR HMD.

That is not, and never has been my stance. My stance is that it's fine as a name and not even slightly confusing but, just like a tv, pixels aren't the only thing that matters.
 
Sigh. The illustrations weren't meant to convey FOV silly. They show what is being used from a scene to construct a 3D image vs a flat 2D image.

No, you quite clearly said:

They are simply not comparable in that the 8k flat scene has a larger vista and provides you with more of the landscape view

That is false, because "more of the landscape view" means larger fov, which, as I said, has nothing to do with anything at all.

I vaguely get the point you're trying to make, but I don't think you really understand it yourself. What's true is that 2x displays used in an HMD won't resolve as much detail for a given fov as a 2d, rectangular display with the same number of pixels. That's both because, in rendering the eyes separately, each eye has less resolution to work with (but they provide additional depth information which isn't available on a monitor) and also because the HMDs warp the image into a fisheye projection, which wastes some of the rectangular image space, which you don't even mention.

But there is no "you see more of the landscape on a monitor", that's just gibberish. A wide-fov headset will typically show you more of the scene, not less, but at fewer pixels per degree of arc, than a flat screen at the same resolution. If you've ever set an FPS to run at some extra-high fov on a regular monitor you'll know how stretched and unusable it quickly becomes. You can't even do 180deg with a single, flat projection, it's not mathematically possible.

But ultimately it makes no sense to even compare a headset with a monitor, and within the context of each the various resolutions make sense.
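If it helps, here's the crude pixels-per-degree comparison I have in mind (Python; the 90-degree monitor FOV and the 150 degrees per eye are just illustrative assumptions, and the simple division ignores lens distortion and the non-linear stretch of a flat projection):

```python
# Crude average angular pixel density: horizontal pixels spread over the
# horizontal FOV. Ignores lens distortion and the fact that a flat
# projection packs more pixels per degree at the centre than at the edges.
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    return horizontal_pixels / horizontal_fov_deg

monitor_ppd = pixels_per_degree(3840, 90)    # 4K monitor, 90-degree game FOV (assumed)
hmd_eye_ppd = pixels_per_degree(3840, 150)   # one 4K-class HMD panel over ~150 degrees (assumed)

print(f"4K monitor @ 90 deg FOV : {monitor_ppd:.1f} px/deg")
print(f"4K HMD panel @ 150 deg  : {hmd_eye_ppd:.1f} px/deg")
```

Same pixel count, much wider view, so fewer pixels per degree of arc - that's the whole trade-off.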
 
You guys still going at it? Even if you are all 100% right, you are still wrong. It's their product and they can call it whatever they like.
 
No, you quite clearly said:

They are simply not comparable in that the 8k flat scene has a larger vista and provides you with more of the landscape view

That is false, because "more of the landscape view" means larger fov, which, as I said, has nothing to do with anything at all.

I vaguely get the point you're trying to make, but I don't think you really understand it yourself. What's true is that 2x displays used in an HMD won't resolve as much detail for a given fov as a 2d, rectangular display with the same number of pixels. That's both because, in rendering the eyes separately, each eye has less resolution to work with (but they provide additional depth information which isn't available on a monitor) and also because the HMDs warp the image into a fisheye projection, which wastes some of the rectangular image space, which you don't even mention.

But there is no "you see more of the landscape on a monitor", that's just gibberish. A wide-fov headset will typically show you more of the scene, not less, but at fewer pixels per degree of arc, than a flat screen at the same resolution. If you've ever set an FPS to run at some extra-high fov on a regular monitor you'll know how stretched and unusable it quickly becomes. You can't even do 180deg with a single, flat projection, it's not mathematically possible.

But ultimately it makes no sense to even compare a headset with a monitor, and within the context of each the various resolutions make sense.

Your FOV is always the same outside of a VR HMD, unless you are wearing blinders. You are confused: how much of a 2D scene is visible to you when viewing a 2D image on a monitor in front of you should not be conflated with your FOV.

At least we agree that a VR HMD and a 2D monitor are very different. :)
 
I didn't mention anything about aspect ratios.
Ah, sorry, I thought you made this graphic, which looks like 32:9... no worries either way, I understand what you meant 1000 posts ago...

[Image: 160372_vr2.png]
 
Ah, sorry, I thought you made this graphic, which looks like 32:9... no worries either way, I understand what you meant 1000 posts ago...

Yep, made it only to convey horizontal resolution. Not aspect ratio, FOV, horsepower, etc. Feel free to redraw the vertical to whatever you want. And for Bobzdar's and Litfod's benefit, the red and the green squares represented two independent 4K panels notionally absorbing the 3D scene (not this actual 2D image) from two slightly offset positions and angles, with the entire image representing the capture and display of the 3D scene as a 2D, 8K-wide picture. Still waiting for them to draw better pictures explaining how this is so "misguided and fallacious"... Yes, those two 4K panels in combination with optics may wrap shit around you and provide a FOV, but the data they are doing it with comes from inside those green and red squares, and the pixels outside of those boxes do not exist in the HMD.
 
^^^^ Errr... nope. Try googling VR 3D images. You will see that the 3D stereoscopic images used to feed the left and right eyes are MUCH more overlapped than that... more like what I presented in my illustration. I simplified things by not showing the lens/optics involved.
Try running Steam's free VR_Performance_Test. It shows you the two images being rendered in real time. They HEAVILY overlap as to the pictured scene each image is being generated from. There is only a very small offset:

[Image: ss_b23981725f228c341957ad55d15f4c023e3cf6c1.1920x1080.jpg]
 
^ that's true for vive/rift/etc... but in most cases pimax receives higher fov images, like this

...or am I wrong?
 
^^^ If that's truly what the 8K Pimax is doing/using, then there's only going to be a "3D" sweet spot for your brain to use in a very narrow center region of the view where the image overlap actually occurs. Definitely not ideal unless you always look straight ahead and/or turn your head to look around instead of looking left or right with your eyes.
 
Yep, the "sweet spot" is small compared to real life, but better in PiMax vs rift/vive. To do proper convergence you'll need eye-tracking and something like what Oculus is doing with varifocal lens tech, but I think we're a long way away from something you might consider "ideal"... most people report that using your head more to look around becomes second nature after a few minutes though...
 
^^^ If that's truly what the 8K Pimax is doing/using, then there's only going to be a "3D" sweet spot for your brain to use in a very narrow center region of the view where the image overlap actually occurs. Definitely not ideal unless you always look straight ahead and/or turn your head to look around instead of looking left or right with your eyes.

Um, that's how your eyes actually work irl. You have a 200fov, only 80 of which is overlap. The other 60 degrees on each side is unique to each eye, as I tried to explain earlier, and why your drawing is wrong. In real life, each of your eyes can see roughly 140 degrees, 80 of which overlap with the other eye. The other 60 on each side (for 200 total) is unique, just like the above pimax. It's only the narrow 100fov headsets that are mostly overlap (just 10 deg on each side not overlapping). Most of the extra fov on the pimax is peripheral, non-overlap area to mimic your real-life fov.
 
Um, that's how your eyes actually work irl. You have a 200fov, only 80 of which is overlap. The other 60 degrees on each side is unique to each eye, as I tried to explain earlier, and why your drawing is wrong. In real life, each of your eyes can see roughly 140 degrees, 80 of which overlap with the other eye. The other 60 on each side (for 200 total) is unique, just like the above pimax. It's only the narrow 100fov headsets that are mostly overlap (just 10 deg on each side not overlapping). Most of the extra fov on the pimax is peripheral, non-overlap area to mimic your real-life fov.

Your numbers are wrong, Bobzdar. Human vision has a 210-degree horizontal FOV, and of that, 114 degrees (not 80) provides the overlap for binocular vision, which is responsible for our depth perception. But I agree that in the case of the Vive or Rift, the image overlap is quite high (which accounts for why the depth perception experienced in them extends across pretty much the majority of their 100-110 degree horizontal FOV).

What I drew in my illustration was based on what I know of the Rift/Vive and approximately how much image overlap they use, so in the case of the Pimax, which again isn't a released product yet, it appears that their latest demo hardware uses an overlap that is much smaller than what I depicted... IF the FOV selected in their software is set to the highest FOV setting (170 degrees).

The Pimax seems to be configurable in its software as to what FOV you want: Low FOV (120 degrees), Normal FOV (150 degrees) and Large FOV (170 degrees). So, the amount of image overlap used most likely varies depending on which of these FOV settings you choose. Which FOV mode you choose also most likely results in a trade-off in how much binocular vision (depth perception) you get up front from each, i.e. it is probably maximized in the Low FOV setting and minimized with the Large FOV setting. So, my illustration is probably a bit more accurate when using the Pimax's Low FOV setting, where that overlap would be high, but not quite as accurate when the Pimax is set to its Large FOV mode, in which case the overlap used is probably much smaller, providing less binocular vision but maximizing the FOV. See Bobzdar, I admitted an error! :)

So the Pimax probably only offers really good depth perception (comparable to the Rift/Vive) when using low or normal FOV, but it must sacrifice some of this to deliver the Large FOV setting. This is all conjecture of course and we'll have to wait and see what the final Pimax product actually offers and how it performs. Interestingly, most of the reviewers opted to use the "Normal" 150 degree FOV mode setting for testing as it provided the least image distortion. The 170 setting may still be pushing things a bit as to added image distortion and a smaller 3D sweet spot window out front.
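For what it's worth, the two sets of numbers we keep arguing about are at least internally consistent if you assume different per-eye FOVs. A quick sketch (the 140/200 figures are Bobzdar's, the 210/114 are mine, and the ~162-degree per-eye value is just what 210/114 implies, not a measurement):

```python
# Binocular overlap implied by per-eye FOV and total FOV:
# total = 2 * per_eye - overlap  =>  overlap = 2 * per_eye - total
def binocular_overlap(per_eye_fov_deg, total_fov_deg):
    return 2 * per_eye_fov_deg - total_fov_deg

print(binocular_overlap(140, 200))   # -> 80 degrees (Bobzdar's figures)
print(binocular_overlap(162, 210))   # -> 114 degrees (my figures; 162 per eye is implied, not measured)
```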
 
It's a range of numbers, based on interpupillary distance and some other factors including the size of your nose =D)

Check out StarVR if you haven't. According to reviews there is no distortion around the periphery. I'd love to see a side-by-side comparison, but at upwards of $10k, whatever they are doing was surely not acceptable for PiMax and their target price point.

The other reason people choose 150 instead of 170 is to have better performance, as you are rendering fewer pixels. Maybe with a future foveated rendering option and tweaked distortion profiles they could improve on this a lot.
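Rough sketch of the pixel savings from dropping the FOV setting (this assumes you hold angular resolution constant and that rendered width scales linearly with FOV, which is optimistic: the planar projections get less efficient toward the edges, so dropping from 170 to 150 probably saves even more than this suggests):

```python
# Relative render cost if angular resolution is held constant and rendered
# pixel count scales roughly with horizontal FOV. A deliberate simplification.
def relative_render_cost(fov_deg, baseline_fov_deg=170):
    return fov_deg / baseline_fov_deg

for fov in (120, 150, 170):
    print(f"{fov:3d} deg FOV -> ~{relative_render_cost(fov):.0%} of the Large-FOV pixel load")
```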
 
It's a range of numbers, based on interpupillary distance and some other factors including the size of your nose =D)

Check out StarVR if you haven't. According to reviews there is no distortion around the periphery. I'd love to see a side-by-side comparison, but at upwards of $10k, whatever they are doing was surely not acceptable for PiMax and their target price point.

The other reason people choose 150 instead of 170 is to have better performance, as you are rendering fewer pixels. Maybe with a future foveated rendering option and tweaked distortion profiles they could improve on this a lot.

Yep, the range is actually 200-220, so 210 degrees is the normally accepted average as to a single number.

I really hope that one of the major players brings full eye tracking with a foveated rendering solution coupled with 4K OLED panels to a Gen 2 VR product. The performance would be amazing compared to what we have now - and I bet a single 2080Ti could then provide the needed GPU power.

Hell, my max spend limit for a Gen 2 VR product that could deliver the above would be $1200. (Hoping it comes in at something a bit more reasonable like $800.) Looking back, I've already easily sunk $2K into just my Rift/Vive setups as an early adopter. The StarVR @ $10K isn't even on my radar. Way too rich for my blood even if it did deliver the goods.
 
Your numbers are wrong, Bobzdar. Human vision has a 210-degree horizontal FOV, and of that, 114 degrees (not 80) provides the overlap for binocular vision, which is responsible for our depth perception. But I agree that in the case of the Vive or Rift, the image overlap is quite high (which accounts for why the depth perception experienced in them extends across pretty much the majority of their 100-110 degree horizontal FOV).

What I drew in my illustration was based on what I know of the Rift/Vive and approximately how much image overlap they use, so in the case of the Pimax, which again isn't a released product yet, it appears that their latest demo hardware uses an overlap that is much smaller than what I depicted... IF the FOV selected in their software is set to the highest FOV setting (170 degrees).

The Pimax seems to be configurable in its software as to what FOV you want: Low FOV (120 degrees), Normal FOV (150 degrees) and Large FOV (170 degrees). So, the amount of image overlap used most likely varies depending on which of these FOV settings you choose. Which FOV mode you choose also most likely results in a trade-off in how much binocular vision (depth perception) you get up front from each, i.e. it is probably maximized in the Low FOV setting and minimized with the Large FOV setting. So, my illustration is probably a bit more accurate when using the Pimax's Low FOV setting, where that overlap would be high, but not quite as accurate when the Pimax is set to its Large FOV mode, in which case the overlap used is probably much smaller, providing less binocular vision but maximizing the FOV. See Bobzdar, I admitted an error! :)

So the Pimax probably only offers really good depth perception (comparable to the Rift/Vive) when using low or normal FOV, but it must sacrifice some of this to deliver the Large FOV setting. This is all conjecture of course and we'll have to wait and see what the final Pimax product actually offers and how it performs. Interestingly, most of the reviewers opted to use the "Normal" 150 degree FOV mode setting for testing as it provided the least image distortion. The 170 setting may still be pushing things a bit as to added image distortion and a smaller 3D sweet spot window out front.

The overlap stays the same; it's the outer edges that are reduced, mostly to conserve rendering performance and reduce distortions at the outer edges, which are due to the screens being angled and offset with respect to the center of the eye, though per reviewers those distortions have been and are being fixed via software distortion correction. They would have to physically move the screens and optics to change the stereo overlap; it's a fixed value (unless they don't render the center of the screen, but that would be a waste of pixels and purely for performance).

You still don't even understand how it works and thought the images completely overlapped a day ago and you're telling me I'm wrong?
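A small sketch of what I mean (taking the overlap as a fixed ~80 degrees purely for the sake of the numbers - the exact value doesn't matter for the point):

```python
# If the stereo overlap is fixed by the hardware, changing the FOV setting
# only changes the non-overlapping periphery on each side:
# periphery_per_side = (total_fov - overlap) / 2
def periphery_per_side(total_fov_deg, overlap_deg=80):
    return (total_fov_deg - overlap_deg) / 2

for fov in (120, 150, 170):
    print(f"{fov} deg setting -> ~{periphery_per_side(fov):.0f} deg of periphery per side")
```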
 
The overlap stays the same; it's the outer edges that are reduced, mostly to conserve rendering performance and reduce distortions at the outer edges, which are due to the screens being angled and offset with respect to the center of the eye, though per reviewers those distortions have been and are being fixed via software distortion correction. They would have to physically move the screens and optics to change the stereo overlap; it's a fixed value (unless they don't render the center of the screen, but that would be a waste of pixels and purely for performance).

You still don't even understand how it works and thought the images completely overlapped a day ago and you're telling me I'm wrong?

Hey now Bobzdar, I manned up and admitted that what I illustrated was based more on what I knew from the Rift/Vive. In both of those HMDs, that overlap in the displayed image is very real and quite high. I also admitted that the overlap in the case of the Pimax - which isn't a released product yet - is much less, so until it actually releases as a final product, arguing about exactly how much that image overlap actually is... well, it's kinda unproductive to say the least.

What I said is that your numbers are wrong. Specifically, you stated:

"Um, that's how your eyes actually work irl. You have a 200fov, only 80 of which is overlap."

Go look it up and just man up to that fact. IRL, human vision has a horizontal overlap of 114 degrees, not 80. Also, horizontal FOV typically varies from 200 to 220, with 210 degrees being the accepted avg, not 200.

You obviously like a good argument, but repetition and not admitting mistakes make this unproductive. I'm thinking we both want better VR and are looking forward to what Pimax will actually deliver as a product as well as what some of the other players will be bringing to the table as Gen 2 VR.

My drawings were simple illustrations to attempt to convey that 3D VR and a 2D image are two very different animals. If you really feel I still don't get something, feel free to draw a picture to explain it to me like I am 5. :)
 
Yup, for a given set of individuals it will vary, but if you are selecting a single number to represent that extent and not a range, then that number is 114 degrees and not 80... Or are you implying that Bobzdar has very deep set eyes and a large nose? :)
 
I thought you might be saying 114 was an average... really just trying to learn more myself here :) I'm not sure where the 200/80 degrees numbers come from, maybe this graphic:

[Image: Pimax8kFoV.png]


which is actually a guess as to how PiMax might work and the author notes IRL we may see more or less than 80 degrees... but of course the couple reviewers of the product do notice the dark area around the nose.

You guys both may be correct though, and 80-114 is a fine range to use... or at least, I hope so, so I can see you both hug it out
 
This may've also contributed to the confusion... clearly the graphic makes the overlap look like 80 degrees...

[Image: Untitled.png]
 
Also, just do the test yourself: hold out both hands and close one eye, and move the opposite hand so it just touches the extent of that vision. Repeat for the second eye. Open both and you'll see it's definitely less than 114 (at least for me, mine being less than 90, maybe even 70)
 
Hey now Bobzdar, I manned up and admitted that what I illustrated was based more on what I knew from the Rift/Vive. In both of those HMDs, that overlap in the displayed image is very real and quite high. I also admitted that the overlap in the case of the Pimax - which isn't a released product yet - is much less, so until it actually releases as a final product, arguing about exactly how much that image overlap actually is... well, it's kinda unproductive to say the least.

What I said is that your numbers are wrong. Specifically, you stated:

"Um, that's how your eyes actually work irl. You have a 200fov, only 80 of which is overlap."

Go look it up and just man up to that fact. IRL, human vision has a horizontal overlap of 114 degrees, not 80. Also, horizontal FOV typically varies from 200 to 220, with 210 degrees being the accepted avg, not 200.

You obviously like a good argument, but repetition and not admitting mistakes make this unproductive. I'm thinking we both want better VR and are looking forward to what Pimax will actually deliver as a product as well as what some of the other players will be bringing to the table as Gen 2 VR.

My drawings were simple illustrations to attempt to convey that 3D VR and a 2D image are two very different animals. If you really feel I still don't get something, feel free to draw a picture to explain it to me like I am 5. :)


OK, the exact number is kind of unimportant, the point I was, and still am, trying to make is that the overlap is nowhere near 100%, it's more like half that, both in real life and the Pimax. And the reduced fov mode leaves the overlap the same and just reduces the outer peripheral rendering, so instead of 30-50 degrees on each outer edge that has no stereo overlap, there's 20-40 degrees. The overlap in the center is still 80 (or 100 or 114 or whatever it is) and because your brain has enough frame of reference from the other eye, increasing the fov doesn't reduce the stereo effect at all, just like stuff in your peripheral vision in real life doesn't look 2d even though it isn't in view of both eyes.

I don't remember exactly where I got the 80 from, it may have been from some vr documentation I read in the past, but the point is the pimax mimics real life fov much more closely than the rift/vive by using the much wider 8k pixels in non-16:9 format. The reason the rift/vive use the much reduced fov is because the reduced number of pixels spread over a 200 fov would waste a bunch of vertical pixels while having MASSIVE screen door. So using an 8k uhd screen would actually be counterproductive - increasing cost for a bunch of pixels that wouldn't be visible vertically at 200 horizontal fov. However, putting 8k horizontal pixels in the pimax allows the wide fov while also decreasing screen door vs. the other sets. The 5k will be more similar in screen door to current sets as it has fewer pixels, but reviews peg it as slightly better than the vive pro/samsung odyssey and better than the rift/vive, while the 8k is a much larger leap in reduced screen door, though a little less sharp due to non-native rendering. The sharpness issue actually seems to be remedied by increasing supersampling, but that will require a 2080ti (or more) to make usable.

That said, I am not sold on the pimax, my whole argument was that 8k isn't misleading if you know anything about vr and bother to read the specs. Especially given that the best of the other headsets only have 3k horizontal pixels, the 8k is pretty much 2.5x better when you factor in both the much wider fov and reduced screen door. The reason it not being 16:9 doesn't matter is all of those added pixels would be in the vertical range, which is already close to human fov in the 1st gen sets and the pimax 5k/8k, so they'd mostly be outside of view and wasted, both in display cost and in added rendering cost. I think pimax has the display setup very close to ideal - it mimics human fov without wasting rendering pixels that aren't in view and uses available 4k screens to do it and provides enough pixels to make screen door much less prevalent. I'm not sold because the software has to be there, the optics have to be there and the performance has to be there. It sounds like, from reviews, they're still working on that stuff but it's 'good'. Not great, but good. I don't know if I'd buy it if it's not great, because the Rift (and Vive) are, so I wouldn't trade off more pixels for dropped frames and distracting distortion.
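To make the "factor in both the wider fov and reduced screen door" bit concrete, here's a crude horizontal pixels-per-degree comparison (combined horizontal panel pixels over total horizontal FOV; panel and FOV figures are approximations from memory, and this ignores stereo overlap, the optics and the 8k's upscaled input):

```python
# Crude horizontal density proxy: combined horizontal pixels / total FOV.
# Figures are approximate and ignore overlap, lenses and input upscaling.
headsets = {
    "Rift/Vive": (2 * 1080, 110),
    "Vive Pro":  (2 * 1440, 110),
    "Pimax 5K+": (2 * 2560, 170),
    "Pimax 8K":  (2 * 3840, 170),
}

for name, (h_pixels, h_fov) in headsets.items():
    print(f"{name:9s}: {h_pixels:4d} px over ~{h_fov} deg -> ~{h_pixels / h_fov:.0f} px/deg horizontal")
```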
 
my whole argument was that 8k isn't misleading if you know anything about vr and bother to read the specs

Let me state that I agree completely here. My argument was from a general marketing stance targeting less-informed users, which frankly are who the industry should be targeting in order to grow. I'm pretty well informed in general and less informed when it comes to VR specifically, and I had to look into this.

Put simply, confusion doesn't help market growth here :).

I also want to thank you for spelling out your perspective. I don't have a real opinion with respect to a lot of it as I find VR to be too expensive for what is still largely a market dominated by 'beta' experiences, but I'm definitely up for learning and I've learned from your contributions!
 
Bobzdar, it's funny how I agreed with you on the overlap bit, but you are still trying to school me.

I never said that the Rift/Vive were modeled after real life binocular vision overlap and admitted that the illustration really better represented what was going on with the Rift/Vive than the Pimax. Not sure why you felt the wall of text was needed or that you needed to repeat yourself all over again.

This all started because, and I quote, "my whole argument was that 8k isn't misleading if you know anything about vr and bother to read the specs."

To which I still advocate that:
1) VR as a consumer product is so new that most consumers don't have a context to use
2) Resolution numbers used in marketing TV's and monitors already use numbers like 4k and 8K to signify a 2D display experience
3) The average consumer new to VR is going to make comparisons to what they already know about TVs/monitors from the above point, which doesn't "fit", because unlike 2D images, VR builds a 3D image using overlap between the panels' rendered views of the scene to provide stereopsis.

So, I'm just saying that a number like "8K" isn't all that great, is all, where you are saying you are fine with it. Simply a difference of opinion here.

Edit: And you may feel it is unimportant when you are wrong (as to the numbers), but you sure do persist in trying to continually point out how "I don't understand this". Go back and read all of my posts - when I refer to a "scene" I am talking about a real view of the world, not a flat image of it. It's almost like you have your own selective reality and you've fallen into this rut thinking that I don't get it and keep recycling your same argument over and over to "show me the light" when in fact I already do get it and have from the beginning. The only mistake I made was in my assumption that the Pimax used a similar overlap to that of the Rift/Vive, which I have admitted to several posts ago.

Hug? Youn really wants us to. I bet he also wants to watch. :)
 
^ What is your personal binocular overlap (or whatever it's called)? It'll take about 10 seconds to find out... if it's not around 80, only then would Bobzdar's numbers be wrong... 114 seems ungodly, I wonder if anyone truly has that much coverage or if it's just some theoretical number pulled from geometry and not really true in practice...
 
Unlike overlap percentages, exact numbers are kind of unimportant to Bobzdar. Youn, I'm coming up with just a tad shy of 90 myself. (Looks like the angle vertex is measured from the tip of your nose.) Yep, 114 has got to be the max/fringe case, i.e. someone with a very flat/short nose and non-deep-set eyes. I bet this guy comes close.
 
Bobzdar, it's funny how I agreed with you on the overlap bit, but you are still trying to school me.

I never said that the Rift/Vive were modeled after real life binocular vision overlap and admitted that the illustration really better represented what was going on with the Rift/Vive than the Pimax. Not sure why you felt the wall of text was needed or that you needed to repeat yourself all over again.

This all started because, and I quote, "my whole argument was that 8k isn't misleading if you know anything about vr and bother to read the specs."

To which I still advocate that:
1) VR as a consumer product is so new that most consumers don't have a context to use
2) Resolution numbers used in marketing TV's and monitors already use numbers like 4k and 8K to signify a 2D display experience
3) The average consumer new to VR is going to make comparisons to what they already know about TVs/monitors from the above point, which doesn't "fit", because unlike 2D images, VR builds a 3D image using overlap between the panels' rendered views of the scene to provide stereopsis.

So, I'm just saying that a number like "8K" isn't all that great, is all, where you are saying you are fine with it. Simply a difference of opinion here.

Edit: And you may feel it is unimportant when you are wrong (as to the numbers), but you sure do persist in trying to continually point out how "I don't understand this". Go back and read all of my posts - when I refer to a "scene" I am talking about a real view of the world, not a flat image of it. It's almost like you have your own selective reality and you've fallen into this rut thinking that I don't get it and keep recycling your same argument over and over to "show me the light" when in fact I already do get it and have from the beginning. The only mistake I made was in my assumption that the Pimax used a similar overlap to that of the Rift/Vive, which I have admitted to several posts ago.

Hug? Youn really wants us to. I bet he also wants to watch. :)

Your whole argument on why 8k was misleading was based on your understanding of the overlap being 100% between the eyes, which in your illustration only gave the equivalent to the width of a 2d 4k image. That's why I harped on it and I obviously wasn't explaining it well because it still took you 2 pages of posts to realize your understanding of it was wrong.

If you try on a gen1 2.5k headset and then the pimax 8k, the difference should be pretty evident, and I agree shouldn't be compared to a tv. I don't think anyone would cross shop a tv and a vr headset, though.
 
Your whole argument on why 8k was misleading was based on your understanding of the overlap being 100% between the eyes, which in your illustration only gave the equivalent to the width of a 2d 4k image. That's why I harped on it and I obviously wasn't explaining it well because it still took you 2 pages of posts to realize your understanding of it was wrong.

If you try on a gen1 2.5k headset and then the pimax 8k, the difference should be pretty evident, and I agree shouldn't be compared to a tv. I don't think anyone would cross shop a tv and a vr headset, though.

No Bobzdar. You chose to interpret it that way and harp endlessly on the overlap. That's what has been so frustrating to me. And thanks for agreeing that it shouldn't be compared to a TV. That was the entire intent/point I was trying to make all along - that the "8K" label in the case of the Pimax (using two 4K panels for VR) is a VERY different thing than a non-VR single panel providing 8K of resolution in a traditional monitor/TV. Hence the apples-to-oranges comparison when talking about using an "8K" label to describe horizontal resolution. The exact overlap percentage is irrelevant - it's the fact that there IS overlap used in VR and that the resolution comparisons between it and the TV/monitor world (which most consumers that don't understand VR would be confused by) make no sense. 8K in the TV/monitor world is the next "grail" resolution as to what the market considers the very high end, top-shelf product available (i.e. 8K TVs that start at $15K).

Again, that's what has been so frustrating here - your harping on the percentage of overlap, which was completely irrelevant to the entire point I was trying to make - that the overall horizontal resolution number is a crappy moniker that isn't comparable between VR and TVs/monitors. I really think they should have stuck to just calling it a "true 4K" VR headset (in the case of the Pimax 8k and especially the 8Kx), but their marketing folks and their desire to differentiate themselves as unique or better led them to call it "8K".
 
An Nvidia 8800 must be at least 4x faster than an Nvidia 2080 because numbers.

If a person doesn't research even the basic specifications of a product then it's hardly the manufacturer's fault if the consumer doesn't receive what they thought they were purchasing. In other words, there's nothing wrong with the naming of the Pimax 5K or 8K headsets.
 
If a person doesn't research even the basic specifications of a product

I expect people who actually purchase a VR headset to do the research.

The concern is the confusion that the name causes in terms of marketing. Moving fewer of these is bad for everyone, and confusion doesn't help!
 
Fully agree - you'd give each card its own complete render pipeline duties, one for each eye, and just use the SLI/Crossfire "link" to keep the output frames fully synced, rather than sharing workload... It's an expensive solution, but it would provide the highest quality VR possible and I'm sure there's a fairly large group of VR enthusiasts that would buy a second card for this, especially if it meant making true 4K resolution per eye possible. While Nvidia did release some driver tech to render dual viewpoints on one card, it really makes performance sense to dedicate two cards to this instead if you want the very best VR performance possible.

VR SLI exists, and it does exactly what you're stating. I imagine 2x 2080 Ti 4K@90 will probably be pretty spectacular if some dev used all VRWorks techniques to their full potential.
 
VR SLI exists, and it does exactly what you're stating. I imagine 2x 2080 Ti 4K@90 will probably be pretty spectacular if some dev used all VRWorks techniques to their full potential.

Errr, that’s a big fat fucking NOPE. It does not do exactly what I stated which requires TWO 4K @90Hz video feeds. Ya, sure SLI VR support exists for Gen 1 VR and it’s unfortunate how very few developers/games exploit or make use of it yet... But the whole reason I brought this up is that what is there right now simply WILL NOT yet work where the Pimax 8Kx needs are concerned.

From the Nvidia SLI VR documentation:

“In other words, you render one eye on each GPU, and combine both images together into a single frame to send out to the headset.”

Well, while this sounds great, it falls completely fucking apart when you actually examine it a bit closer, which is why I raised this up as an issue. The bit about “combining both images together into a single frame” is where it goes off the rails. SLI VR only supports a single video cable. Bottom line, it isn’t there yet unless they decide to dedicate unique pipelines AND TWO dedicated signal paths which is not what SLI VR does yet.

For Gen 1 VR, sure, it can provide a benefit. However, in the case of the Pimax 8Kx, which requires TWO separate 4K 90Hz video feeds, there simply isn't currently a single-cable standard/solution that can support the required bandwidth of "combining" TWO 4K 90Hz signals over a SINGLE video cable coming out of ONE card, which is what this SLI VR support provides. While the new "VirtualLink" just came out on the RTX cards, it appears that it will not have the requisite video bandwidth. (It seems to be more a blending of video and USB/power into one connector than an expansion of the video bandwidth alone.)

There really hasn’t been much info or detail released about it yet:
https://sites.google.com/view/virtuallink-consortium/home

VirtualLink appears capable of delivering 4K @ 120Hz, but not 2 combined feeds of 4K @90Hz. That’s why I brought this up. They need to alter VR SLI support such that each card can feed each eye in a dedicated fashion - one left and the other right via TWO separate video cables for the HMD. The currently provided SLI VR support simply doesn’t do this yet. Can they get there? Sure, but it will require some further work/development and the Pimax 8Kx is the only product on the near horizon that would even need this.

Fast forward a year or two when Gen 2 HMDs have dedicated 4K panels per eye and this may become a necessity to properly drive native 4K resolutions on a high end PC with a next gen HMD for VR. (That is, unless they can also throw eye tracking and foveated rendering into the mix to get the bandwidth requirements pared down significantly so as to not need 2 native 4K signal paths.)
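For scale, here's the raw, uncompressed arithmetic I'm basing that on (24-bit colour, no blanking overhead, and no DSC-style compression factored in, so treat it as a ballpark only):

```python
# Raw uncompressed video bandwidth estimate: pixels * refresh * bits/pixel.
# Ignores blanking intervals and any link compression, so ballpark only.
def raw_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

single_4k_120 = raw_gbps(3840, 2160, 120)        # one 4K feed at 120 Hz
dual_4k_90    = 2 * raw_gbps(3840, 2160, 90)     # two separate 4K feeds at 90 Hz

print(f"1x 4K @ 120 Hz : ~{single_4k_120:.1f} Gbps")
print(f"2x 4K @  90 Hz : ~{dual_4k_90:.1f} Gbps")
```

Two native 4K 90Hz eyes need roughly 50% more raw bandwidth than the single 4K 120Hz feed VirtualLink appears to be built around, which is why I don't see one cable cutting it for the 8Kx.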
 